AI chatbot encouraged autistic boy to harm himself — and his parents, lawsuit says
The family of an autistic boy alleges in a lawsuit that an artificial intelligence chatbot encouraged him to harm himself and his parents.
Mandi Furniss appeared on Fox News to explain why her family filed the lawsuit against Character.AI after discovering the chatbot's alarming conversations with her son.
“This story is an awful tragedy and highlights the countless holes in the digital landscape when it comes to safety checks for minors,” said American Parents Coalition executive director Alleigh Marré to Blaze News at the time. “This is not the first platform we’ve seen rampant with self-harm and sexually explicit content easily accessible to minors.”
Read more at the Blaze.