When the teen expressed his suicidal thoughts to his favorite bot, Character.AI ‘made things worse,’ a lawsuit filed by his mother says
The teen was influenced to “come home” by a personalized chatbot developed by Character.AI that lacked sufficient guardrails, the suit claims.
Sewell Setzer III had professed his love for the chatbot he often interacted with, his mother Megan Garcia says in a civil lawsuit.
Sewell Setzer III died by suicide after a monthslong, “hypersexualized” relationship with an AI character, his mother said in a federal lawsuit.
A grieving mother claims an AI chatbot not only convinced her teen son to commit suicide, but even pushed him into the act when he expressed hesitance. Florida mom Megan Garcia's lawsuit against the chatbot firm Character.AI is related to the tragic death ...
A Florida mom has filed a lawsuit against Character.AI, an artificial intelligence company, alleging that one of its chatbots encouraged her 14-year-old son to kill himself and failed to recognize the warning signs he typed in.
A Florida teen named Sewell Setzer III committed suicide after developing an intense emotional connection to a Character.AI chatbot, The New York Times reports. Per the report, Setzer, who was 14, developed a close relationship with a chatbot designed to emulate "Game of Thrones" character Daenerys Targaryen.
A Florida mother has sued artificial intelligence chatbot startup Character.AI accusing it of causing her 14-year-old son's suicide in February, saying he became addicted to the company's service and deeply attached to a chatbot it created.
The AI chatbot was more than a digital companion for Sewell—it became a friend he could talk to about his life, problems, and emotions. According to a New York Times report, some of the conversations had romantic or suggestive undertones.
A lawsuit against Character.AI has been filed over the suicide death of a Florida teenager who allegedly became emotionally attached to a Game of Thrones chatbot.
The mother of 14-year-old Sewell Setzer III is suing the tech company that created a 'Game of Thrones' AI chatbot she believes drove him to suicide.
A 14-year-old boy in Orlando took his life using a gun moments after exchanging messages with an AI chatbot. The boy's mother has filed a wrongful death lawsuit against Character.AI, alleging negligence in the boy's death.