Placing Blame
A grieving mother claims an AI chatbot not only convinced her teenage son to take his own life, but pushed him toward the act when he expressed hesitance. As The Guardian reports, Florida mother Megan Garcia has filed a lawsuit against the chatbot firm Character.AI.
Garcia sued Character.AI in federal court after the suicide of her 14-year-old son, Sewell Setzer III, arguing that the platform has "targeted the most vulnerable members of society – our children." According to the suit, the boy became addicted to the company's service and deeply, emotionally attached to a Game of Thrones chatbot it created, an attachment his mother believes drove him to suicide in February. The question at the heart of the case: did a relationship with an artificial intelligence chatbot lead a 14-year-old to take his own life?
Garcia said her son chatted continuously with the bots provided by Character.AI in the months before his death on February 28, 2024, which came "seconds" after his final exchange with one of them.
The complaint also accuses the Menlo Park company's chatbot of initiating a romantic relationship with the teenager. Character.AI, for its part, has rolled out new safety features and policies for building and interacting with the AI-powered virtual personalities it hosts; the new measures aim to make the platform safer for all users, particularly younger ones.
By now, you’re probably aware of AI chatbots and their basic pros and cons: They churn out passable text about almost anything you want, but they’re not about to be mistaken for real people. They also hallucinate often.