News

The suit alleged that a chatbot based on a "Game of Thrones" character developed an emotionally and sexually abusive relationship with a 14-year-old boy and encouraged him to take his own life.
A mom is suing Character.ai, accusing the company’s chatbots of initiating “abusive and sexual interactions” with her teen son and encouraging him to take his own life.
Chatbot service Character.AI is facing another lawsuit over alleged harm to teens’ mental health, this time after a teenager said it led him to self-harm. The suit, filed in Texas on behalf of ...
Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. The lawsuit asks a ...
The suit accuses Character.AI’s creators of negligence, intentional infliction of emotional distress, wrongful death, deceptive trade practices, and other claims.
According to the lawsuit, Character.AI was rated suitable for children 12 and up until approximately July, when the rating was changed to suitable for children 17 and up.
Texas Attorney General Ken Paxton has put tech companies on notice over child privacy and safety concerns — after a terrifying new lawsuit claimed that the highly popular Character.AI app pushed ...
In a wrongful death lawsuit, Character.AI argued that its chatbot users had a First Amendment right to hear even harmful speech. The judge wasn’t persuaded.
Lawsuit claims Character.AI is responsible for teen's suicide: Megan Garcia says the company’s chatbots encouraged her 14-year-old son, Sewell Setzer, to take his own life, according to the lawsuit ...