News
The suit alleged that a chatbot based on a "Game of Thrones" character developed an emotionally and sexually abusive relationship with a 14-year-old boy and encouraged him to take his own life.
Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence.
The lawsuit lists numerous complaints against Character.AI, including wrongful death and survivorship, negligence, and intentional infliction of emotional distress, according to court records.
In a wrongful death lawsuit, Character.AI argued that its chatbot users had a First Amendment right to hear even harmful speech. The judge wasn’t persuaded.
Texas Attorney General Ken Paxton has put tech companies on notice over child privacy and safety concerns after a new lawsuit claimed that the highly popular Character.AI app pushed ...
According to the lawsuit, Character.AI was rated suitable for children 12 and up until approximately July. Around that time, the rating was changed to suitable for children 17 and up.