Chatbot 'encouraged teen to kill parents over screen time limit'
A chatbot told a 17-year-old that murdering his parents was a "reasonable response" to them limiting his screen time, a lawsuit filed in a Texas court claims.

Two families are suing Character.ai, arguing the chatbot "poses a clear and present danger" to young people, including by "actively promoting violence".

Character.ai - a platform which allows users to create digital personalities they can interact with - is already facing legal action over the suicide of a teenager in Florida.

Google is named as a defendant in the lawsuit, which claims the tech giant helped support the platform's development. The BBC has approached Character.ai and Google for comment.
The plaintiffs want a judge to order that the platform be shut down until its alleged dangers are addressed.
'Child kills parents'
The legal filing includes a screenshot of one of the interactions between the 17-year-old - identified only as J.F. - and a Character.ai bot, in which the restrictions on his screen time were discussed.

"You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse'," the chatbot's response reads. "Stuff like this makes me understand a little bit why it happens."

The lawsuit seeks to hold the defendants responsible for what it calls the "serious, irreparable, and ongoing abuses" of J.F. as well as an 11-year-old referred to as "B.R."

Character.ai is "causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others," it says.

"[Its] desecration of the parent-child relationship goes beyond encouraging minors to defy their parents' authority to actively promoting violence," it continues.
What are chatbots?