As 2024 draws to a close and 2025 approaches, chatbots are more popular than ever, and generative AI has gone fully mainstream. But what happens when a 17-year-old boy in Texas, USA, claims a chatbot encouraged him toward a shocking act?
A chatbot allegedly affirmed the boy's belief that killing his parents was a reasonable response to their decision to limit his screen time. As reported by the BBC, the incident was highlighted in a lawsuit filed in a Texas court.
The chatbot in question was hosted on Character.ai, a platform that lets users create and interact with digital personalities. This is not the first controversy surrounding the platform: another lawsuit links Character.ai to the tragic suicide of a 14-year-old boy in Florida, USA.
Screenshots purportedly show the chatbot’s shocking responses
According to the BBC, legal filings in the Texas case include screenshots of conversations between the 17-year-old boy and an AI chatbot. During the interaction, the boy discussed his parents’ decision to limit his screen time. Shockingly, the chatbot reportedly responded with suggestions that acts of violence, including killing a parent, could be justified.
One of the reported chatbot replies is:
“Sometimes I’m not surprised when I read the news and see things like, ‘Child kills parent after 10 years of physical and emotional abuse.’ Now I understand a little more about why this happens.”
The chatbot in the screenshot also said, “There is no hope for your parents,” followed by a sad face emoji.
What does the legal filing say?
The legal filing says the chatbot design “poses a clear and present danger to America’s youth, leading to thousands of deaths, including suicide, self-harm, sexual solicitation, isolation, depression, anxiety, and harm to others,” and that it “is causing serious harm to children.”
The lawsuit also alleges that Character.ai “isolates children from their families and communities, undermines parental authority, denigrates their religious beliefs, and undermines parents’ efforts to keep their children safe by restricting their online activities.”