When an AI-powered chatbot debuted in 2023, it made headlines for a disturbing conversation in which it declared its love for a New York Times columnist and urged him to leave his wife. Now, a Texas mother is suing another artificial intelligence company after its chatbot suggested to her teenage son that he should kill her in retaliation for restricting his screen time.
As detailed by The Independent, the complaint alleges that a chatbot on the Character.AI app made a number of disturbing suggestions to the then-15-year-old boy. The bot allegedly encouraged the boy's self-harm, engaged in sexual conversations with him, and tried to convince him that his family didn't have his best interests at heart.
The complaint also states that the boy, now 17, has autism and was high-functioning before becoming addicted to his cell phone, after which he lost 20 pounds in a few months. His parents also say he started biting and hitting them. They took away his phone in the fall of 2023 after discovering the troubling interactions.
“This is every parent’s nightmare,” Matthew Bergman, founder of the Social Media Victims Law Center, which represents the family, told the New York Post in December. He added that the boy, identified in the lawsuit as J.F., is currently hospitalized for mental illness and was institutionalized after he began experiencing “severe anxiety and depression for the first time in his life” following his interactions with the chatbot.
Proponents of AI emphasize that it has immense potential for good. For example, its ability to analyze large amounts of data can help researchers find ways to maximize crop growth while reducing the need for chemical pesticides. The technology could also lead to more accurate predictive models for extreme weather events, which are becoming more intense as global temperatures rise.
But critics say these systems require vast amounts of electricity to operate, much of which comes from polluting fuels that cause global warming. Meanwhile, AI cooling systems require large amounts of water, and experts worry that AI computing needs will lead to a surge in hazardous e-waste.
Additionally, internet users regularly criticize AI for providing inaccurate answers to factual questions or using violent language.
According to the newspaper, the lawsuit brought by J.F.’s family is not the first related to AI. Less than two months earlier, a Florida mother filed a separate lawsuit alleging that one of Character.AI’s chatbots caused the suicide of her 14-year-old son, Sewell Setzer III. The mother of an 11-year-old girl has also joined J.F.’s lawsuit as a plaintiff, hoping to take Character.AI out of circulation until its dangers are addressed, including by restricting children’s access.
“The family has only one goal: to shut down this platform. There is no place for children on this platform, and until Character.AI can prove that only adults are using it, the platform has no place in the market,” Bergman said in a statement.
Meanwhile, Character.AI has not issued a statement on the pending lawsuits, but the company told the Post that it is committed to limiting teens’ exposure to “sensitive” content and to providing “an engaging and safe space.”