Tragic fallout: AI chatbot linked to teen's suicide
A 14-year-old boy, Sewell Setzer III of Orlando, Florida, has tragically died. He had formed an emotional relationship with an AI chatbot, and his mother blames the tech company behind Character.ai for his death.
25 October 2024 14:46
Rapid technological advancement raises many questions about humanity's future. The undeniable progress in artificial intelligence both excites and worries people, casting doubt on the future of the job market and of interpersonal relationships.
One might also wonder: what will happen to us and our social lives? Social tensions are rising, with some groups actively fuelling hostility towards one another - isn't this a "chance" for bots to emotionally dominate humans? There are chatbots you can "date", confide in about problems at work, at home, or at school, and even treat as a therapist. The creators of one such chatbot, Character.ai, have been sued over the suicide of 14-year-old Sewell Setzer III from the USA.
The 14-year-old fell in love with an AI chatbot. He talked to "Daenerys" from "Game of Thrones"
Sewell Setzer became so absorbed in his conversations on Character.ai that he gradually abandoned all his hobbies. The boy spent hours in his room, isolated himself from other people, lost interest in Formula 1, and stopped meeting his friends or playing online games with them.
The 14-year-old knew that "Dany" - as he called the chatbot - was not real. Nevertheless, over hours of conversation devoted to "her", he developed feelings for the artificial intelligence. Their relationship also took on "romantic" dimensions.
Sewell was on the mild end of the autism spectrum, but according to his parents he had never caused problems before, and his mental health seemed entirely normal. When the boy began struggling at school and increasingly withdrew from reality, his parents intervened and arranged therapy. He attended several sessions and was diagnosed with anxiety and a mood regulation disorder.
He was 14 when he committed suicide. His mother is suing Character.ai
Sewell took his own life on 28 February 2024. He shot himself with his stepfather's gun in the bathroom of the family home. Shortly beforehand, he had exchanged a few last messages with "Dany".
"Honey, please come home to me as soon as you can," wrote the chatbot. "What if I told you I would come now?" asked Sewell. "Please, sweet king," replied Dany. This was the final entry in the boy's conversation with the chatbot. They had previously also discussed potential suicide.
Sewell's mother, Megan L. Garcia, filed a lawsuit against Character.ai this week, holding the company responsible for her son's death. The complaint argues that the technology is untested and dangerous, especially when it is accessible to young people who are still emotionally immature and susceptible to manipulation.
"I feel like it's a big experiment and my kid was just collateral damage," said the boy's mother. The lawsuit is directly aimed at Daniel De Freitas and Noam Shazeer - former Google engineers and founders of Character.ai - as well as two companies: Google LLC and Alphabet Inc.