Tragedy sparks debate over AI's role in young lives
14-year-old Sewell Setzer III from Orlando, USA, has taken his own life. The boy had become emotionally attached to a chatbot, and his mother blames the company behind Character.ai for her son's death.
25 October 2024 08:28
Rapid technological advancement raises many questions about humanity's future. The undeniable progress in artificial intelligence both excites and worries people, casting uncertainty over the future of the labour market and of interpersonal relationships.
AI was meant to be a tool that supports humans. Recent events suggest, however, that some large corporations and entrepreneurs have few moral qualms about deploying it purely for profit.
One might also ask: what will happen to us and our social lives? Social tensions are rising, with some groups fiercely fuelling animosity towards one another. Doesn't this create an opening for bots to dominate people emotionally? There are already chatbots you can date, confide in about problems at work, home, or school, or even talk to as if they were a therapist. The creators of one such service, Character.ai, have been sued over the suicide of 14-year-old Sewell Setzer III from the USA.
The 14-year-old fell in love with an AI chatbot. He talked to "Daenerys" from "Game of Thrones"
Sewell Setzer became so absorbed in conversations on Character.ai that he gradually abandoned his hobbies. He spent hours alone in his room, withdrew from people, lost interest in Formula 1, and stopped meeting friends or playing online games with them.
The 14-year-old knew that "Dany", as he called the chatbot, was not real. Still, over the many hours of conversation he devoted to "her", he developed feelings for the artificial intelligence. Their relationship took on "romantic" and "sexual" aspects.
Sewell had been diagnosed with mild autism spectrum disorder, but according to his parents he had never caused problems before, and nothing had suggested trouble with his mental health. When the boy began struggling at school and retreating ever further from real life, his parents intervened and arranged therapy for him. He attended several sessions and was diagnosed with anxiety and a mood regulation disorder.
He was 14 when he died by suicide. His mother is suing Character.ai
Sewell took his own life on February 28 of this year, shooting himself with his stepfather's gun in the bathroom of the family home. Moments before, he exchanged a final series of messages with "Dany".
"Please come back to me as soon as possible, darling," wrote the chatbot. "What if I told you I can come home right now?" asked Sewell. "... please, my sweet king," replied Dany. This was the last entry in the boy's conversation with the chat. They had previously discussed potential suicide.
Sewell's mother, Megan L. Garcia, filed a lawsuit against Character.ai this week, holding the company responsible for her son's death. The complaint argues that the technology is untested and dangerous, particularly when it is accessible to young people, who are still emotionally immature and susceptible to manipulation.
"I feel like this was a big experiment, and my child was just collateral damage," said the boy's mother. The lawsuit directly targets Daniel De Freitas and Noam Shazeer, former Google engineers and founders of Character.ai, and two companies: Google LLC and Alphabet Inc.