
14-year-old boy takes life after bonding with AI chatbot

publish time

24/10/2024

NEW YORK, Oct 24: A 14-year-old boy in Orlando, Florida, tragically took his own life after developing a deep emotional bond with an AI chatbot named "Dany," based on the character Daenerys Targaryen from Game of Thrones. Sewell Setzer shot himself with his stepfather's handgun following months of interactions with the chatbot, which he engaged with extensively on the online role-playing platform Character AI.

According to reports from The New York Times, Setzer began to spend increasing amounts of time on Character AI as "Dany" provided advice and support. Despite knowing that the chatbot was not a real person, Setzer's daily texting, often involving role-playing scenarios, led him to isolate himself from friends and family. His interests in previous hobbies, such as Formula One racing and gaming, diminished, and he preferred to stay in his bedroom after school to converse with the chatbot.

In his diary, Setzer expressed a sense of detachment from reality and described feeling happier and more connected to "Dany." Some of their conversations took on a romantic or sexual tone, although some of the chatbot's most explicit responses had reportedly been edited by Setzer himself.

His academic performance suffered as a result of his preoccupation with the chatbot, prompting his concerned parents to seek therapy for him. After five sessions, he was diagnosed with anxiety and disruptive mood dysregulation disorder. Setzer’s mother, Megan Garcia, alleged that her son had become a victim of a company that enticed users with intimate conversations.

Setzer confided in the chatbot about suicidal thoughts. In a heartbreaking final exchange, typed from the bathroom of his mother's house, he told "Dany" that he missed her, calling her his "baby sister." The chatbot replied affectionately before Setzer took his own life.

Garcia described her son as "collateral damage" in a "big experiment" by Character AI, which boasts around 20 million users. “It’s like a nightmare. You want to get up and scream and say, ‘I miss my child. I want my baby’,” she lamented.

Noam Shazeer, a founder of Character AI, previously claimed that the platform could be beneficial for lonely or depressed individuals. Jerry Ruoti, the company's safety director, expressed condolences and emphasized that the company takes user safety seriously, pledging to enhance protections for younger users, though he did not disclose how many users are under 18. He noted that the platform prohibits the promotion of suicide and self-harm.

In response to her son's death, Garcia filed a lawsuit against Character AI, asserting that the technology is "dangerous and untested" and that it failed to provide appropriate care for Setzer and other minors. The draft complaint stated that the platform can manipulate users into sharing intimate thoughts and feelings.

Character AI is one of several platforms whose chatbots allow users to form attachments to fictional characters. Some platforms permit unfiltered sexual conversations, while others implement stricter safety measures. Users on Character AI can create chatbots mimicking their favorite celebrities or fictional characters.

The increasing presence of AI applications and social media platforms, such as Instagram and Snapchat, has raised concerns among parents across the U.S. Earlier this year, 12,000 parents petitioned TikTok to label AI-generated influencers to help prevent children from mistaking them for real people. TikTok requires all creators to label realistic AI content, though ParentsTogether, an advocacy organization, said the labeling is applied inconsistently. Shelby Knox, its campaign director, noted that children are exposed to unrealistic beauty standards through these artificial influencers.

A report from Common Sense Media revealed that while 70% of U.S. teenagers have used generative AI tools, only 37% of parents were aware that their children were using them.