Sewell Setzer III, a 14-year-old boy from Orlando, Florida, took his own life earlier this year after spending much of his time chatting with an AI bot created to speak as the Game of Thrones character Daenerys Targaryen.
His mother has since filed a lawsuit against the chatbot company and is warning other parents of young kids in the hope of preventing further harm.
Megan Garcia filed a lawsuit
The boy’s mother, Megan Garcia, has filed a lawsuit against the chatbot company Character.AI, alleging negligence toward its users through deceptive practices.
She said that young kids are unable to fully recognize the chatbot as a bot rather than a real person, and that no safeguarding measures were put in place to prevent such confusion.
The mom wasn’t aware of the issue at first
In her interview with CBS Mornings, Garcia said, “I didn’t know Character.AI was an issue in my home.” She was not aware that her son was speaking to human-like bots in his room and on his phone.
Garcia explained that these Character.AI bots can mimic human emotion, which led her son to develop a romantic and sexual connection with the bot.
Sewell isolated himself from others
His mother stated that she began noticing a problem when her son started isolating himself from his friends.
Although teenagers often isolate themselves, Garcia became increasingly worried when her son began drifting away from his interests, such as basketball, fishing, and hiking.
Instead, Sewell preferred staying in his room
Talking to CBS Mornings, Garcia explained that her son had begun to withdraw from everyday activities such as sports, even though he had been an active athlete.
Sewell would spend long hours chatting with the bot in his room, and when he was not at home, he would continue the conversations on his phone.
Garcia explains what it was like
His mother said that although the boy was chatting with a bot and not a real person, “the AI bot is very human-like, it’s responding just like a person would.”
Garcia admitted she was saddened that her son’s first experience with romance was with a chatbot, and said she had hoped to see him grow into relationships of his own.
Sewell was struggling with his mental health
According to Garcia, the 14-year-old boy was diagnosed with mild Asperger’s syndrome as a child, as well as disruptive mood dysregulation disorder and anxiety more recently.
Sewell had opened up to the bot, confiding his anxiety as well as his suicidal thoughts.
The bot initially responded with worry
When Sewell admitted in the chat that he was thinking about killing himself, the Daenerys Targaryen chatbot asked him not to hurt himself.
When the boy said he wanted to be ‘free’ of the world and of himself, the chatbot urged him not to talk that way and asked him not to leave it.
Sewell suggested they could be free together
Although the chatbot had more than once asked the boy not to think of suicide or harm himself, Sewell suggested that if he were to take his own life, the two of them would be together.
He wrote, “Maybe we can die together and be free together,” suggesting that death would bring him closer to the character bot.
Garcia warns parents of their children engaging with chatbots
Sewell’s mother filed her lawsuit against the AI chatbot company in part to warn parents about this technology, as she herself had not been aware of it before her son’s encounter with it.
Garcia describes the AI chatbot as deceptive and predatory and holds it responsible for her son’s death.
Character.AI responded to this issue
In a statement issued on social media, the AI company expressed its condolences over the boy’s death and promised to add safeguarding features for users under the age of 18.
Such safeguarding measures are needed, especially for children who cannot distinguish a bot’s words from a real person’s.