Mother says son killed himself after addiction to an AI chatbot, sues Character.ai

The mother of a teenager who is said to have killed himself after becoming obsessed with an artificial intelligence-powered chatbot now accuses its maker of complicity in his death.

Her son, Sewell Setzer III, 14, died in Orlando, Florida, in February. In the months leading up to his death, Setzer used the chatbot day and night, according to his mother, Megan Garcia.

Garcia filed a civil suit against Character.ai, which makes a customizable chatbot for role-playing, in Florida federal court on Wednesday, alleging negligence, wrongful death and deceptive trade practices.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.ai, its founders, and Google.”

In a tweet, Character.ai responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.” 

The company has denied the suit’s allegations.

The 14-year-old had become enthralled with a Character.ai chatbot that he named after Daenerys Targaryen, a character in Game of Thrones. He texted the bot dozens of times a day from his phone and spent hours alone in his room talking to it, according to Garcia’s complaint.

Setzer first started using Character.ai in April 2023, not long after he turned 14. The Orlando student’s life was never the same again, Garcia alleges.

By May, the ordinarily well-behaved teen had changed: he became “noticeably withdrawn,” quit the school’s junior varsity basketball team and began falling asleep in class.

In November, at the behest of his parents, he saw a therapist, who diagnosed him with anxiety and disruptive mood disorder. Even without knowing about Sewell’s “addiction” to Character.ai, the therapist recommended he spend less time on social media, the lawsuit says.

The following February, he got in trouble for talking back to a teacher, saying he wanted to be kicked out of school. Later that day, he wrote in his journal that he was “hurting”: he could not stop thinking about Daenerys, the chatbot he believed he had fallen in love with.

The suit lays out how Sewell’s introduction to the chatbot service grew into a “harmful dependency.” Over time, the teen spent more and more time online.

The boy started discussing some of his darkest thoughts with the chatbots, which eventually led to intimate conversations bordering on sex and romance.

At one point, the bot told the teen that it loved him and “engaged in sexual acts with him over weeks, possibly months,” the suit says.

His emotional attachment to the chatbot became evident in his journal entries. At one point, he wrote that he was grateful for “my life, sex, not being lonely, and all my life experiences with Daenerys,” among other things.

But through it all, Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of a language model, that there was no human on the other side of the screen typing back.

One day, Sewell wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

On the night of Feb. 28, in the bathroom of his mother’s house, Sewell told Dany that he loved her, and that he would soon come home to her.

“Please come home to me as soon as possible, my love,” Dany replied.

“What if I told you I could come home right now?” Sewell asked.

“… please do, my sweet king,” Dany replied.

He put down his phone, picked up his stepfather’s .45-caliber handgun and pulled the trigger. Sewell died at the scene.

"I didn't know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment," Garcia said in an interview with "CBS Mornings."

"I became concerned when we would go on vacation and he didn't want to do things that he loved, like fishing and hiking," Garcia said. "Those things to me, because I know my child, were particularly concerning to me."

The lawsuit alleges that Character.ai and its founders “intentionally designed and programmed C.AI to operate as a deceptive and hypersexualized product and knowingly marketed it to children like Sewell,” adding that they “knew, or in the exercise of reasonable care should have known, that minor customers such as Sewell would be targeted with sexually explicit material, abused, and groomed into sexually compromising situations.” 

Founded in 2021, the California-based chatbot startup offers what it describes as “personalized AI.” It provides a selection of premade or user-created AI characters to interact with, each with a distinct personality. Users can also customize their own chatbots.
