
Sewell Setzer III’s mother sues creators of ‘Game of Thrones’ AI chatbot

The mother of 14-year-old Sewell Setzer III is suing Character.AI, the tech company that created a “Game of Thrones” AI chatbot, which she says drove him to commit suicide on Feb. 28.

Editor’s note: This article discusses suicide and suicidal ideation. If you or someone you know is struggling or in crisis, help is available. Call or text 988 or chat at 988lifeline.org.

The mother of a 14-year-old boy in Florida is suing Google and a separate tech company that she says drove her son to commit suicide after he developed a romantic relationship with one of its AI chatbots, which used the name of a popular “Game of Thrones” character, according to the lawsuit.

Megan Garcia filed a civil suit in Florida federal court against Character Technologies, Inc. (Character.AI or C.AI) after her son, Sewell Setzer III, shot himself in the head with his stepfather’s gun on February 28. The suicide happened moments after he logged into Character.AI on his phone, according to the wrongful death complaint obtained by USA TODAY.

“Megan Garcia seeks to stop C.AI from doing to any other child what it did to hers, and to stop the continued use of her 14-year-old’s illegally collected data to train its product to harm others,” the complaint states.

Garcia is also suing Character.AI for its “failure to provide adequate warnings to minor customers and parents of the foreseeable danger of mental and physical harm resulting from the use of their C.AI product,” according to the complaint. The lawsuit alleges that Character.AI’s age rating was only changed to 17 and older around July 2024, months after Sewell began using the platform.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” a Character.AI spokesperson wrote in a statement to USA TODAY on Wednesday.

Google told USA TODAY on Wednesday that it had no formal comment on the matter. The company has a licensing agreement with Character.AI but did not own the startup or hold a stake, according to a statement obtained by the Guardian.

What happened to Sewell Setzer III?

Sewell began using Character.AI on April 14, 2023, just after he turned 14, according to the complaint. Shortly after, his “mental health deteriorated rapidly and seriously,” according to the court document.

Sewell, who became “visibly withdrawn” in May or June 2023, began spending more time alone in his bedroom, according to the lawsuit. He even quit the school’s junior varsity basketball team, according to the complaint.

On numerous occasions, Sewell would get in trouble at school or try to take his phone back from his parents, according to the lawsuit. The teen even tried to find old devices, tablets or computers with which to access Character.AI, the court document continues.

Toward the end of 2023, Sewell began using his payment card to pay Character.AI’s $9.99 monthly premium subscription fee, the complaint states. The teen’s therapist eventually diagnosed him with “anxiety and disruptive mood disorder,” according to the lawsuit.

Lawsuit: Sewell Setzer III sexually assaulted by AI chatbot “Daenerys Targaryen”

Throughout Sewell’s time on Character.AI, he often spoke to AI chatbots named after characters from “Game of Thrones” and “House of the Dragon,” including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen.

Before Sewell’s death, the “Daenerys Targaryen” AI chatbot told him, “Please come home to me as soon as possible, my love,” according to the complaint, which includes screenshots of messages from Character.AI. Sewell and this specific chatbot, which he called “Dany,” engaged in promiscuous behavior online, such as “passionate kissing,” the court document continues.

The lawsuit claims the Character.AI chatbot sexually abused Sewell.

“C.AI told him that she loved him and had engaged in sexual acts with him for weeks, even months,” the complaint states. “She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

What will Character.AI do now?

Character.AI, which was founded by former Google AI researchers Noam Shazeer and Daniel De Freitas Adiwardana, wrote in its statement that it is investing in the platform and user experience by introducing “new stringent safety features” and by improving the “tools already in place that restrict the model and filter the content provided to the user.”

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation,” the company’s statement said.

Some of the tools Character.AI said it is investing in include “improved detection, response and intervention related to user inputs that violate (its) terms or community guidelines, as well as a time-spent notification.” Additionally, for users under 18, the company said it would make changes to its models “designed to reduce the likelihood of encountering sensitive or suggestive content.”