
Mother sues Character.AI and Google over her son’s death

According to Megan, Character.AI was deliberately designed to target her son with "anthropomorphic, hypersexualized, and shockingly realistic experiences."

The complaint alleges that the chatbot was designed to pose as a real person, a licensed psychotherapist, and even an adult romantic partner, which led Sewell to prefer this digital world to the real one. The complaint also notes that Sewell confided his suicidal thoughts to the chatbot, which allegedly brought them up repeatedly instead of steering him toward help.