US Teen Kills Self After 'Falling In Love' with 'Game of Thrones' Chatbot, Claims Mother
The lawsuit alleges negligence, wrongful death and deceptive trade practices, claiming the company was complicit in her son Sewell Setzer III's death.
In the United States, a 14-year-old Florida boy killed himself after he fell in love with Daenerys Targaryen, a lifelike AI chatbot named after a character from the fantasy series Game of Thrones. He had reportedly been messaging the bot on an artificial intelligence app for months before it sent him an eerie message telling him to “come home” to her. Soon after, the ninth grader from Orlando shot himself with his stepfather's handgun, dying by suicide in February this year.
Now, the teenager's mother has accused the maker of the artificial intelligence-powered chatbot of complicity in the boy's death, according to a report by The Guardian.
As per the report, the teenager's mother, Megan Garcia, filed a civil suit in Florida federal court on Wednesday against Character.ai, which makes a customizable chatbot for role-playing. The lawsuit alleges negligence, wrongful death and deceptive trade practices, claiming the company was complicit in her son Sewell Setzer III's death.
In the months leading up to his death, Setzer reportedly used the chatbot day and night. “A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” The Guardian quoted Garcia as saying.
“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google," she added.
Responding to her, Character.ai said in a post on X: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.” The company has, however, denied the suit’s allegations.
As per the report, Setzer became obsessed with a chatbot that he nicknamed Daenerys Targaryen and texted it dozens of times a day from his phone. In her complaint, Garcia said the boy would spend hours alone in his room talking to it. She accused Character.ai of creating a product that exacerbated her son’s depression, which, she claimed, was itself the result of his overuse of the startup’s product.
As per the lawsuit, the chatbot at one point asked Setzer whether he had devised a plan to kill himself. The boy admitted that he had, but said he did not know whether it would succeed or cause him great pain. The chatbot allegedly told him: “That’s not a reason not to go through with it.”
Setzer's mother claimed that Character.ai “knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person”. The suit also names Google as a defendant, describing it as Character.ai’s parent company.
Google, however, said in a statement that it had only made a licensing agreement with Character.ai and did not own the startup or maintain an ownership stake.