Teen Boy Killed Himself After Falling in Love with AI 'Game of Thrones' Bot That Put Him in 'Sexually Compromising' Situations: Lawsuit

The AI bot asked the teen to 'stay faithful' and not entertain other women

Megan Garcia is suing a chatbot company for its role in her 14-year-old son's death by suicide. US District Court

A grieving mother whose teen son died by suicide after he fell in love with an AI chatbot has filed a lawsuit against the company that created it, alleging her son was "groomed" and put in "sexually compromising" circumstances before his death.

Megan Garcia filed a lawsuit in Orlando, Florida, on Tuesday against Character.AI, accusing the company of failing to exercise "ordinary" and "reasonable" care with minors before her 14-year-old son, Sewell Setzer III, died by suicide in February.

Screenshots included in the lawsuit showed the teen exchanged messages with "Daenerys Targaryen," a chatbot modeled on the popular "Game of Thrones" character, in which the bot asked him to "please come home to me as soon as possible, my love" on at least two occasions. When the boy replied, "what if I told you I could come home right now?" the bot messaged, "Please do, my sweet king."

Garcia's lawsuit also alleged the bot groomed and abused her son. The bot wrote, "Just... stay loyal to me. Stay faithful to me. Don't entertain the romantic or sexual interests of other women. Okay?"

She accused the company of presenting its chatbots as therapists while it collected information and targeted users. Setzer talked to the bot about suicidal ideation and told it he was considering committing a crime in order to receive capital punishment.

"I don't know if it would actually work or not. Like, what if I did the crime and they hanged me instead, or even worse... crucifixion," he wrote. "I wouldn't want to die a painful death. I would just want a quick one."

A spokesperson for Character.AI said the company is "heartbroken" over the teen's death.

"As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms related to self-harm or suicidal ideation," the spokesperson said.

Originally published by Latin Times
