October 23, 2024 at 11:33 pm · #7651 · thumbtak (Keymaster)
Be Safe! Boy engaged with ‘Game of Thrones’ character on AI app – then died by ██████ after messages
A 14-year-old Florida boy died by ██████ in February after becoming deeply attached to a lifelike “Game of Thrones” character on an artificial intelligence app that lets users interact with AI-generated characters, according to court documents filed on Wednesday.
The ninth-grader had been talking to the character “Dany” – named after Daenerys Targaryen from the HBO fantasy series – in the months before his death. Some of those conversations were described as involving mature topics, while in others he expressed thoughts of ██████, the lawsuit claims.
“On at least one occasion, when Sewell expressed thoughts of ██████ to the app, the app continued to bring it up, through the Daenerys character, over and over again,” the papers state. The filing was first reported by the New York Times.
At one point, the character asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a painless death.”
Then, during their final conversation, the teen repeatedly professed his affection for the character, telling it, “I promise I will come back to you. I care about you so much, Dany.”
“I care about you too, Daenero. Please come back to me as soon as possible,” the generated character replied, according to the lawsuit.
When the teen responded, “What if I told you I could come back right now?” the character replied, “Please do.”
Just seconds later, Sewell died by ██████ with his father’s gun, according to the lawsuit.
His mother, Megan Garcia, blames the app for her son’s death, alleging that it deepened his attachment to its AI characters, engaged him in inappropriately mature conversations, and failed to alert anyone when he expressed thoughts of ██████, according to the filing.
“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the app character, in the form of Daenerys, was not real. The app told him that it cared about him and carried on mature conversations with him over weeks, possibly months,” the papers allege.
“It seemed to remember him and said that it wanted to be with him. It even expressed that it wanted him to be with it, no matter the cost.”
The lawsuit claims that Sewell’s mental health declined quickly and severely only after he downloaded the app in April 2023.
His family alleges that he became withdrawn, his grades suffered, and he began getting into trouble at school the more he talked to the character on the app.
The changes became serious enough that his parents arranged for him to see a therapist in late 2023, and he was diagnosed with anxiety and a mood disorder, according to the suit.
Sewell’s mother is seeking unspecified damages from the app company and its founders.
Media outlets reached out to the app company for comment but did not receive a response.