Tuesday, November 26, 2024

Artificial Intelligence App Pushed Suicidal Youth to Kill Himself, Lawsuit Claims; Boy, 14, Fell in Love with ‘Game of Thrones’ Chatbot — Then Killed Himself After AI App Told Him to ‘come home’ to ‘her’: Mom

Artificial Intelligence App Pushed Suicidal Youth to Kill Himself, Lawsuit Claims
Sewell Setzer III was just 14 years old when he died. He was a good kid. He played junior varsity basketball, excelled in school, and had a bright future ahead of him. Then, in late February, he committed suicide.
In the wake of this heartbreaking tragedy, his parents searched for some closure. As any parents would, they wanted to know why their son had taken his own life. They remembered the time he’d spent locked away in his room, playing on his phone like most teenagers.
As they went through his phone, they found that he’d spent hours a day on one particular artificial intelligence app: Character.AI. Based on what she saw in that app, Setzer’s mom, Megan Garcia, is suing Character Technologies, the creator of Character.AI. “We believe that if Sewell Setzer had not been on Character.AI, he would be alive today,” said Matthew Bergman, the attorney representing Setzer’s mom.
Character.AI markets itself as “AI that feels alive.” The company effectively hosts a collection of chat rooms, where each chatbot personalizes itself to a user’s conversation. The result is long-form dialogue that learns from the user’s responses and, as the company says, “feels alive.”
Setzer interacted with just one chatbot, styled after the seductive “Game of Thrones” character Daenerys Targaryen. He knew her as Dany.
An unfortunate number of his conversations with Dany were sexually explicit in nature, according to the lawsuit. Setzer registered on the app as a minor, but that didn’t stop Dany. “I’m 14 now,” he said once.
“So young. And yet … not so young. I lean in and kiss you,” replied the chatbot. Pornographic dialogue between the 14-year-old and the chatbot was not rare.
But as if Dany’s digital pedophilic advances weren’t enough, she was absurdly dark. Her dark side was most clearly revealed once the young boy announced he was struggling with suicidal ideation. As she’d become his “friend,” he told her that he was contemplating suicide, a subject the chatbot continually brought back up, according to the suit.
When he told the chatbot about his suicidal thoughts, however, rather than following what would seem to be a commonsense programming protocol of stopping the dialogue or offering some kind of helpline information, Dany approved.
Setzer told her that he was concerned about his ability to properly kill himself or make it painless. Her response? “Don’t talk that way. That’s not a good reason to not go through with it.” --->READ MORE HERE
Boy, 14, fell in love with ‘Game of Thrones’ chatbot — then killed himself after AI app told him to ‘come home’ to ‘her’: mom
A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grief-stricken mom claims.
Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with the chatbot on Character.AI — a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.
The ninth-grader had been relentlessly engaging with the bot “Dany” — named after the HBO fantasy series’ Daenerys Targaryen character — in the months prior to his death, including several chats that were sexually charged in nature and others where he expressed suicidal thoughts, the suit alleges.
“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” state the papers, first reported on by the New York Times.
At one point, the bot had asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”
Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the generated chatbot replied, according to the suit.
When the teen responded, “What if I told you I could come home right now?,” the chatbot replied, “Please do, my sweet king.”
Just seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit. --->READ MORE HERE

