The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

[–] [email protected] -3 points 3 weeks ago (3 children)

Yeah, those last replies are where I, as a juror, would say pay the family. It's make-believe and everything, but you're also intending to make things as real as possible, and AI only sounds real. It has limited memory and no empathy (it takes words at face value instead of reading between the lines). If this were some cosplayer on Twitch, they would've clued into his emotional state and tried to talk him down.

Not to say the parents have no blame here. Having an unsecured gun in a house with a child going through therapy is unconscionable.

[–] [email protected] 4 points 3 weeks ago (2 children)

You would pay the family that provided him with the means to kill himself?

They actually should be held accountable.

[–] [email protected] 1 point 3 weeks ago (1 child)

Multiple parties can be guilty at the same time. Negligence from the parents shouldn't mean the website gets off scot-free. Award the money to a suicide prevention organization for all I care, but they need to pay up.

[–] [email protected] 2 points 3 weeks ago

At the moment, the party with the most blame, the parents (esp. the stepfather), is the one getting away scot-free, and they are suing somebody else for money and perhaps also to shape the narrative.

It's probably smart; most people are probably not contemplating whether the parents were at any fault for the suicidal tendencies of the child. It's all conveniently blamed on the moral panic du jour.

Limits on AI should be set by laws and regulations, not judicial decisions or, even worse, a possible settlement.