this post was submitted on 12 Apr 2024
1001 points (98.5% liked)

Technology

[–] [email protected] 4 points 7 months ago (1 children)

In your analogy, a proposed regulation would simply require the book in question to disclose that it's endorsed by a Nazi. We may not be inclined to change our views because of an LLM like this, but you have to consider a future world where these things are commonplace.

There are certainly people out there dumb enough to adopt views without considering their origins.

[–] [email protected] 1 points 7 months ago (1 children)

They are commonplace now. At least 3 people I work with always have a ChatGPT tab open.

[–] [email protected] 2 points 7 months ago (1 children)

And you don't think those people might be upset to discover that something like this post was injected into their conversations beforehand, without their knowledge?

[–] [email protected] 1 points 7 months ago (1 children)

No. I don't think anyone who goes to Gab looking for a neutral LLM would be upset to find Nazi shit on Gab.

[–] [email protected] 2 points 7 months ago (1 children)

You think this is confined to Gab? You seem to be treating this example as the only one that could ever exist.

Your argument that there isn't anyone out there who could ever be offended or misled by something like this is both presumptuous and quite naive.

What happens when LLMs become widespread enough that they're used in schools? We already have a problem, for instance, with young boys deciding to model themselves and their world view after figureheads like Andrew Tate.

In any case, if the only thing you have to contribute to this discussion boils down to "nuh uh won't happen" then you've missed the point and I don't even know why I'm engaging you.

[–] [email protected] 0 points 7 months ago (1 children)

You have a very poor opinion of people

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago) (1 children)

You have a very lofty misconception about people.

I gave you reasoning and a real world example of a vulnerable demographic. You have given me an anecdote about your friends and a variation of "nuh uh" over and over.

[–] [email protected] 1 points 7 months ago

No you didn't. You mentioned some rapist in jail