this post was submitted on 08 Oct 2024
136 points (95.9% liked)
Technology
59374 readers
7248 users here now
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each another!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below, to ask if your bot can be added please contact us.
- Check for duplicates before posting, duplicates may be removed
Approved Bots
founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
One of my big worries with the way people are using LLMs is that they're being trained to trust whatever they spit out. Hey Google, what's the nutritional content of peanuts? And people are learning not to ask where the information came from or to check sources.
One of the many reasons this worries me is that very soon these businesses are going to need to recoup the billions they're spending, and I wonder how long until these systems start feeding paid promotions to a population that's been trained to accept whatever they're told. imagine what some businesses, or governments, would pay to have exactly their choice of words produced on demand in response to knowledge queries.
My search engine usage for 25 years has been just me going "yeah right" and changing the query to make it better. But I'm wired to distrust what I feel is bullshit, and I've experienced not many people are.