
LLMs certainly hold potential, but as we’ve seen time and time again in tech over the last fifteen years, the hype and greed of unethical pitchmen have gotten way out ahead of the actual locomotive. A lot of people in “tech” are interested in money, not tech. And they’re increasingly making decisions based on how to drum up investment bucks, get press attention, and bump stock prices, not on actually improving anything.

The result has been a ridiculous parade of rushed “AI” implementations focused more on cutting corners, undermining labor, or drumming up sexy headlines than on improving lives. The resulting hype cycle isn’t just building unrealistic expectations and tarnishing brands; it’s distracting many tech companies from foundational reality and from more practical, meaningful ideas.

[–] [email protected] 3 points 4 months ago (5 children)

Those mistakes could easily be solved by something that doesn't even need to think. Just add a filter of acceptable orders (something like the sketch below), or hire a low-wage human who does not give a shit about customers' special orders.

In general, AI really needs to learn some boundaries. "No" is a perfectly good answer, but it never gives one, does it?
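A minimal sketch of that filter idea, assuming the LLM's job is reduced to transcribing items and the validation happens outside the model entirely. The menu contents and item names here are made up for illustration:

```python
# Validate whatever the model transcribes against a fixed menu before
# it ever reaches the kitchen. No "thinking" involved.
# Menu and topping names are hypothetical, purely for illustration.

MENU = {"big mac", "fries", "mcflurry", "coke", "nuggets"}
VALID_TOPPINGS = {"mcflurry": {"oreo", "caramel"}}  # note: no bacon

def validate_order(items):
    """Split (item, topping) pairs into accepted and rejected lists."""
    accepted, rejected = [], []
    for item, topping in items:
        if item not in MENU:
            rejected.append((item, topping))            # off-menu item
        elif topping and topping not in VALID_TOPPINGS.get(item, set()):
            rejected.append((item, topping))            # bacon on ice cream
        else:
            accepted.append((item, topping))
    return accepted, rejected

# An LLM-transcribed order with one nonsense addition:
accepted, rejected = validate_order([("big mac", None), ("mcflurry", "bacon")])
print("accepted:", accepted)   # [('big mac', None)]
print("rejected:", rejected)   # [('mcflurry', 'bacon')]
```

The point being: "no" can come from a twenty-line allowlist, not from the model.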

[–] [email protected] 2 points 4 months ago (3 children)

sure it does. it won't tell you how to build a bomb, and it won't demonstrate the explicit biases that have been fine-tuned out of it. the problem is that McDonald's isn't an AI company and is probably just using ChatGPT on the backend, and out of the box, GPT doesn't give a shit about bacon ice cream.
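for what it's worth, even a thin wrapper could add that kind of domain boundary. a hedged sketch using the OpenAI Python SDK; the model name, prompt wording, and the whole setup are assumptions for illustration, not anything McDonald's actually shipped:

```python
# Sketch: bolting a domain boundary onto an off-the-shelf model with a
# system prompt. Model choice and wording are assumptions here, not a
# real drive-thru deployment. Requires OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You take fast-food orders. Accept only items on the posted menu. "
    "If a request is off-menu or nonsensical (e.g. bacon on ice cream), "
    "politely refuse. 'No' is an acceptable answer."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Add bacon to my McFlurry."},
    ],
)
print(response.choices[0].message.content)  # ideally a refusal, not a sale
```

even then, a system prompt is a suggestion rather than a guarantee, which is why the hard allowlist upthread is the safer layer.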

[–] [email protected] 1 points 4 months ago* (last edited 4 months ago) (1 children)

So, what happens if you order a bomb at the McD?

[–] [email protected] -2 points 4 months ago

You get bacon on ice cream.
