this post was submitted on 05 Apr 2024
869 points (96.2% liked)

A shocking story was promoted on the "front page" or main feed of Elon Musk's X on Thursday:

"Iran Strikes Tel Aviv with Heavy Missiles," read the headline.

This would certainly be a worrying world news development. Earlier that week, Israel had conducted an airstrike on Iran's embassy in Syria, killing two generals as well as other officers. Retaliation from Iran seemed plausible.

But, there was one major problem: Iran did not attack Israel. The headline was fake.

Even more concerning, the fake headline was apparently generated by X's own official AI chatbot, Grok, and then promoted by X's trending news product, Explore, on the very first day of an updated version of the feature.

[–] [email protected] 10 points 7 months ago

I love that example. Microsoft's Copilot (based on GPT-4) immediately doesn't disappoint:

Microsoft Copilot: Two pounds of feathers and a pound of lead both weigh the same: two pounds. The difference lies in the material—feathers are much lighter and less dense than lead. However, when it comes to weight, they balance out equally.

It's annoying that for many things, like basic programming tasks, it manages to generate reasonable output that is good enough to goad people into trusting it, yet it hallucinates very obviously wrong stuff or follows completely insane approaches on anything off the beaten path. Every other day I have to spend an hour justifying to a coworker why I wrote code a certain way when the AI has given him another "great" suggestion, like opening a hidden window with a UI control to query a database instead of going through our ORM.
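
To make that contrast concrete, here's a minimal sketch of the two approaches, assuming SQLAlchemy as the ORM; the Customer model, the in-memory SQLite engine, and the queries are made-up illustrations, not our actual code:

```python
# Made-up illustration: the same lookup done through the ORM versus a
# hand-written query that bypasses it. Model and data are hypothetical.
from sqlalchemy import String, create_engine, select, text
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Customer(Base):
    __tablename__ = "customers"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100))


engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Customer(name="Alice"))
    session.commit()

    # Going through the ORM: typed, parameterized, and consistent with the
    # rest of the model layer.
    via_orm = session.scalars(
        select(Customer).where(Customer.name == "Alice")
    ).all()

    # The kind of shortcut an AI assistant tends to suggest: a raw query
    # that sidesteps the ORM and has to be kept in sync with the schema by hand.
    via_raw_sql = session.execute(
        text("SELECT id, name FROM customers WHERE name = :name"),
        {"name": "Alice"},
    ).all()
```

Both return the same row here, but the first stays inside the abstraction the rest of the codebase already relies on.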