this post was submitted on 22 Feb 2024
488 points (96.2% liked)

Technology


Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

[–] [email protected] -1 points 8 months ago* (last edited 8 months ago) (1 children)

It’s putting human biases on full display, on a grand scale.

The skin color of people in images doesn't matter that much.

The problem is that these AI systems have subtler biases, ones that aren't easily revealed by simple prompts and amusing images, and these systems are already being put to work making decisions who knows where.
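As a minimal sketch of the kind of subtle bias being described (a hypothetical toy example, not any real system): a decision rule can avoid ever looking at group membership and still produce very different outcomes per group, because it keys on a feature that happens to correlate with group membership. A simple audit of approval rates across groups surfaces it:

```python
# Toy bias audit: the rule below never mentions "group", yet it
# approves the two groups at very different rates because postcode
# is correlated with group membership in the synthetic data.

def approve(income, postcode):
    # Looks neutral: approve anyone with income >= 30,
    # or anyone living in postcode 1.
    return income >= 30 or postcode == 1

# Synthetic applicants: (group, income, postcode).
# Group A mostly lives in postcode 1, group B in postcode 2,
# with identical income distributions.
applicants = [
    ("A", 25, 1), ("A", 28, 1), ("A", 25, 2), ("A", 31, 1),
    ("B", 25, 2), ("B", 28, 2), ("B", 25, 2), ("B", 31, 2),
]

def approval_rate(group):
    members = [(inc, pc) for g, inc, pc in applicants if g == group]
    return sum(approve(inc, pc) for inc, pc in members) / len(members)

print(approval_rate("A"))  # 0.75 -- three of four approved
print(approval_rate("B"))  # 0.25 -- only the high earner approved
```

Nothing here would show up by prompting the system about race directly; the disparity only appears when you compare outcome rates across groups.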

[–] [email protected] 9 points 8 months ago (1 children)

In India they’ve been used to determine whether people should be kept on or kicked off of programs like food assistance.

[–] [email protected] -1 points 8 months ago* (last edited 8 months ago)

Well, humans are similar to pigs in the sense that they'll always find the stinkiest pile of junk in the area and taste it before any alternative.

EDIT: That's about the popularity of "AI" today, not about semantic expert systems like the ones people once built on Lisp machines.