this post was submitted on 05 Nov 2023
-1 points (48.4% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask whether your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago

cross-posted from: https://lemmy.ml/post/7481270

Automated image generators are often accused of spreading harmful stereotypes, but studies usually only look at MidJourney. Other tools make serious efforts to increase diversity in their output, but effective remedies remain elusive.

top 4 comments
[email protected] 7 points 1 year ago

If you use Google Images to do basically the same searches, you get the same diversity issues. It’s reflecting the training data, and the larger world by extension. Whatever they would have us do to fix that must be applied to reality before it can or should be artificially skewed in AI models, because if you bias the model to compensate, you will create a worse bias: one that was intentional.

Even if you don’t agree with that take, have a look at the Firefly example. They asked for a trucker named Paul, and they got a woman in the result set. Maybe somewhere out there exists a woman trucker named Paul, but it’s a clear reduction in accuracy and quality because Adobe attempted to inject artificial diversity.

[email protected] 7 points 1 year ago

Yes, but on the other hand biasing the models could be a way to influence reality.

[email protected] 2 points 1 year ago

Could be, maybe. Or maybe not. But one thing is for sure: forcing diversity reduces the quality of the model.

[email protected] 3 points 1 year ago

Yeah, AI-generated images reflect various biases from the training data:

  • Who engages in activity X the most

  • And photographs themselves

  • And posts those photographs online

  • And labels them in a way that an AI might correlate them

If most champagne pictures are selfies taken by white girls, and most waffle-eating pictures show black families (something I actually ran across earlier), that's going to be the bias in the AI images as well.
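The mechanism this comment describes can be sketched with a toy simulation. All of the subjects, demographic labels, and the 80/20 and 70/30 skews below are invented for illustration; the point is only that a generator which samples from the empirical caption distribution reproduces whatever imbalance the training data contained.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical toy "training set": each record is (subject, demographic)
# as captioned online. The skew is invented purely for illustration.
training_data = (
    [("champagne", "white woman")] * 80 + [("champagne", "other")] * 20 +
    [("waffles", "black family")] * 70 + [("waffles", "other")] * 30
)

def conditional_counts(data, subject):
    """Empirical distribution of demographics given a subject, as raw counts."""
    return Counter(demo for subj, demo in data if subj == subject)

def generate(data, subject, n=1000):
    """Sample n 'generated images' from the empirical caption distribution."""
    population = list(conditional_counts(data, subject).elements())
    return Counter(random.choice(population) for _ in range(n))

print(conditional_counts(training_data, "champagne"))  # skew in the data...
print(generate(training_data, "champagne"))            # ...shows up in the output
```

Nothing in the sampler is prejudiced; the output skew comes entirely from who photographed, posted, and labeled the training images, which is the commenter's point.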