this post was submitted on 28 Feb 2024
82 points (94.6% liked)

Technology


Google CEO says Gemini AI diversity errors are ‘completely unacceptable’::In an internal memo, Google CEO Sundar Pichai told employees that the historically inaccurate photos generated by the company’s Gemini AI were ‘completely unacceptable.’

top 22 comments
[–] [email protected] 47 points 8 months ago (2 children)

if you hire QA people to do QA and pay them to do QA, this won’t happen.

but that’s not cost saving, is it?

[–] [email protected] 28 points 8 months ago

It's okay. He sent an email. That'll make up for laying off QA staff, surely. /s

[–] [email protected] 5 points 8 months ago (1 children)

They had QA people; this was actually intended behavior

[–] [email protected] 10 points 8 months ago* (last edited 8 months ago) (1 children)

no? alphabet lost $90bil in value over this; why on earth would an entity choose to make a design decision that breaks investor confidence?

the initial choice to make the AI “diverse,” sure, that’s intentional. but the method by which it was done had unintentional consequences, and that is the problem which QA would have fixed.

[–] [email protected] 12 points 8 months ago (1 children)

They got the result they wanted; they just didn't get the response they wanted from the public. There's no way to QA this.

[–] [email protected] 9 points 8 months ago* (last edited 8 months ago) (1 children)

there’s no way to QA public response

https://en.m.wikipedia.org/wiki/Focus_group

there are literally dozens of ways marketers have come up with to do exactly what you are claiming is impossible 🙃

[–] [email protected] 2 points 8 months ago

It's not the QA team's job to run a focus group. Ideally, you decide what customers want before building a product.

[–] [email protected] 27 points 8 months ago (2 children)

Well, Americans think Jesus was white with blue eyes. So it was just being human.

[–] [email protected] 4 points 8 months ago (2 children)
[–] [email protected] 8 points 8 months ago (1 children)

Have you ever seen a depiction of Jesus in a church in North America? That dude doesn't look like he's from the Middle East at all.

[–] [email protected] 3 points 8 months ago

"The artist feels this way"

[–] [email protected] 4 points 8 months ago

I would guess the vast majority, but I have no data to back that up.

I mean, that's how Western depictions of him are 99% of the time, so intuitively it's true, but I'm not sure.

[–] [email protected] 3 points 8 months ago

"Americans"

[–] [email protected] 17 points 8 months ago

I shrug at the whole thing... we spent a decade bitching about real-world data being racist.

When we put in fake data, we got fake results.

Cue outrage.

[–] [email protected] 11 points 8 months ago

I'm not sure the goal of txt2img models is supposed to be historically accurate photos...

[–] [email protected] 9 points 8 months ago

Faux outrage from white nationalists who are hopped up on replacement theory algorithms. Nothing to see here.

[–] [email protected] 5 points 8 months ago (1 children)

We want diversity... Wait no not like that!

[–] [email protected] 4 points 8 months ago (1 children)

Yep. Black Nazis are not weirder than black vikings or black people in Arthurian Britain. Or European main characters in samurai-themed movies, God forbid.

Anyway, an image generation tool should be orthogonal to this; it generates what's asked of it.

[–] [email protected] 1 points 8 months ago (1 children)

I'm not disagreeing, just providing (I hope) interesting context.

Black Nazis

At the time, there were a few Afro-Germans, obviously hated by the nazis. A few were children of individual travelers, such as diplomats; Hans-Jürgen Massaquoi is one example. More were children of French colonial soldiers. They were denigrated as "Rhineland bastards" by racists/nationalists even before the nazis rose. Obviously, they were persecuted and subjected to forced sterilization. Perhaps surprisingly, they were never systematically murdered.

There could not be such a thing as a black nazi. However, a few "mixed race" children (African and Asian) were conscripted into the Wehrmacht.

One should also bear in mind that the term Aryan existed before the nazis and was used by white supremacists in the US even pre-WW1. The nazis stopped using it in the last years of their rule.

black vikings

Black vikings are historically quite possible. In Western Europe and the English-speaking world, the vikings are known as raiders and conquerors. But they were also merchants and mercenaries. An archeological dig near today's Stockholm found a viking-age Buddha statue from India. "Vikings" served as the Varangian Guard in today's Istanbul. People also travelled the other way, of course, notably Ahmad ibn Fadlan.

There's nothing particularly implausible about a dark-skinned African travelling north to the Bosporus, meeting vikings, and travelling with them to their homeland to go on raids. It's not like we would expect to know if this happened once or twice during the viking age.

Arthurian Britain

I don't know much about this period. It was basically just after the Roman presence had ended. The Roman Empire included northern Africa, so thinking of black people in the Roman legions makes perfect sense. After the decline/fall of the empire and the disruption of trade routes, the presence of dark-skinned people in Britain becomes rather less plausible.

[–] [email protected] 2 points 8 months ago

This is interesting, I agree, but, first,

1 - I know that, but the images in question had SS symbols on helmets;

2 - I know that too, and it's likely to have happened once or twice or a dozen times, but I think no genetic or archeological trace of something like this definitely happening has yet been found;

3 - now here I think I've read about some genetic traces found even;

... and, second, it's more about what feels normal as your first association.

Say, when you think of a universally gifted person, Dolph Lundgren won't quite fit your stereotype.

[–] [email protected] 1 points 8 months ago

This is the best summary I could come up with:


The historically inaccurate images and text generated by Google’s Gemini AI have “offended our users and shown bias,” CEO Sundar Pichai told employees in an internal memo obtained by The Verge.

Last week, Google paused Gemini’s ability to generate images after it was widely discovered that the model generated racially diverse Nazi-era German soldiers, US Founding Fathers who were non-white, and even inaccurately portrayed the races of Google’s own co-founders.

While Google has since apologized for “missing the mark” and said it’s working to re-enable image generation in the coming weeks, Tuesday’s memo is the first time the CEO has widely addressed the controversy.

In the memo, which was first reported by Semafor, Pichai says the company has “been working around the clock” to address “problematic text and image responses in the Gemini app.” He doesn’t say that Google has fixed the problem.

“No AI is perfect, especially at this emerging stage of the industry’s development, but we know the bar is high for us and we will keep at it for however long it takes,” he writes.

You can read Sundar Pichai’s full memo to Google employees below:


The original article contains 189 words, the summary contains 189 words. Saved 0%. I'm a bot and I'm open source!

[–] [email protected] 0 points 8 months ago

That’s what you consider unacceptable about it? Really? Just that one thing?