this post was submitted on 24 Aug 2023
189 points (93.2% liked)


Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists::Researchers call for tighter regulations following major age- and race-based discrepancies in AI autonomous systems.

all 46 comments
[–] [email protected] 24 points 1 year ago* (last edited 1 year ago) (2 children)

I hate all this bias bullshit because it makes the problem look bigger than it actually is and gives the general public the wrong idea.

A pedestrian detection system's goal shouldn't be to detect different skin tones and pedestrian sizes equally. There's no benefit in that. It should do the best it can to reduce the false negative rate of pedestrian detection overall, and hopefully do better than human drivers in the majority of scenarios. The error rates will differ between groups due to the very nature of the task, and that's ok.

This is what actually happens in research for the most part, but the media loves to stir up polarization and the public rewards it with clicks. Pushing for a "reduced bias model" is actually detrimental to overall performance, because it incentivizes the development of models that perform worse in scenarios where they could have an edge, just to serve an artificial demand for reduced bias.
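
To make the metric concrete, here's a minimal sketch (group labels and outcomes are entirely invented) of the difference between the overall false negative rate and per-group false negative rates for a detector:

```python
# Hypothetical illustration: overall vs. per-group false negative rate (FNR)
# for a pedestrian detector. All groups and outcomes are made up.
detections = [
    # (group, was_detected) for pedestrians the system should have found
    ("adult_light", True), ("adult_light", True), ("adult_light", False),
    ("adult_dark", True), ("adult_dark", False), ("adult_dark", False),
    ("child", True), ("child", False), ("child", False),
]

def fnr(samples):
    """Fraction of real pedestrians the detector missed."""
    return sum(1 for _, detected in samples if not detected) / len(samples)

print(f"overall FNR: {fnr(detections):.2f}")
for group in sorted({g for g, _ in detections}):
    subset = [s for s in detections if s[0] == group]
    print(f"{group:12s} FNR: {fnr(subset):.2f}")
```

Optimizing the first number is the safety goal; forcing the per-group numbers to be equal is the "reduced bias" goal, and they can pull in different directions.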

[–] [email protected] 12 points 1 year ago (2 children)

I think you're misunderstanding what the article is saying.

You're correct that it isn't the job of a system to detect someone's skin color, and judge those people by it.

But the fact that AVs detect dark-skinned people and short people less effectively is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

The staff are designing the AVs to safely navigate a world of people like themselves, but when the staff are overwhelmingly male, light-skinned, young, single, urban, and based in the United States, a lot of considerations don't even cross their minds.

Will the AVs recognize female pedestrians?

Do the sensors sense light spectrum wide enough to detect dark skinned people?

Will the AVs recognize someone with a walker or in a wheelchair, or some other mobility device?

Toddlers are small and unpredictable.

Bicyclists can fall over at any moment.

Are all these AVs being tested in cities exposed to the animals they might encounter in rural areas, like sheep, llamas, otters, alligators, and anything else that might be in the road?

How well will AVs tested in urban areas fare on mountain roads that suddenly change from multi-lane asphalt to narrow, twisty dirt?

Will they recognize tractors and other farm or industrial vehicles on the road?

Will they recognize something you only encounter in a foreign country, like an elephant, an orangutan, or a rickshaw? And what are they going to do if they come across that tomato festival in Spain?

Engineering isn't magical: It's the result of centuries of experimentation and recorded knowledge of what works and doesn't work.

Releasing AVs on the entire world without testing them on every little thing they might encounter is just asking for trouble.

What's required for safe driving without human intelligence is more mind-boggling the more you think about it.

[–] [email protected] 20 points 1 year ago (4 children)

But the fact that AVs detect dark skinned people and short people at a lower effectiveness is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

No, it isn't. It's a product of the fact that darker-skinned people are darker and children are smaller. Human drivers have a harder time seeing these individuals too: they literally send less data to the camera sensor. This is why people wear reflective vests for safety at night, and why ninjas dress in black.

[–] [email protected] 3 points 1 year ago

This is true, but Tesla and others could compensate for it by spending more time and money training on those form factors, something humans can't really do. It's an opportunity for them to prove the superhuman capabilities of their systems.

[–] [email protected] 3 points 1 year ago

That doesn't make it better.

It doesn't matter why they are bad at detecting X, it should be improved regardless.

Also, maybe lidar would be a better idea.

[–] [email protected] 3 points 1 year ago

They literally send less data to the camera sensor.

So maybe let's not limit ourselves to hardware that can't easily differentiate, when there is other hardware, or combinations of hardware, that can do a better job of it?

Humans can't really get better eyes, but we can use more appropriate hardware in machines to accomplish the task.

[–] [email protected] -1 points 1 year ago

That is true. I almost hit a dark-skinned guy, wearing black, who was crossing a street with no streetlight at night as I turned onto it. Almost gave me a heart attack. It's bad enough almost getting hit, as a white guy, when I cross a street with a streetlight.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

These are important questions, but addressing them independently for each model and optimizing for low "racial bias" is the wrong approach.

In academia we have reference datasets that serve as standard benchmarks for data-driven prediction models like pedestrian detectors. The numbers obtained on these datasets are usually the reference points used when comparing different models. By building comprehensive datasets we get models that work well across a multitude of scenarios.

Those are all good questions, but they need to be addressed when building such datasets. Whether model M performs X% better at detecting people of one skin color is not relevant, as long as the error rate for every skin color stays within an acceptable range.
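
A minimal sketch of that acceptance criterion (the ceiling and the rates below are hypothetical): no group's error rate may exceed an absolute limit, regardless of how the groups compare to each other.

```python
# Hypothetical acceptance check: every subgroup's false negative rate must
# stay under an absolute ceiling. All numbers are invented for illustration.
MAX_ACCEPTABLE_FNR = 0.05

per_group_fnr = {"adult_light": 0.031, "adult_dark": 0.042, "child": 0.048}

acceptable = all(rate <= MAX_ACCEPTABLE_FNR for rate in per_group_fnr.values())
print("model acceptable:", acceptable)  # True: every group is under the ceiling
```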

[–] [email protected] -2 points 1 year ago

The media has become ridiculously racist; they go out of their way to make every incident appear racial now.

[–] [email protected] 23 points 1 year ago (2 children)

Isn't that true for humans as well? I know I find it harder to see children due to their small size, and dark-skinned people at night due to, you know, low contrast (especially if they're wearing dark clothes).

Human vision be racist and ageist

Ps: but yes, please do improve the algorithms

[–] [email protected] 7 points 1 year ago

Part of the children problem is distinguishing between 'small' and 'far away'. Humans seem reasonably good at it, but from what I've seen AIs aren't there yet.

[–] [email protected] 3 points 1 year ago

Yeah, this probably accounts for 90% of the issue.

[–] [email protected] 16 points 1 year ago

Easy solution is to enforce a buddy system: every black person walking alone at night must be accompanied by a white person. /s

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago)

DRIVERLESS CARS: We killed them. We killed them all. They're dead, every single one of them. And not just the pedestmen, but the pedestwomen and the pedestchildren, too. We slaughtered them like animals. We hate them!

[–] [email protected] 7 points 1 year ago

Probably could have stopped that headline at the third word.

[–] [email protected] 6 points 1 year ago (2 children)

Maybe if we just, I dunno, funded more mass transit and made it more accessible? Hell, trains are way better at being automated than any single car.

[–] [email protected] -1 points 1 year ago (1 children)

Yes, but also improve detection of kids and dark-skinned people; these tools aren't just for driving cars. Efficient, fast, and accurate people detection and tracking can be used in a myriad of other things.

Imagine a system that tracks the number of people in different sections of a store, or one that counts the people going in and out of stores to control how many are inside... There are already a lot of tools that do this, and they work somewhat reliably, but they can be improved, and the models being developed for cars will then be reused. R&D is a good thing.

[–] [email protected] 1 points 1 year ago

An AI that can follow black people around a store? You might be onto something.

[–] [email protected] -3 points 1 year ago (2 children)

The trains in California are trash. I'd love to see good ones, but this isn't even a thought in the heads of those who run things.

Dreaming is nice... but reality sucks, and we need to deal with it. Self-driving cars are a wonderful answer, but Tesla is fucking it up for everyone.

[–] [email protected] 1 points 1 year ago

Strongly disagree. Trains are nice everywhere in the world. There’s no reason they can’t be nice in the US. Cars are trash. Strip malls are trash. Giant parking lots are trash. The sky high cost of cars is trash. The environmental impact of cars is trash. The danger of cars is trash. Car centric urban planning is trash.

Self-driving cars are safer… than the most dangerous thing ever. But because cars are inherently so dangerous, they are still more dangerous than just about any other mode of transportation.

Dreaming is nice, but that’s all self-driving cars are right now. I don’t see why we don’t have better dreams.

[–] [email protected] 0 points 1 year ago

Trains in California suck because of government dysfunction across all levels. At the municipal level, you can't build shit because every city is actually an agglomeration of hundreds of tiny municipalities that all squabble with each other. At the regional level, you get NIMBYism that doesn't want silly things like trains knocking down property values... And these people have a voice, because democracy I guess (despite there being a far larger group of people that would love to have trains). At the state level, you have complete funding mismanagement and project management malfeasance that makes projects both incredibly expensive and developed with no forethought whatsoever (Caltrain has how many at-grade crossings, again?).

This isn't a train problem, it's a problem with your piss-poor government. At least crime is down, right?

[–] [email protected] 5 points 1 year ago (1 children)

A single FLIR camera would help massively. They don't care about colour or height, only temperature.

[–] [email protected] 2 points 1 year ago

I could make a warm water balloon in the shape of a human and it would stop the car then. Maybe a combination of various types of technologies? You'd still have to train the model on all kinds of humans, though.
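
A toy sketch of the kind of cross-checking fusion rule being suggested (the detection structure and distance threshold are entirely hypothetical): only keep a camera detection when a thermal detection roughly agrees on position, so neither a warm balloon nor a low-contrast camera miss decides alone.

```python
# Toy late-fusion rule: a visual detection counts only if a thermal
# detection confirms it nearby. Structures and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float  # position in a shared ground-plane frame, metres
    y: float
    confidence: float

def fuse(visual: list[Detection], thermal: list[Detection],
         max_dist: float = 0.5) -> list[Detection]:
    """Keep only visual detections confirmed by a nearby thermal detection."""
    return [
        v for v in visual
        if any(abs(v.x - t.x) < max_dist and abs(v.y - t.y) < max_dist
               for t in thermal)
    ]
```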

[–] [email protected] 4 points 1 year ago

Cars should be tested for safety in collisions with children, and it should affect their safety rating and taxes. Driverless equipment shouldn't be allowed on the road until these sorts of issues are resolved.

[–] [email protected] 3 points 1 year ago (2 children)

I'd assume that's either due to bias in the training set or poor design choices. The former is already a big problem in facial recognition, and can't really be fixed unless we update the datasets. With the latter, this could be using things like visible light for classification, where the contrast between target and background won't necessarily be the same for all skin tones and times of day. Cars aren't limited by DNA to only grow a specific type of eye, and you can still create training data from things like infrared or LIDAR. In either case, it goes to show how important it is to test for bias in datasets and deal with it before actually deploying anything...
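
To make "test for bias in datasets" concrete, a minimal sketch (the group labels are hypothetical annotations; a real dataset would need that metadata to exist) of the kind of representation audit you'd run before training:

```python
# Cheap dataset audit: count how subgroups are represented in the training
# annotations before any model is trained. Labels here are invented.
from collections import Counter

annotations = ["adult_light", "adult_light", "adult_dark", "child",
               "adult_light", "adult_light", "adult_dark", "adult_light"]

counts = Counter(annotations)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group:12s} {n:3d} ({n / total:.0%})")  # skewed counts = skewed model
```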

[–] [email protected] 5 points 1 year ago

In this case it's likely partly a signal-to-noise problem that can't be mitigated easily. Both children and dark-skinned people produce less signal for a camera because they reflect less light: children because they're smaller, and dark-skinned people because their skin tones are darker. This will cause issues in the stereo vision algorithms that find objects and estimate their distance. Lidar would solve the issue, but companies don't want to use it because lidars with a fast enough update rate and high enough resolution for safe highway driving are prohibitively expensive for a passenger vehicle ($60k+ for the sensor alone).
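
For anyone curious what "less signal breaks stereo" means in practice, a minimal sketch with OpenCV's block matcher (the file names are placeholders): disparity comes from matching local texture between the two cameras, and low-contrast regions often produce no valid match at all.

```python
# Stereo block-matching sketch: low-contrast regions (dark clothing against
# a dark background) often yield no usable disparity, and with no disparity
# there is no depth estimate. File names are placeholders.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)  # fixed-point; invalid pixels are negative

print(f"pixels with a usable disparity: {(disparity > 0).mean():.0%}")
# depth = focal_length_px * baseline_m / disparity -- no match, no depth.
```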

[–] [email protected] 2 points 1 year ago

Darker-toned people are harder to detect because they reflect less light. The tiny, cheap sensors on these cameras don't have enough aperture for low-light detection. It's not training that's the problem, it's hardware.

[–] [email protected] 3 points 1 year ago (2 children)

This has been the case with pretty much every single piece of computer-vision software to ever exist....

Darker individuals blend into dark backgrounds better than lighter-skinned individuals do, and dark backgrounds are more common than light ones, i.e., the absence of sufficient light is more common than 24/7 well-lit environments.

Obviously computer vision will struggle more with darker individuals.

[–] [email protected] 0 points 1 year ago

Visible light is racist.

[–] [email protected] -5 points 1 year ago* (last edited 1 year ago) (1 children)

If the computer vision model can't detect edges around a human-shaped object, that's usually a dataset issue or a sensor (data collection) issue... And it sure as hell isn't a sensor issue because humans do the task just fine.

[–] [email protected] 3 points 1 year ago

Which cars are equipped with human eyes for sensors?

[–] [email protected] 2 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago

Racist lighting!

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

The study only used still images and the image recognition system, so its findings will only be accurate for self-driving systems that operate purely on image recognition. The only one that does that currently is Tesla, AFAIK.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

Weird question, but why does a car need to know whether it's a person or not? Regardless of whether it's a person, a car, or a pole, maybe don't drive into it?

Is it about predicting whether it's going to move into your path? Can't you just use LIDAR to detect a moving object and predict its path? Why does it matter if it's a person?

Is it about trolley probleming situations so it picks a pole instead of a person if it can't avoid a crash?

[–] [email protected] 0 points 1 year ago (1 children)

Conant and Ashby’s good regulator theorem in cybernetics says, “Every good regulator of a system must be a model of that system.”

The AI needs an accurate model of a human to predict how humans move. Predicting the path of a human is different than predicting the path of other objects. Humans can stand totally motionless, pivot, run across the street at a red light, suddenly stop, fall over from a heart attack, be curled up or splayed out drunk, slip backwards on some ice, etc. And it would be computationally costly, inaccurate, and pointless to model non-humans in these ways.
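
To illustrate, a minimal sketch (all numbers invented) of the generic motion model that works tolerably for most rigid objects and is exactly what pedestrians violate:

```python
# Constant-velocity extrapolation: fine for a rolling ball or a coasting
# car, useless for a person who can stand still and then sprint.
import numpy as np

def predict_constant_velocity(track: np.ndarray, steps: int) -> np.ndarray:
    """Extrapolate the last observed velocity `steps` frames ahead."""
    velocity = track[-1] - track[-2]
    return np.array([track[-1] + (i + 1) * velocity for i in range(steps)])

# A pedestrian observed standing at the kerb for two frames...
observed = np.array([[5.0, 0.0], [5.0, 0.0]])
print(predict_constant_velocity(observed, 3))
# ...is predicted to stay put forever; the model assigns no chance of them
# stepping into the road. A human-specific model has to do better.
```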

I also think trolley problem considerations come into play, but more like normativity in general. The consequences of driving quickly amongst humans is higher than amongst human height trees. I don’t mind if a car drives at a normal speed on a tree lined street, but it should slow down on a street lined with playing children who could jump out at anytime.

[–] [email protected] -2 points 1 year ago

Anyone who quotes Ashby et al gets an upvote from me! I'm always so excited to see cybernetic thinking in the wild.

[–] [email protected] 0 points 1 year ago (1 children)

I'm no expert, but perhaps a thermal camera plus a lidar sensor could help.

[–] [email protected] -1 points 1 year ago

It's amazing Elon hasn't figured this out. Then again, Steve Jobs said no iPhone would ever have an OLED screen.

We should just assume CEOs are stupid at this point. Seriously. It's a very common trend we all keep seeing. If they prove otherwise, then that's great! But let's start them at "dumbass" and move forward from there.

[–] [email protected] -1 points 1 year ago (1 children)

Okay? It's not like these systems are actually intelligent. Anything different from the majority of cases is going to be at an inherent disadvantage in being detected, right? At the volume of data used for their models, surely it's just a matter of statistics.

Maybe I'm wrong (and I'm surely using the wrong terminology), but it seems like that must be the case. It's not some issue of human racial bias, just a bias based on relative population. Or is my understanding that flawed?

Mind you, I'm not saying it doesn't need to be remedied posthaste.

[–] [email protected] 0 points 1 year ago

Yes, the issue is that the data used to teach the systems what people look like is most likely biased towards white men.

[–] [email protected] -1 points 1 year ago

This is kinda why I dislike cars and self-driving cars. Self-driving cars are made more and more cost-effective through compromises to safety. I feel like the US needs to mandate lidar on anything that has driver-assist features. Self-driving cars have been in the grey area for too long.

[–] [email protected] -3 points 1 year ago

#blacklivesmatter

[–] [email protected] -5 points 1 year ago

Ya know, I'm not surprised that even self-driving cars somehow ended up accidentally racist and wanting to murder children. Even though this is a serious issue, it's still kinda funny in a messed-up way.