this post was submitted on 07 Jun 2024
185 points (96.5% liked)

A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running the open-source Apollo driving platform from Chinese web giant Baidu into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.
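To make the attack surface concrete, here is a toy sketch of why multi-sensor fusion can fail this way. This is an illustrative example only, not Apollo's actual fusion algorithm: the sensor names, confidence values, and the 0.5 threshold are all assumptions made up for this sketch. The point is that if fused detections combine per-sensor confidences, an attacker who degrades one sensor's reading can push the fused result below the decision threshold even while another sensor still sees the obstacle.

```python
# Toy confidence-averaging fusion rule (illustrative only; real AV
# stacks use far more sophisticated probabilistic fusion).

def fused_obstacle_detected(confidences, threshold=0.5):
    """Fuse per-sensor obstacle confidences by simple averaging."""
    return sum(confidences.values()) / len(confidences) >= threshold

# Normal case: camera and lidar both report the obstacle.
readings = {"camera": 0.9, "lidar": 0.8}
print(fused_obstacle_detected(readings))   # True: (0.9 + 0.8) / 2 = 0.85

# Attacked case: spoofing suppresses the lidar confidence, and the
# average drops below threshold even though the camera still sees it.
attacked = {"camera": 0.9, "lidar": 0.05}
print(fused_obstacle_detected(attacked))   # False: (0.9 + 0.05) / 2 = 0.475
```

A naive average like this is exactly what fusion designers try to avoid; the research shows that even more careful fusion schemes can be steered into analogous blind spots.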

[–] [email protected] 21 points 5 months ago (7 children)

Because humans have more accountability. Also it has implications for military/police use of self-guided stuff.

[–] [email protected] 0 points 5 months ago (6 children)

What is the purpose of accountability other than to force people to do better? If the lack of accountability doesn't stop a computer from outperforming a human, why worry about it?

[–] [email protected] 11 points 5 months ago* (last edited 5 months ago) (5 children)

The lack of accountability means that there is nothing and no one to take responsibility when the robot/computer inevitably kills someone. A human can face legal ramifications for their actions; the companies that make these computers have so far shown that they are exempt from such consequences.

[–] [email protected] 3 points 5 months ago (1 children)

That is true for most current "self driving" systems, because they are all just glorified assist features. Tesla is massively misleading its customers with its advertising, but on paper it's very clear that the car will only assist in safe conditions; the driver needs to be able to react immediately at all times and is therefore also liable.

However, Mercedes (I think it was them) has started to roll out a feature where they will actually take responsibility for any accidents that happen due to this system. For now it's restricted to nice weather and a few select roads, but the progress is there!

[–] [email protected] 3 points 5 months ago

The driverless robo-taxis are also a concern. When one of them killed someone in San Francisco there was not a clear responsible entity to charge with the crime.
