this post was submitted on 30 Aug 2023
202 points (93.2% liked)
I'm not claiming it is so, but it's conceivable that an autonomous vehicle driving slightly over the speed limit, with the flow of traffic, could actually produce a statistically significant drop in accidents compared to one that follows the speed limit exactly. Yes, no one is forcing other drivers to behave that way, but they do, and because of that, people die. In that case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean choosing to kill and injure more people.
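To make the intuition concrete, here's a toy sketch (not evidence, and every number in it is hypothetical): it assumes crash risk grows with the squared speed difference between a vehicle and the surrounding traffic, then compares an ego car holding a 65 mph posted limit against one matching a 72 mph traffic flow.

```python
import random

random.seed(0)

# Toy model (illustrative only): relative-speed differences drive conflict
# risk, so we score an ego speed by its mean squared deviation from the
# speeds of surrounding vehicles. All parameters are made up.
def mean_sq_deviation(ego_speed, traffic_speeds):
    return sum((s - ego_speed) ** 2 for s in traffic_speeds) / len(traffic_speeds)

# Hypothetical highway: posted limit 65 mph, traffic flowing around 72 mph
# with some spread between drivers.
traffic = [random.gauss(72, 5) for _ in range(10_000)]

risk_at_limit = mean_sq_deviation(65, traffic)   # strictly obey the limit
risk_with_flow = mean_sq_deviation(72, traffic)  # match the flow

print(f"risk proxy at 65 mph: {risk_at_limit:.1f}")
print(f"risk proxy at 72 mph: {risk_with_flow:.1f}")
```

Under these assumptions the flow-matching car scores lower on the risk proxy, simply because it minimizes relative speed to everyone around it. Whether real crash risk behaves like this is exactly the empirical question the comment is raising.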
I don't think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other cases where the genuinely safer thing to do is not what you'd intuitively expect.