this post was submitted on 30 Aug 2023
Technology
I obviously don't know for sure, but it's at least conceivable that erratic behavior by other drivers, provoked by someone driving slower than the surrounding traffic, causes a significant number of accidents every year that would not have happened had that person been driving at the same speed as everyone else.
In this case, forcing the self-driving vehicle to never go over the speed limit means knowingly choosing an option that leads to more people dying rather than fewer.
I think there's a pretty clear moral dilemma here. I'm not claiming to know the right way forward, but I just want to point out that strictly following the rules without exception is not always what leads to the best results. Of course, allowing self-driving cars to break the rules comes with its own issues, but that just further points to the complexity of this problem.
Then again, if that "follow the others" behavior means driving faster, that also leads to accidents. Not so much with the other, frustrated drivers, but with, say, wildlife. People are more often unable to stop in time because of the increased speed and thus the increased braking distance.
That is why bendy, narrow roads have lower speed limits. The limit is a function of the predicted reaction time and the sight distance available.
You can't cheat physics: the more speeding there is, the longer the braking distances, and the more often what would have been a near miss (braking in time) turns into a full-on collision.
So sure, everyone is more in sync, but everyone is in sync with less reaction time available when the unavoidable chaos factor raises its head. Chaos factors like wildlife (which is neither obligated nor inclined to follow traffic rules), or, say, someone blowing a tire, leading to a sudden change in speed and control.
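The braking-distance point can be sketched with basic kinematics: total stopping distance is reaction distance plus braking distance, and the braking part grows with the square of speed. This is a rough illustration, not a traffic model; the 1 s reaction time and 7 m/s² dry-road deceleration are assumed ballpark figures.

```python
def stopping_distance_m(speed_kmh, reaction_s=1.0, decel_ms2=7.0):
    """Reaction distance plus braking distance (v^2 / 2a).

    reaction_s and decel_ms2 are assumed ballpark values:
    ~1 s driver reaction, ~7 m/s^2 deceleration on dry asphalt.
    """
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# 25% more speed costs roughly 45% more stopping distance here:
print(round(stopping_distance_m(80)))   # ~57 m
print(round(stopping_distance_m(100)))  # ~83 m
```

The quadratic term is why even modest speeding eats disproportionately into the safety margin that sight distance provides.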
When a self-driving car drives at or below the speed limit on a fast-moving highway, it can disrupt the natural flow of traffic. This can lead to a higher chance of accidents when other human drivers resort to aggressive maneuvers like tailgating, risky overtaking, or sudden lane changes. I'm not claiming that it does so for a fact, but it is conceivable, and that's the point of my argument.
Now, contrast this with a self-driving car that adjusts its speed to match the prevailing traffic conditions, even if it means slightly exceeding the speed limit. By doing so, it can blend with the surrounding traffic and reduce the chances of accidents. It's not about encouraging speeding but rather adapting to the behavior of other human drivers.
Of course, we should prioritize safety and adhere to traffic rules whenever possible. However, sometimes the safest thing to do might be temporarily going with the flow, even if it means bending the speed limit rules slightly. The paradox lies in the fact that by mimicking human behavior to a certain extent, self-driving cars can contribute to overall road safety. It's a nuanced issue, but it underscores the complexity of integrating autonomous vehicles into a world where human drivers are far from perfect. This would not be an issue if every car were driven by a competent AI and there were no human drivers.