this post was submitted on 28 Oct 2023
166 points (97.7% liked)
Technology
you are viewing a single comment's thread
But is this actually true? I hate that they just printed this without any attempt to verify it. Surely some independent body has looked into this by now.
I suspect there is something more to this than just that. After all, the car in question did this:
It seems like there are unsolvable safety problems going on.
Yes, the car does not appear to have safety features that let it know when a body is caught underneath, but it did try to get out of traffic after the collision.
Since this never happens to human drivers, that means autonomous cars are unfeasible.
Or it is an opportunity to add some additional sensors underneath that will make it miles better than human drivers.
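To make the idea concrete, here is a minimal sketch of what such an interlock might look like. Everything here is hypothetical (the sensor names, thresholds, and readings are invented for illustration, not any real vendor's API): the idea is simply that a contact or clearance anomaly under the chassis should force an immediate stop instead of a pull-over maneuver.

```python
# Hypothetical sketch: an undercarriage safety interlock.
# Sensor names and thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class UndercarriageReading:
    pressure_kpa: float   # contact-pad pressure under the chassis
    clearance_cm: float   # ultrasonic ride-height measurement


def should_emergency_stop(reading: UndercarriageReading,
                          pressure_threshold_kpa: float = 5.0,
                          min_clearance_cm: float = 8.0) -> bool:
    """Return True if something appears trapped under the vehicle:
    either unexpected contact pressure or a sudden loss of ground
    clearance. In that case, stop in place rather than pull over."""
    return (reading.pressure_kpa > pressure_threshold_kpa
            or reading.clearance_cm < min_clearance_cm)
```

The key design point is the last line of the docstring: with a person underneath, dragging the car to the curb is the worst possible response, so this check would override the normal "clear the travel lane" behavior.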
Really, the main problem with autonomous cars at this point in time is a combination of the companies hiding issues and the public expecting perfection. More transparency and third-party comparisons against human drivers would be the best way to both improve automation and gain public trust, since people would actually see how bad human drivers can be.
Also, charge corporations for beta testing on the fucking public... they're using taxpayer-funded roads and putting our lives at risk for their profits. They should share those profits far, far more than they do.
There is no logic handling this issue, and I swear the same problem will recur in almost every self-driving car, for every scenario that hasn't been accounted for.
You would think a self-driving car would have 360 degrees of vision and not run into things, whether it's a firetruck, a cardboard box, or a person. That should be job 1 for self-driving.