this post was submitted on 27 Apr 2024
884 points (95.8% liked)

[–] [email protected] 23 points 7 months ago (5 children)

Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small number.

I would want an article to find the rate of accidents per 100,000 vehicles, group it by severity, and then compare and contrast that with human-caused accidents.

Because while it's clear by now Teslas aren't the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.

We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by roughly 100 to estimate the total number of car accidents.
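To make the comparison above concrete, here is a back-of-the-envelope sketch using the rough figures from this thread. Every input is an assumption (the 1,000-crash and 5M-Tesla figures are from the comment; the ~280 million registered US vehicles is a rough estimate), not measured data, and it ignores miles driven and crash severity entirely:

```python
# Back-of-the-envelope crash-rate comparison.
# All inputs are rough assumptions from the discussion, not measured data.

def crashes_per_100k(crashes, vehicles):
    """Crash rate normalized per 100,000 vehicles."""
    return crashes / vehicles * 100_000

# Tesla figures cited in the thread: ~1,000 investigated crashes
# across ~5 million vehicles sold over five years.
tesla_rate = crashes_per_100k(1_000, 5_000_000)

# Human-driver figures: ~40,000 US road deaths per year, scaled by
# ~100 to approximate total crashes, spread over an assumed fleet
# of ~280 million registered US vehicles (rough estimate).
human_rate = crashes_per_100k(40_000 * 100, 280_000_000)

print(f"Tesla (as cited): {tesla_rate:.1f} crashes per 100k vehicles")
print(f"Human (rough):    {human_rate:.1f} crashes per 100k vehicles")
```

This is exactly the kind of per-100,000 normalization the comment asks for, but note the denominators aren't comparable: the Tesla number counts investigated crashes over five years, while the human number is a yearly estimate.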

[–] [email protected] 30 points 7 months ago (8 children)

The question isn't "are they safer than the average human driver?"

The question is "who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?"

Because if the answer is "nobody", they shouldn't be on the road. There's zero accountability, and because it's all wibbly-wobbly AI bullshit, there's no way to prove that the issues are actually fixed.

[–] [email protected] 17 points 7 months ago (1 children)

So it's better to put more lives in danger so that there can be someone to blame?

[–] [email protected] 9 points 7 months ago (2 children)

Accountability is important. If a human driver is dangerous, they get taken off the roads and/or sent to jail. If a self-driving car kills somebody, it's just "oops, oh well, these things happen, but shareholders make a lot of money so never mind".

I do not want "these things happen" on my headstone.

[–] [email protected] 4 points 7 months ago

So you would prefer a higher chance of dying, just to be able to write "Joe Smith did it" on it?

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago)

But if a human driver is dangerous and gets put in jail or taken off the roads, there are likely already more dangerous human drivers taking their place. Not to mention, genuine accidents, even horrific ones, do happen with human drivers. If the rate of accidents, and of fatal accidents, with self-driving vehicles is way down versus human drivers, you are actually risking your life more by trusting human drivers. Having someone be accountable for your death doesn't matter if you've already died because of them.

Is it any better if you have "Killed by Bill Johnson's SUV" on your headstone?

[–] [email protected] 12 points 7 months ago (2 children)

The answer is the person behind the wheel.

Tesla makes it very clear to the driver that you still have to pay attention and be ready to take over at any time. Full self driving engages the in-cabin nanny cam to enforce that you pay attention, above and beyond the frequent reminders to apply turning force to the steering wheel.

Now, once Tesla goes Mercedes and says you don't have to pay attention, it's gonna be the company that should step in. I know that's a big old SHOULD, but right now that's not the situation anyway.

[–] [email protected] 6 points 7 months ago

> Now, once Tesla goes Mercedes and says you don't have to pay attention, it's gonna be the company that should step in

That doesn't give me warm and fuzzies either... Imagine a poor dude having to fight Mercedes or Tesla because he was crippled by a sleeping driver and bad AI... Not even counting the lobbying that would certainly happen to reduce and then eliminate their liability.

[–] [email protected] 0 points 7 months ago (1 children)

That's today, because "full self driving" doesn't exist yet. But what about when it does?

[–] [email protected] 6 points 7 months ago

There will be legal battles for sure. I don't know how you can argue for anything besides the manufacturer taking responsibility. I don't know how that doesn't end up with auto pilot fatalities treated as a class where there's a lookup table of payouts though. This is the intersection of liability and money/power, so it's functionally uncharted territory at least in the US.

[–] [email protected] 7 points 7 months ago* (last edited 7 months ago) (2 children)

> The question isn’t “are they safer than the average human driver?”

How is that not the question? That absolutely is the question. Just because someone is accountable for your death doesn't mean you aren't already dead; it doesn't bring you back to life. If a human driver is actively dangerous and gets taken off the road or put in jail, there are very likely already plenty more taking that driver's place. Plus, genuine accidents, even horrific ones, do happen with human drivers. If the death rate for self-driving vehicles is really that much lower, you are risking your life that much more by trusting in human drivers.

[–] [email protected] 5 points 7 months ago (1 children)

Yeah, that person's take seems a little unhinged, as throwing people in prison after a car accident only happens if they're intoxicated or driving recklessly. These systems don't have to be perfect to save lives. They just have to be better than the average driver.

[–] [email protected] 2 points 7 months ago

Hell, let's put the threshold at "better than 99% of drivers", because every driver I know thinks they are better than average.

[–] [email protected] 4 points 7 months ago (1 children)

Exactly.

We should solve the accountability problem, but the metric should be lives and accidents. If the self-driving system proves it causes fewer accidents and kills fewer people, it should be preferred. Full stop.

Throwing someone in jail may be cathartic, but the goal is fewer issues on the road, not more people in jail.

[–] [email protected] 1 points 7 months ago

Because I'm sure that's what corporations are interested in.

[–] [email protected] 3 points 7 months ago

I don't agree with your argument.

Making a human go to prison for wiping out a family of 4 isn't going to bring back the family of 4. So you're just using deterrence to hopefully make drivers more cautious.

Yet, year after year, humans cause more deaths by negligence than tools can cause by failing.

The question is definitely "How much safer are they compared to human drivers?"

It's also much easier to prove that the system has those issues fixed, compared to training a human and hoping their critical faculties remain intact. Rigorous software testing and mechanical testing are within legislative reach and can be made strict requirements.

[–] [email protected] 2 points 7 months ago (22 children)

> Because if the answer is "nobody", they shouldn't be on the road

Do you understand how absurd this is? Let's say AI driving results in 50% fewer deaths. That's 20,000 people every year who aren't going to die.

And you reject that for what? Accountability? You said in another comment that you don't want "shit happens sometimes" on your headstone.

You do realize that's exactly what's going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. "Shit happens"

By not changing it, ironically, you're advocating for exactly what you claim you're against.

[–] [email protected] 1 points 7 months ago

The question for me is not what margins the feature is performing on, as they will likely be better than human error rates, but how irresponsibly they market the product.

[–] [email protected] 19 points 7 months ago* (last edited 7 months ago) (9 children)

I was looking up info for another comment and found this site. It's from 2021, but the information seems solid.

https://www.flyingpenguin.com/?p=35819

This table was probably the most interesting part:

| Car | 2021 Sales So Far | Total Deaths |
| --- | --- | --- |
| Tesla Model S | 5,155 | 40 |
| Porsche Taycan | 5,367 | ZERO |
| Tesla Model X | 6,206 | 14 |
| Volkswagen ID | 6,230 | ZERO |
| Audi e-tron | 6,884 | ZERO |
| Nissan Leaf | 7,729 | 2 |
| Ford Mustang Mach-e | 12,975 | ZERO |
| Chevrolet Bolt | 20,288 | 1 |
| Tesla Model 3 | 51,510 | 87 |

So many cars with zero deaths compared to Tesla.
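As a rough sketch, the table's figures can be normalized to deaths per 10,000 cars sold. Treat the output as a relative comparison only: the table's denominator (2021 sales so far) and numerator (total deaths) don't cover the same period, so the absolute rates are not meaningful.

```python
# Deaths per 10,000 cars sold, computed from the table above.
# Caveat: the sales and death figures cover different time spans,
# so this is only a rough relative comparison.

sales_deaths = {
    "Tesla Model S": (5_155, 40),
    "Porsche Taycan": (5_367, 0),
    "Tesla Model X": (6_206, 14),
    "Volkswagen ID": (6_230, 0),
    "Audi e-tron": (6_884, 0),
    "Nissan Leaf": (7_729, 2),
    "Ford Mustang Mach-e": (12_975, 0),
    "Chevrolet Bolt": (20_288, 1),
    "Tesla Model 3": (51_510, 87),
}

# Sort from highest to lowest death rate and print each car's rate.
for car, (sales, deaths) in sorted(
    sales_deaths.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True
):
    print(f"{car:<22} {deaths / sales * 10_000:6.1f} deaths per 10k sold")
```

Even with the mismatched periods, the spread is stark: the three Tesla models top the list while several competitors sit at zero.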

The question isn't whether Tesla's FSD is safer than humans; it's whether it's keeping up with the rest of the automotive industry on safety features. It seems like they are falling behind (despite what their marketing team claims).

[–] [email protected] 18 points 7 months ago* (last edited 7 months ago) (3 children)

~~I know this is going to sound bad, but bear with me and read my entire post. I think in this case it might be that people are trying to hate on Tesla because it's Elon (and fair enough) rather than on self-driving itself.~~ Although there's also the side of things that self-driving vehicles are already very likely safer than human-driven ones and have lower rates of accidents, people expect zero accidents whatsoever from self-driving, which is why I think self-driving may never actually take off and become mainstream. Then again, there's the lack of accountability: people prefer being able to place the blame and liability on something concrete, like an actual human. It's possible I'm wrong, but I don't think I am wrong about this.

edit: I looked further into this, and it seems I am partially wrong. It seems that Tesla is not keeping up with the automotive industry's average safety statistics; the self-driving in their vehicles seems less safe than their competitors'.

[–] [email protected] 12 points 7 months ago (1 children)

I would highlight that not all Teslas are driven in this mode on a regular basis, if ever.

[–] [email protected] 6 points 7 months ago (1 children)

For example, I don't really trust mine and mostly use it in slow, bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.

[–] [email protected] 1 points 7 months ago

If you adjust your AC frequently, map it to the left scroll wheel.

[–] [email protected] 4 points 7 months ago* (last edited 7 months ago)

Only Elon calls his level 2 automation "FSD" or even "Autopilot". That alone makes Tesla more culpable in these deaths than other makers who choose less evil marketing terms. The dummies who buy Elon's crap take those terms at face value, and the Nazi CEO knows that; he doesn't care, though, because just like Trump he thinks of his fans as little more than maggots. Can't say I blame him.