this post was submitted on 28 Apr 2024
289 points (99.3% liked)


Without paywall: https://archive.ph/NGkbf

[–] [email protected] 8 points 6 months ago (1 children)

This is the best summary I could come up with:


SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.

Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.

Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly improve misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”

The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.

In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.

Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal crash in Delray Beach, Fla., in 2019 when Jeremy Banner and his Tesla in Autopilot failed to register a semi truck crossing its path.


The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I'm a bot and I'm open source!

[–] [email protected] 20 points 6 months ago (2 children)

Even when the driver is fully responsible, the assistance software must work properly in all situations, and it must be fully tested.

If the software suddenly makes a severe mistake, a normal driver may not have a chance to regain control. Normal drivers are not trained test drivers.

[–] [email protected] -1 points 6 months ago (1 children)

The article keeps calling it “Autopilot”, which is different from “Full Self Driving”.

If they are correct, then it’s all on the driver. Autopilot is just a nicer adaptive cruise control, and should be treated as such. Many cars have one, even non-smart vehicles. Even my seven-year-old Subaru had something similar (much dumber, but similar).

That being said, people seem to confuse the names of these different functionalities all the time, including throughout this thread. However, even if they were confused and meant FSD, my car gives feedback requiring your hands on the wheel, so I don’t understand how you can claim ignorance.

[–] [email protected] 5 points 6 months ago

The article keeps calling it “Autopilot”, which is different from “Full Self Driving”. If they are correct, then [...]

No. That difference is meaningless, since both systems provide Level 2 autonomy. The responsibilities are exactly the same.

[–] [email protected] -5 points 6 months ago (2 children)

My morality says both are accountable: the driver and Tesla. Tesla for damage caused by their system, and the driver if he does not retake control of the vehicle when given the chance.

[–] [email protected] 6 points 6 months ago* (last edited 6 months ago) (1 children)

But does the driver have a reasonable chance, with an adequate timeframe, to regain control?

It's like what happened with the Boeing 737 Max MCAS incidents: Boeing expected the pilot to disengage the trim motor within a mere 4 seconds, which according to one pilot is "a lot to ask in an overwhelming situation," or something similar.

Normal people in a soon-to-crash situation are likely to freeze for a second or two as the fear kicks in. How the driver reacts next is hard to predict. Yet at the speeds most US drivers love to go (70+ mph seems to be the norm on freeways), the time available for them to make a well-thought-out decision is, I'd guess, quite short.
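To put that in perspective, here's a rough back-of-the-envelope sketch; the 70 mph speed and the 1–4 second delays are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope only: how far a car travels during a driver's
# reaction delay. The 70 mph speed and the delay values are assumptions
# for illustration, not data from the article or NHTSA.

MPH_TO_MPS = 0.44704  # metres per second in one mile per hour

def distance_during_delay(speed_mph: float, delay_s: float) -> float:
    """Distance travelled in metres at constant speed during a reaction delay."""
    return speed_mph * MPH_TO_MPS * delay_s

for delay in (1.0, 2.0, 4.0):
    metres = distance_during_delay(70.0, delay)
    print(f"70 mph, {delay:.0f} s delay: ~{metres:.0f} m of road covered")

# 70 mph, 1 s delay: ~31 m of road covered
# 70 mph, 2 s delay: ~63 m of road covered
# 70 mph, 4 s delay: ~125 m of road covered
```

That's roughly 30 metres of road for every second of hesitation at freeway speed, before the driver is effectively back in control.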

[–] [email protected] 1 points 6 months ago (2 children)

You made me think about this for a second.

In my head, the reason is not specifically to punish the driver, but to make drivers always be aware and ready to take control again. Yes, 100 people will have 1000 different ways of reacting to such a software error, but you need people to pay attention, and in law the only way to achieve that is punishment. Obviously this needs to be well calibrated, but either you have multiple lines of defense (the software, the driver, maybe even additional safety features) or you have to remove the autonomous system.

[–] [email protected] 5 points 6 months ago

People are naturally going to pay less attention the more cars drive for them. You can't partially automate steering. Driver-assisted steering is as close as it can get before the liability needs to fall on Tesla and other software manufacturers. A car isn't a plane. The driver needs to be in control when split-second decisions happen, like a child running after a ball.

If I'm paying for an autopilot, I'm not the pilot, i.e., the driver. The car is. And Tesla's marketing bullshit and lawyers are going to fail here. This does not fall under puffery. It's false advertising that causes consumers to place undue trust in a product. And the insurance industry is quite concerned about just where the liability falls in all of this as well. And as they're the ones currently having to pay out claims when Tesla wins, they have a vested interest in seeing that Tesla doesn't.

[–] [email protected] 2 points 6 months ago

It doesn't matter. For practical purposes, you can't make people pay attention as if they were driving without the actual engagement of driving. There is going to be a delay in taking over, and in a lot of cases it won't matter by the time the human is effectively in control.

[–] [email protected] 4 points 6 months ago* (last edited 6 months ago) (1 children)

Imagine you are going along a straight road, there's not too much traffic, the speed limit is high and you are enjoying it. Suddenly your assistance software decides to turn your steering wheel hard to the left.

You will have no chance.

What have you done wrong? What is it that you are accountable for?

[–] [email protected] 0 points 6 months ago (1 children)

For mine:

  • there’s feedback to ensure you’re alert, requiring you to touch the wheel every once in a while
  • when it made me nervous, it was because it was drifting to the right or slowing, not suddenly moving anywhere.

So did the car think there was an impending collision? That should be obvious in the logs, and it’s the only reason for sudden maneuvers.

[–] [email protected] -1 points 6 months ago

Cars do not think LOL