this post was submitted on 20 Sep 2023
73 points (94.0% liked)

Technology

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (3 children)

I'm going to get downvoted to hell here, but if you defend Google here, you should be defending Tesla when someone severely misuses Autopilot.

If you play games on AP and don't pay attention, causing a crash, that's not Tesla's fault. If you drive off a bridge because the GPS tells you to, that's not Google's fault.

You're responsible for driving your car at all times.

[–] [email protected] 14 points 1 year ago (1 children)

At least AP presents itself (somewhat) as an autonomous system though… even the best GPS obviously still requires you to look at the road.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

When you sit in the car, it really doesn't though. When you enable it, it clearly warns you about the dangers involved and to always pay attention. The radar versions even warned you about specific situations where it would fail and potentially cause a fatal accident. All cars that rely on radar have that issue and warn their users.

Anyone who's used it knows it clearly has problems, and honestly, it can be a little nerve-wracking getting used to at first as well, because it does have problems and you need to learn them. My partner doesn't like using it because of those problems.

The majority of people causing accidents on it have simply grown accustomed to it. They know when it will usually fail, and then make poor choices and end up in a rare circumstance. People are just people and make all sorts of bad choices. Some people follow the GPS off a bridge or into a lake.

That's not to say I don't think there's room for Tesla to improve on this, like using the in cabin camera to further help detect if someone is paying attention or not, but ultimately it falls on the driver to pay attention.

If you happen to be given a Tesla with AP already enabled on your profile, and you've only gone off what you've heard in the media, then sure, maybe. But those aren't the people causing problems. And really, if you rent a Tesla, I really do hope it's all disabled by default so you have to turn it on and go through the setup. Otherwise that would be a legit problem in my mind.

[–] [email protected] 10 points 1 year ago (2 children)

The biggest fault here would be whoever was in charge of that bridge. If it collapsed 9 years ago why was it not blocked off?

[–] [email protected] 3 points 1 year ago

My exact first thought. And why not a billion BIG Red SIGNS saying shit like: "Collapsed Bridge ahead", "Warning: immediate death ahead", "What the fuck are you doing?! Turn around", etc.

[–] [email protected] 1 points 1 year ago

That's the biggest question for me.

[–] [email protected] 6 points 1 year ago (1 children)

A GPS is a tool that aids a person

Tesla FSD is marketed as Self Driving

User error caused this man to go over a cliff. User error does not excuse Tesla accidents when the user is supposed to be hands off. One was an accident caused by a person; the other was an accident caused by a machine attempting to make human decisions. There's a huge difference.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I wasn't talking about FSD, I was talking about AP.

Although if you use FSD, to sign up, you need to acknowledge this (among other things)

"Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent."

If it leaves Beta in V12 and that warning is gone, there will probably be problems =( It's not ready to lose such an extreme warning. And it legit shouldn't leave beta until Tesla takes on liability and it's legit FSD.