this post was submitted on 03 Sep 2023
186 points (90.1% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.
[–] [email protected] 117 points 1 year ago (25 children)

Who the hell thinks beta software is appropriate for real-world applications in something as dangerous as vehicle control at highway speeds?

I've come to believe that all Teslas should be recalled until they get their act together. They're getting people hurt and killed by field testing their experiments on roadways that we paid for.

[–] [email protected] 10 points 1 year ago (3 children)

While I agree, let's not pretend this is limited to Tesla. My feed lately has had numerous stories of FSD taxis behaving erratically as well.

I also have to say that one of my concerns with FSD is the deterioration of people's driving skills and of their awareness of their car's condition (especially as that changes over time). Leaving aside all the wisecracks about people's normal abilities, or about them not paying attention anyway, take a snowstorm. FSD can't drive in it, so you're left with human drivers going manual in their cars. But they haven't actually driven themselves in a while, so they've forgotten lessons they once learned, like how to apply the brakes differently on ice and snow. They don't know where the corners of their car are, they're driving entirely too fast, and - because their FSD car was compensating for mechanical issues - they're not aware that their tires are near-bald and the brakes are iffy.

Thing is, I know this is something that's going to happen. I just don't know how we can mitigate the risks.

[–] [email protected] 8 points 1 year ago

IMHO, Waymo and Cruise AVs are different animals. They have LiDAR. Musk is still hell-bent on developing a camera-only system, which is inferior. But it's cheaper and less bulky, so Musk is all about it.

[–] [email protected] 5 points 1 year ago

Oh, I completely agree on all points. None of them are ready for full autopilot.

[–] [email protected] 2 points 1 year ago

https://arstechnica.com/cars/2023/09/are-self-driving-cars-already-safer-than-human-drivers/

It seems more likely than not that both Waymo and Cruise have already surpassed average human driving safety.

I'm really curious how the next FSD version (which apparently relies entirely on neural nets for driving) plays out.
Not that I think it'll be particularly good, just particularly interesting.
