this post was submitted on 27 Apr 2024
884 points (95.8% liked)

Technology

59174 readers
4341 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

Approved Bots


founded 1 year ago
[–] [email protected] 165 points 6 months ago* (last edited 6 months ago) (15 children)

I’ve often wondered why the FTC allows it to be marketed as “Full Self-Driving”. That’s blatant false advertising.

[–] [email protected] 78 points 6 months ago (1 children)

As is “autopilot”. There’s no automatic pilot. You’re still expected to keep your hands on the wheel and your eyes on the road.

[–] [email protected] 23 points 6 months ago (13 children)

I am so sick and tired of this belief, because it's clear people have no idea what autopilot on a plane actually does. They always seem to assume it flies the plane and the pilot does nothing. Autopilot alone does not fly the damned plane by itself.

"Autopilot" in a plane keeps the wings level at a set heading, altitude, and speed. It's essentially cruise control with lane-centering, plus altitude hold, which has no equivalent on a road.

There are more advanced systems on the market, available for smaller planes and in use on larger jets, that can do things like auto takeoff, auto land, and waypoint following without pilot input, but basic plain old autopilot doesn't do any of that.

That expanded capability is similar to how things like "Enhanced Autopilot" on a Tesla can do extra things like change lanes, follow highway exits on a navigated route, etc. Or how "Full Self-Driving" is supposed to follow road signs and lights, etc. but those are additional functions, not part of "Autopilot" and differentiated with their own name.

Autopilot, either on a plane or a Tesla, alone doesn't do any of that extra shit. It is a very basic system.

The average person misunderstanding what a word means doesn't make it an incorrect name or description.

[–] [email protected] 32 points 6 months ago* (last edited 6 months ago)

I say let Tesla market it as Autopilot if it passes regulatory safety frameworks similar to those that aviation autopilot functions must pass.

[–] [email protected] 24 points 6 months ago

Flight instructor here.

I've seen autopilot systems that have basically every level of complexity you can imagine. A lot of Cessna 172s were equipped with a single axis autopilot that can only control the ailerons and can only maintain wings level. Others have control of the elevators and can do things like altitude hold, or ascend/descend at a given rate. More modern ones have control of all three axes and integration with the attitude instruments, and can do things like climb to an altitude and level off, turn to a heading and stop, or even something like fly a holding pattern over a fix. They still often don't have any control over the power plant, and small aircraft typically cannot land themselves, but there are autopilots installed in piston singles that can fly an approach to minimums.

And that's what's available on piston singles; airline pilots seldom fly the aircraft by hand anymore.

[–] [email protected] 15 points 6 months ago (3 children)

“But one reason that pilots will opt to turn the system on much sooner after taking off is if it’s stormy out or there is bad weather. During storms and heavy fog, pilots will often turn autopilot on as soon as possible.

This is because the autopilot system can take over much of the flying while allowing the pilot to concentrate on other things, such as avoiding the storms as much as possible. Autopilot can also be extremely helpful when there is heavy fog and it’s difficult to see, since the system does not require eyesight like humans do.”

Does that sound like something Tesla’s autopilot can do?

https://www.skytough.com/post/when-do-pilots-turn-on-autopilot

[–] [email protected] 14 points 6 months ago (6 children)

I'd wager most people, when talking about a plane's autopilot, mean the waypoint-following or autoland capability.

Also, it's hard to argue that "full self driving" means anything but that the car is able to drive fully autonomously. If they were to market it as "advanced driver assist" I'd have no issue with it.

[–] [email protected] 29 points 6 months ago (19 children)

It’s not even the closest thing to self-driving on the market: Mercedes has started selling a car that doesn’t require you to look at the road.

[–] [email protected] 76 points 6 months ago

Move fast, break shit. Fake it till you sell it, then move the goalposts. Shift human casualties onto individual responsibility, a core libertarian theme. Profit off the lies because it's too late, the money's already in the bank.

[–] [email protected] 51 points 6 months ago (1 children)

They just recalled all the Cybertrucks, because their 'smort' technology is too stupid to realize when an accelerator sensor is stuck...

[–] [email protected] 24 points 6 months ago* (last edited 6 months ago) (4 children)

The accelerator sensor doesn’t get stuck, the pedal does: the face of the accelerator falls off and wedges the pedal in the down position.

[–] [email protected] 24 points 6 months ago (1 children)

Pedal, not petal.

Not trying to be an asshole, just a nudge to avoid misunderstandings (although the context is clear in this case)

[–] [email protected] 12 points 6 months ago (1 children)

Given the number of other issues in the post I'm going to guess it was hurried and autocorrected wrong. Happens to me all the time.

[–] [email protected] 39 points 6 months ago (4 children)

According to the math in this video:

  • 150,000,000 miles have been driven with Tesla's "FSD", which works out to
  • 375 miles per Tesla purchased with FSD capability
  • 736 known FSD crashes with 17 fatalities
  • which equals 11.3 deaths per 100M miles of Tesla's FSD

Doesn't sound too bad, until you hear that human drivers produce 1.35 deaths per 100M miles driven...

It's rough math, but holy moly, that is already a whole other class of deadly than a non-FSD car.
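The arithmetic above can be checked in a few lines. A sketch using only the commenter's figures (which I haven't verified against any dataset):

```python
# All inputs are the rough numbers from the comment above, not verified data.
fsd_miles = 150_000_000       # total miles claimed driven on Tesla "FSD"
fsd_fatalities = 17           # fatalities across the 736 known FSD crashes
human_rate = 1.35             # human-driver deaths per 100M vehicle miles

# Deaths per 100M miles under FSD
fsd_rate = fsd_fatalities / (fsd_miles / 100_000_000)
print(round(fsd_rate, 1))               # 11.3
print(round(fsd_rate / human_rate, 1))  # roughly 8.4x the human rate
```

The ratio, not the absolute number, is what makes the comparison striking, and it is only as good as the 17-fatality and 150M-mile inputs.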

[–] [email protected] 38 points 6 months ago

If Red Bull can be successfully sued for false advertising over its slogan "It gives you wings", it stands to reason that Tesla should be too.

[–] [email protected] 36 points 6 months ago

Any day now it will be released. Just like the robotaxis, seven years ago.

[–] [email protected] 29 points 6 months ago* (last edited 6 months ago) (1 children)

“If you’ve got, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, let’s say, half the accident rate of a human-driven car, I think that’s difficult to ignore,” Musk said.

That's a very problematic claim - and it might only be true if you compare completely unassisted vehicles to L2 Teslas.

Other brands also have a plethora of L2 features, but they are marketed and designed differently. The L2 features are active, but designed to keep the driver engaged in driving.

So L2 features are for better safety, not for a "wow we live in the future" show effect.

Take the lane keeping in my car: you don't notice it while driving; it sits just below your level of attention. But when my concentration lapses for a moment, the car simply stays in the lane, even on curving roads. It's designed to steer slightly later than I would. (Even before it intervenes, the wheel turns slightly more easily toward the center of the lane than away from it; the effect stays below the threshold of notice unless you concentrate on it.)

Adaptive speed control is just sold as adaptive speed control, though I did notice once that it uses radar AND the cameras: it considers my lane free as soon as the car in front of me clears the lane markings with its wheels when changing lanes.

It feels like the software in my car could do a lot more, but its features are undersold.

The combination of a human driver and the driver-assist systems makes driving a lot safer than relying on the human or the machine alone.

In fact, the braking assistant once stopped my car in tight traffic before I could even react, when the guy in front of me suddenly slammed on their brakes. If the system had failed to detect the situation, it would have been my job to react in time. (I did react, but I can't say whether my reaction alone would have been fast enough.)

What Tesla does with the technology is impressive, but I feel the system could be so much better if they didn't compromise safety in the name of marketing and hyperbole.

If Tesla's Autopilot had been designed from the ground up to keep the driver engaged, I believe it would really be the safest car on the road.

I feel they are rather designed to be able to show off "cool stuff".

[–] [email protected] 21 points 6 months ago (4 children)

Tesla's autopilot isn't the best around. It's just the most deployed and the most advertised. Those developing autonomy responsibly don't beta-test it with the kind of idiots who think Tesla's autopilot is the best approach.

[–] [email protected] 27 points 6 months ago (5 children)

Verge articles seem to be getting worse over the years; they've almost reached Forbes level. Yes, this does raise some valid safety concerns. No, Tesla isn't bad just because it's Tesla.

It doesn't really give us the full picture. For starters, there's no comparison with Level 2 systems from other car makers, which also require driver engagement and have their own methods to ensure attention. This would help us understand how Tesla's tech actually measures up.

Plus, the piece skips over extremely important stats that would give us a clearer idea of how safe (or not) Tesla's systems are compared to good old human driving.

We're left in the dark about how Tesla compares in scenarios like drunk, distracted, or tired driving—common issues that automation aims to mitigate. (probably on purpose).

It feels like the article is more about stirring up feelings against Tesla rather than diving deep into the data. A more genuine take would have included these comparisons and variables, giving us a broader view of what these technologies mean for road safety.

I feel like any opportunity to jump on the Elon hate wagon is getting tiresome (and yes, I hate Elon too).

[–] [email protected] 24 points 6 months ago (7 children)

I love to hate on musky boi as much as the next guy, but how does this actually compare to vehicular accidents and deaths overall? CGP Grey had the right idea when he said they didn't need to be perfect, just as good as or better than humans.

[–] [email protected] 19 points 6 months ago* (last edited 6 months ago) (1 children)

Grey had the right idea when he said they didn't need to be perfect, just as good as or better than humans.

The better question - is Tesla's FSD causing drivers to have more accidents than other driving assist technologies? It seems like a yes from this article and other data I've linked elsewhere in this thread.

[–] [email protected] 13 points 6 months ago (4 children)

CGP Grey also seems to believe self-driving cars without traffic lights are the solution to traffic, as opposed to something like trains.

[–] [email protected] 23 points 6 months ago (5 children)

Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver: 1,000 crashes out of the 5M+ Teslas sold in the last 5 years is actually a very small number.

I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

Because while it's clear by now Teslas aren't the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.

We lose over 40k people a year to car accidents, and fatal car accidents are rare, so multiply that by something like 100 to get the total number of car accidents.
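The back-of-envelope comparison above looks like this in code. A sketch using only the commenter's rough figures, none of which are verified data:

```python
# Commenter's rough figures, not verified data.
tesla_crashes = 1000        # driver-assist crashes found by the investigation
teslas_sold = 5_000_000     # Teslas sold over the last 5 years

crash_share = tesla_crashes / teslas_sold
print(f"{crash_share:.4%}")  # 0.0200% of vehicles involved in a known crash

us_fatalities_per_year = 40_000
# The comment's ~100x multiplier from fatal accidents to all accidents
est_total_accidents = us_fatalities_per_year * 100
print(est_total_accidents)   # 4,000,000 accidents/year, very roughly
```

Note the mismatch the comment itself hints at: crashes per vehicle sold and accidents per year are different denominators, so this only gauges orders of magnitude, not a real rate comparison.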

[–] [email protected] 30 points 6 months ago (34 children)

The question isn't "are they safer than the average human driver?"

The question is "who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?"

Because if the answer is "nobody", they shouldn't be on the road. There's zero accountability, and because it's all wibbly-wobbly AI bullshit, there's no way to prove that the issues are actually fixed.

[–] [email protected] 17 points 6 months ago (3 children)

So it's better to put more lives in danger so that there can be someone to blame?

[–] [email protected] 12 points 6 months ago (3 children)

The answer is the person behind the wheel.

Tesla makes it very clear to the driver that you still have to pay attention and be ready to take over at any time. Full Self-Driving engages the in-cabin nanny cam to enforce that you pay attention, on top of the frequent reminders to apply turning force to the steering wheel.

Now, once Tesla goes the way of Mercedes and says you don't have to pay attention, it's the company that should step in. I know that's a big old SHOULD, but right now that's not the situation anyway.

[–] [email protected] 19 points 6 months ago* (last edited 6 months ago) (10 children)

I was looking up info for another comment and found this site. It's from 2021, but the information seems solid.

https://www.flyingpenguin.com/?p=35819

This table was probably the most interesting part; unfortunately the formatting doesn't work on mobile, but I think you can make sense of it.

Car                   2021 Sales So Far   Total Deaths
Tesla Model S         5,155               40
Porsche Taycan        5,367               ZERO
Tesla Model X         6,206               14
Volkswagen ID         6,230               ZERO
Audi e-tron           6,884               ZERO
Nissan Leaf           7,729               2
Ford Mustang Mach-e   12,975              ZERO
Chevrolet Bolt        20,288              1
Tesla Model 3         51,510              87

So many cars with zero deaths compared to Tesla.

The question isn't whether Tesla's FSD is safer than humans; it's whether Tesla is keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
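The table can be normalized to deaths per 10,000 vehicles sold. A sketch, with the big caveat that the linked page is dividing cumulative deaths by 2021-only sales, so the ratios are rough at best:

```python
# (2021 sales, total deaths) from the table above, not verified independently.
data = {
    "Tesla Model S": (5_155, 40),
    "Porsche Taycan": (5_367, 0),
    "Tesla Model X": (6_206, 14),
    "Nissan Leaf": (7_729, 2),
    "Chevrolet Bolt": (20_288, 1),
    "Tesla Model 3": (51_510, 87),
}

# Sort worst-first by deaths per vehicle sold and print a normalized rate
for car, (sales, deaths) in sorted(data.items(),
                                   key=lambda kv: -kv[1][1] / kv[1][0]):
    print(f"{car:16s} {10_000 * deaths / sales:6.1f} deaths per 10k sold")
```

Even with the apples-to-oranges denominator, the three Tesla models land at the top of the list by a wide margin, which is the point the comment is making.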

[–] [email protected] 18 points 6 months ago* (last edited 6 months ago) (3 children)

~~I know this is going to sound bad, but bear with me and read my entire post. I think in this case people may be trying to hate on Tesla because it's Elon (and fair enough) rather than because of self-driving itself.~~ There's also the fact that self-driving vehicles are already very likely safer than human-driven ones, with lower accident rates, but people expect zero accidents whatsoever from self-driving, which is why I think it may never actually take off and become mainstream. Then again, there's the lack of accountability: people prefer being able to place blame and liability on something concrete, like an actual human. It's possible I'm wrong, but I don't think I am.

edit: I looked further into this, and it seems I am partially wrong. It seems that Tesla is not keeping up with the average statistics in the automotive industry in terms of safety statistics, the self-driving in their vehicles seem less safe than their competitors.

[–] [email protected] 12 points 6 months ago (2 children)

I would highlight that not all Teslas will be driven in this mode on a regular basis, if ever.

[–] [email protected] 19 points 6 months ago* (last edited 6 months ago) (2 children)

Obviously the time to react to the problem was before the system told you about it; that's the whole point: THE SYSTEM IS NOT READY. Cars are not ready to drive themselves, and the legal system is obviously too slow and backwards to deal with it, so it's not ready either. But fuck it, let's do it anyway, sure, and while we're at it we can do away with the concept of the driver's license in the first place, because nothing matters anymore and who gives a shit.

[–] [email protected] 15 points 6 months ago (4 children)

Fuck cars, those ones specifically

[–] [email protected] 14 points 6 months ago

What?! I thought Elon had it all figured out. No way!

https://twitter.com/elonmusk/status/1744821656990675184

\s
