this post was submitted on 02 Aug 2024
301 points (98.7% liked)

[–] [email protected] 11 points 1 month ago* (last edited 1 month ago) (2 children)

I don't know about your city, but I trust technology a lot more than the average driver. At least technology can tell a red light from a green one. I nearly got hit by a Ford mega truck in broad daylight whose driver took the small green bicycle signal as his cue to ignore the massive red "no left turn" signal and cut across a protected bike lane. :P

[–] [email protected] -2 points 1 month ago

I agree. There's less margin for error, but it leaves people who depend on automation vulnerable. I just imagine lots of growing pains before we get to an ideal state.

[–] [email protected] -3 points 1 month ago* (last edited 1 month ago) (1 children)

I don't know about your city, but I trust technology a lot more than the average driver.

I don't. Technology can be subject to glitches, bugs, hacking, deciding to plow right through pedestrians (hello Tesla!), etc.

While the case can be made that human drivers are worse at reaction time and paying attention, at least a "dumb" car can't be hacked, won't drive itself off the road because of a bug, won't try to mow people down on its own without stopping, etc.

A human who catches these things happening can correct them (even if they caused them in the first place). But if a computer develops a fatal fault like that, or is hijacked, it can't correct itself.

EDIT: It seems like this community is full of AI techbro yes-men. Any criticism or critical analysis of their ideas gets met with downvotes, but I've yet to get a reply explaining how anything I said is wrong.

[–] [email protected] 7 points 1 month ago (1 children)

Plenty of dumb cars get recalled all the time for shitty parts or design. Remember the Prius whose brakes would just decide to stop working?

[–] [email protected] 0 points 1 month ago* (last edited 1 month ago) (1 children)

Self-driving cars are no less prone to mechanical failures.

[–] [email protected] 4 points 1 month ago (1 children)

Yeah, but you said that already

[–] [email protected] 0 points 1 month ago* (last edited 1 month ago) (1 children)

No, I was talking about software issues.

And if you know that non-self-driving and self-driving cars are equally prone to mechanical issues, why bring it up as a counterpoint?

[–] [email protected] 1 points 1 month ago

It wasn't a counterpoint, you silly goose; I was agreeing with you.