Tesla drivers run Autopilot where it’s not intended — with deadly consequences
(www.washingtonpost.com)
This is about as predictable a failure as passwordless logins. If you can't secure your software product against use outside its intended use case, then stop, go back, you fucked up.
To me, how is this different from someone using cruise control in a 1999 car and reading a newspaper while they blow through stop signs and smash into a wall? Driver error, reset, try again.
An important difference is that cruise control is simpler to understand. It's a basic mechanism dressed up as a driver aid. A smaller slice of the population will misuse cruise control.
FSD is a driver aid dressed up as... well, "Full Self-Driving." It's not Full, and it's not Self-Driving. It's mostly functional in limited circumstances and even then requires driver attention.
I think another good example is how people would never allow a Stasi agent to live in their house, unless the Stasi agent was redefined as a slew of websites, a collection of disparate laws, and multiple steps involving technology.
This is talking about Autopilot, not FSD.