This post was submitted on 28 Sep 2023
117 points (96.1% liked)

Technology


SAN FRANCISCO, Sept 28 (Reuters) - Opening statements are set to begin on Thursday in the first U.S. trial over allegations that Tesla's (TSLA.O) Autopilot driver-assistance feature led to a death, and its results could help shape similar cases across the country.

The trial, in a California state court, stems from a civil lawsuit alleging the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour (105 kph), strike a palm tree and burst into flames, all in the span of seconds.

The 2019 crash killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled, according to court documents. The lawsuit, filed against Tesla by the passengers and Lee's estate, accuses Tesla of knowing that Autopilot and other safety systems were defective when it sold the car.

Tesla has denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was not clear whether Autopilot was engaged at the time of the crash.

Tesla has been testing and rolling out its Autopilot and more advanced Full Self-Driving (FSD) system, which Chief Executive Elon Musk has touted as crucial to his company's future but which has drawn regulatory and legal scrutiny.

Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the "Autopilot" name. A Model S swerved into a curb in 2019 and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and that driver distraction was to blame.

The stakes are higher in the trial this week, and in other cases, because people died. Tesla and plaintiff attorneys jousted in the run-up over what evidence and arguments each side could make.

Tesla, for instance, won a bid to exclude some of Musk’s public statements about Autopilot. However, attorneys for the crash victims can argue that Lee’s blood alcohol content was below the legal limit, according to court filings.

The trial, in Riverside County Superior Court, is expected to last a few weeks.

[–] [email protected] -3 points 1 year ago (1 children)

If anybody crashes while using Autopilot, it's their own damn fault, and that's what the court will find.

Autopilot is nothing more than an advanced driver assistance system. Every other manufacturer has a similar system, but Tesla is the only one that gets singled out and plastered all over the news.

[–] [email protected] 0 points 1 year ago (1 children)

The way it is marketed is not in line with its functionality. I expect the prosecution will claim the term "Full Self Driving" is confusing to consumers.

[–] [email protected] 0 points 1 year ago (1 children)

Except "full self driving" is not in question. It's autopilot.

[–] [email protected] 3 points 1 year ago (1 children)

Sigh..

Jonathan Michaels, an attorney for the plaintiffs, in his opening statement at the trial in Riverside, California, said that when the 37-year-old Lee bought Tesla's “full self-driving capability package” for $6,000 for his Model 3 in 2019, the system was in "beta," meaning it was not yet ready for release.

RTFA

[–] [email protected] 0 points 1 year ago (1 children)

🤦‍♂️ you're the one who needs to read the article. There's nothing there to indicate it was in use at the time.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

That's irrelevant. The plaintiff bought the FSD package, and his attorney (not prosecutor, I missed that this was a civil suit, not a criminal trial) will likely argue that it introduced confusion on the part of his client. It doesn't matter that the FSD package wasn't actually in use if the plaintiff believed it was (or, more importantly, that he believed it could do things that it could not, due to the confusing terminology).

[–] [email protected] 0 points 1 year ago

Of course it matters. The plaintiff not knowing how to use their car is not a valid defense.