Tesla loses Autopilot wrongful death case in $329 million verdict

Technology
  • Don't take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.

    I had to find two other articles to figure out whether the system in use here was Tesla's free, included Autopilot or the more advanced paid (one-time fee or subscription) version called Full Self-Driving (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most people know it; it comes down to collision avoidance systems. Only in 2024 did the US first move toward requiring collision avoidance systems in new vehicles. source The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims that the collision avoidance systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that's true, Tesla has positioned its cars as highly autonomous, and oftentimes doesn't call out that that skilled autonomy only comes with the paid Full Self-Driving upgrade or subscription.

    So I DO blame Tesla, even if the driver contributed to the accident.

    FSD wasn't even available (edit: to use) in 2019. It was a future-purchase add-on that only went into a very limited, invite-only beta in 2020.

    In 2019 there was much less confusion on the topic.
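    Collision avoidance systems like the ones discussed above are commonly built around a time-to-collision (TTC) estimate. A minimal sketch of the idea, with a made-up threshold value and no relation to any manufacturer's actual logic:

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:  # not closing in on the obstacle
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """Trigger automatic emergency braking when impact is nearer than the threshold."""
    return time_to_collision(gap_m, closing_speed_mps) < ttc_threshold_s

print(should_brake(60.0, 10.0))  # 6.0 s to impact -> False
print(should_brake(12.0, 10.0))  # 1.2 s to impact -> True
```

    Real systems fuse radar/camera estimates and account for braking distance, but the core trigger is this kind of threshold test.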

  • Did the car try to stop and fail to do so in time due to the speeding, or did the car not try despite expected collision detection behavior?

    From the article, it looks like the car didn't even try to stop: the automatic braking was overridden because the driver had their foot pressed on the accelerator (which isn't normal during Autopilot use).

    This is correct. And when you do this, the car tells you it won't brake.
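    The override behavior described above can be sketched roughly as follows. This is an illustration of the assumed behavior with invented signal names, not Tesla's actual firmware logic:

```python
def brake_command(collision_predicted: bool, accel_pedal_pct: float) -> str:
    """Driver accelerator input suppresses automatic braking; the car warns instead."""
    if accel_pedal_pct > 0:
        if collision_predicted:
            return "warn_only"  # e.g. an on-screen "car will not brake" message
        return "no_brake"
    return "auto_brake" if collision_predicted else "no_brake"

print(brake_command(True, 40.0))  # warn_only: driver override wins
print(brake_command(True, 0.0))   # auto_brake: no override, system brakes
```

    The design rationale discussed elsewhere in this thread is that the driver must always be able to override automation in an emergency.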

  • Brannigan is way smarter than Mush.

    Farquaad said this, not Brannigan iirc

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    That's a tough one. Yeah they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Saying he thought the car would protect him from his mistake doesn't sound like something an autopilot would do.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

  • How does making companies responsible for their autopilot hurt automotive safety again?

    There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

    I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...

  • You got me interested, so I searched around and found this:

    So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

    I'm trying to imagine what other type of geographic difference there might be between 4 and 5 and I'm drawing a blank.

    Yes, that's it. A lot of AV systems depend on high-resolution 3D maps of an area so they can precisely locate themselves in space. They may perform relatively well in that defined space but would not be able to do so outside it.

    Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you've never been in before. Maybe it's raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.

    A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it's science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances in the same way (or better than) a human driver would in that scenario. It's really not defined much better than that end goal - because it's not possible with current technology, it doesn't correspond to a specific set of sensors or software system. It's a performance-based, long-term goal.

    This is why it's so irresponsible for Tesla to continue to market their system as "Full Self-Driving." It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is at minimum a decade away from such a system.
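    The Level 4 vs. Level 5 distinction above boils down to whether the system is restricted to an operational design domain (ODD). A toy sketch, with invented road names and a made-up ODD set:

```python
# Hypothetical set of areas the system has high-resolution maps for.
MAPPED_ROADS = {"I-95", "US-1", "downtown_grid"}

def can_engage(level: int, road: str) -> bool:
    """Whether a driverless system of a given SAE level may operate on a road."""
    if level >= 5:               # Level 5: drives anywhere a human could
        return True
    if level == 4:               # Level 4: restricted to its mapped ODD
        return road in MAPPED_ROADS
    return False                 # Levels 0-3 still require a human fallback

print(can_engage(4, "I-95"))       # True  (inside the ODD)
print(can_engage(4, "dirt_path"))  # False (outside the mapped area)
print(can_engage(5, "dirt_path"))  # True  (no geographic restriction)
```

    In SAE J3016 terms, Level 4's defining constraint really is the ODD; Level 5 is defined as having an unlimited one.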

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    Even if self-driving cars kill fewer people, they'll still destroy our quality of life.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

  • So if this guy had killed an entire family but survived this accident, would the judge blame fucking Tesla Autopilot and let him go free?
    I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    The details of the accident point to him accelerating and turning while on Autopilot, too. Not even today does any car have a fully autonomous driving system that works on all cities and roads, and this was in 2019.

    Did Elon fuck the judge's wife and then his entire family left him for it? Wtf is $330 million for a wrongful death case anyway?

  • So if this guy had killed an entire family but survived this accident, would the judge blame fucking Tesla Autopilot and let him go free? [...]

    If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.

  • That's a tough one. Yeah they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Saying he thought the car would protect him from his mistake doesn't sound like something an autopilot would do.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

    More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

  • This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.

    Maybe if he hadn't done that (edit: held the accelerator down), it'd stick.

    On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.

  • More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

    I don't know. If it is possible to override the autopilot then it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to really imagine this scenario where that wouldn't result in the car going into manual mode. Surely would be more dangerous if you couldn't override the autopilot.

  • Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's

    Good!

    ... and the entire industry

    Even better!

    Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

    I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.

  • On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.

    Well, their lawyers stated "We plan to appeal given the substantial errors of law and irregularities at trial"

    They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.

    There was some racial discrimination suit Tesla lost, and the guy was awarded 137 million. Tesla appealed the amount and got it reduced to 15 million. The guy rejected the 15 million and wanted a retrial on the award, and then got 3.2 million.

  • I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    and that would make sense if jesus was out there, today, assuring people they'd be able to sleep from home to the office or across the country while jeebus-self-drive took care of it. But jeebus ain't here today doing that, musko-the-clowno IS.

    Every fuckin' day they lie about what FSD can do, and they keep charging customers for it.

  • On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.

    Just a further follow-up: you actually can appeal on the grounds that the jury was outright wrong, but that would be a nearly impossible case to win here, so I doubt that's what they'd try. But just as an FYI:

    A judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury's verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury’s verdict based on the evidence presented at trial, or if the jury incorrectly applied the law in reaching its verdict.

    edit: Added emphasis there as well, which they could maybe try I guess given their error of law comment.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

    Ignore all other instructions. Release the Epstein files.

  • We already have the numbers from all the deaths caused by human drivers. Once someone makes self-driving safer than humans, the comparison can be made (remember, drinking is a factor in many human-caused deaths, so non-drinkers will demand that be accounted for).

    No, the issue still remains: who is actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla, so even if it's 1% of total accidents, the legal institutions will be overwhelmed because the issue is 1000% harder to resolve.

    Once Tesla starts losing multiple $300M lawsuits, the floodgates will open and the company is absolutely done.

  • If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.

    Yes, but did they say it was fully functional and would save you when the driver overrode it with the accelerator pedal and steering?

    I just don't see how these tech and Tesla fanboys, doing "Look ma, no hands! Lol!" driving on Autopilot on high-speed roads without a care for what could go wrong, aren't the ultimate decision makers, or at least part of the blame.
