Tesla loses Autopilot wrongful death case in $329 million verdict

  • Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

    "Ya gotta break some eggs," or some shit. /s

  • I don't know, most experimental technologies aren't allowed to be tested in public till they are good and well ready. This whole move fast break often thing seems like a REALLY bad idea for something like cars on public roads.

    Not to defend Tesla here, but how does the technology become "good and well ready" for road testing if you're not allowed to test it on the road? There are a million different driving environments in the US, so it'd be impossible to test all these scenarios without a real-world environment.

  • Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

    Listen, if we make it safe it could take an entire extra fiscal year! I have payments to make on my 3 vacation homes NOW!

  • Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

    All they really need to do is make self-driving cars safer than your average human driver.

  • I don't know, most experimental technologies aren't allowed to be tested in public till they are good and well ready. This whole move fast break often thing seems like a REALLY bad idea for something like cars on public roads.

    I'm pretty sure millions of people have been killed by cars over the last 100 years.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    "Today’s verdict is wrong"
    I think a certain corporation needs to be reminded to have some humility toward the courts.
    Corporations should not expect the mercy to get away with saying things a human never would.

  • Don't take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.

    I had to find two other articles to figure out whether the system being used here was Tesla's free, included Autopilot or the more advanced paid (one-time fee/subscription) version called Full Self-Driving (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it's really about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA (source). The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that's true, Tesla has positioned its cars as highly autonomous, and often doesn't call out that that level of autonomy only comes with the paid Full Self-Driving upgrade or subscription.

    So I DO blame Tesla, even if the driver contributed to the accident.

    Did the car try to stop and fail to do so in time because of the speeding, or did it not try at all despite the expected collision-detection behavior?

    Going off of OP's quote, the jury found the driver responsible but Tesla liable, which is pretty confusing. It might make some sense if the expected Autopilot functionality didn't work despite the driver's foot being on the pedal.

  • Not to defend Tesla here, but how does the technology become "good and well ready" for road testing if you're not allowed to test it on the road? There are a million different driving environments in the US, so it'd be impossible to test all these scenarios without a real-world environment.

    You are defending Tesla and being disingenuous about it.

    The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.

    You sound like a psychopath.

  • Don't take my post as a defense of Tesla even if there is blame on both sides here. However, I lay the huge majority of it on Tesla marketing.

    I had to find two other articles to figure out whether the system being used here was Tesla's free, included Autopilot or the more advanced paid (one-time fee/subscription) version called Full Self-Driving (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it's really about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA (source). The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims that the Collision Avoidance Systems would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that's true, Tesla has positioned its cars as highly autonomous, and often doesn't call out that that level of autonomy only comes with the paid Full Self-Driving upgrade or subscription.

    So I DO blame Tesla, even if the driver contributed to the accident.

    I feel like calling it Autopilot is already risking liability; Full Self-Driving is just audacious. There's a reason other companies with similar technology have gone with names like "driving assistance." This has probably had lawyers at Tesla sweating bullets for years.

  • "Some of you will die, but that's a risk I'm willing to take."

    Brannigan is way smarter than Mush.

  • Holding them accountable would be jail time. I'm fine with even putting the salesman in jail for this. Who's gonna sell your vehicles when they know there's a decent chance of them taking the blame for your shitty tech?

    Don't you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI...

  • I'm pretty sure millions of people have been killed by cars over the last 100 years.

    And traffic deaths keep falling in developed countries (excluding the USA, if the statistics I've read are correct).

    Tesla's Autopilot seems to be a step backwards, with only a future promise of being better than human drivers.

    But they slimmed their sensors down to fucking simple 2D cams.
    That's just cheaping out at the expense of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, who never chose to trust this tech, which lives more on PR than actual results.

  • I'm pretty sure millions of people have been killed by cars over the last 100 years.

    Cars, yes, driven by humans. But not by AI bullshit.

    And traffic deaths keep falling in developed countries (excluding the USA, if the statistics I've read are correct).

    Tesla's Autopilot seems to be a step backwards, with only a future promise of being better than human drivers.

    But they slimmed their sensors down to fucking simple 2D cams.
    That's just cheaping out at the expense of Tesla owners - but also of completely uninvolved people around a self-driving Tesla, who never chose to trust this tech, which lives more on PR than actual results.

    Can't comment specifically about Tesla's, but self-driving is going to have to go through the same decades of iterative improvement that car safety went through. That's just expected.

    However, it's not appropriate for this to be done at the risk of people's lives.

    But somehow it needs the time and money to run through a decade of improvement.

  • Holding them accountable would be jail time. I'm fine with even putting the salesman in jail for this. Who's gonna sell your vehicles when they know there's a decent chance of them taking the blame for your shitty tech?

    You'd have to prove that the salesman said exactly that, and without a record it's at best a he-said/she-said situation.

    I'd be happy to see Musk jailed though; he's definitely touted self-driving as fully functional.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    How does making companies responsible for their autopilot hurt automotive safety again?

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    Seems like jury verdicts don't set legal precedent in the US but are still often considered to have persuasive impact on future cases.

    This kinda makes sense, but the articles on this don't make it very clear how impactful it actually is - here's crossing fingers for Tesla's downfall. I'd imagine launching robotaxis would be even harder now.

    It's funny how this legal bottleneck was the first thing AI-driving industry research ran into. Then we kinda collectively forgot about it, and now it seems it actually was as important as we thought it would be. Say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

    Seems like jury verdicts don't set legal precedent in the US but are still often considered to have persuasive impact on future cases.

    This kinda makes sense, but the articles on this don't make it very clear how impactful it actually is - here's crossing fingers for Tesla's downfall. I'd imagine launching robotaxis would be even harder now.

    It's funny how this legal bottleneck was the first thing AI-driving industry research ran into. Then we kinda collectively forgot about it, and now it seems it actually was as important as we thought it would be. Say robotaxis scale up - there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

    What jury verdicts do is cost real money - companies often (not always) change in hopes of avoiding more.

    What jury verdicts do is cost real money - companies often (not always) change in hopes of avoiding more.

    Yeah, but also, how would this work at full self-driving scale? If there are 1,000 cases a year and 100 are settled for 0.3 billion each, that's already 30 billion a year, almost a quarter of Tesla's yearly revenue. Then, in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable unless we go "a human life costs X amount of money, next."

    I genuinely think we'll be stuck with humans for a long time outside of highly controlled city rides like Waymo, where the cars are limited to 40 km/h, which makes it very difficult to kill anyone either way.
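The back-of-envelope math in that comment can be sketched out. All the figures here are the comment's own hypotheticals (1,000 cases, a 10% settlement rate, $0.3B per settlement) plus a placeholder revenue number, not real litigation or financial data:

```python
# Back-of-envelope legal exposure at robotaxi scale.
# All figures are hypothetical, taken from the comment above.

def settlement_exposure(cases_per_year: int,
                        settle_rate: float,
                        avg_settlement_usd: float) -> float:
    """Total yearly settlement cost: settled cases times average payout."""
    settled_cases = cases_per_year * settle_rate
    return settled_cases * avg_settlement_usd

exposure = settlement_exposure(cases_per_year=1_000,
                               settle_rate=0.10,          # 100 of 1,000 settle
                               avg_settlement_usd=0.3e9)  # $0.3B each

yearly_revenue = 100e9  # placeholder, roughly Tesla-sized annual revenue

print(f"exposure: ${exposure / 1e9:.0f}B")  # exposure: $30B
print(f"share of revenue: {exposure / yearly_revenue:.0%}")
```

Under those assumptions the exposure alone eats a large fraction of revenue, which is the comment's point about legal unsustainability.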

    Yeah, but also, how would this work at full self-driving scale? If there are 1,000 cases a year and 100 are settled for 0.3 billion each, that's already 30 billion a year, almost a quarter of Tesla's yearly revenue. Then, in addition, consider the overhead of insurance fraud etc. It seems like it would be completely legally unsustainable unless we go "a human life costs X amount of money, next."

    I genuinely think we'll be stuck with humans for a long time outside of highly controlled city rides like Waymo, where the cars are limited to 40 km/h, which makes it very difficult to kill anyone either way.

    We already have the numbers from all the deaths caused by human drivers, so the comparison can be made once someone makes self-driving safer than humans. (And remember, drinking is a factor in many deaths caused by human drivers, so non-drinkers will demand that this be accounted for.)
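That accounting point can be made concrete: strip alcohol-involved deaths out of the human baseline and the bar a self-driving system must clear gets noticeably higher. The numbers below are rough illustrative ballparks, not official statistics:

```python
# Why "safer than the average human" is a moving target: sober drivers
# are already safer than the overall average, so they will demand that
# self-driving beat *their* rate, not the all-drivers rate.
# Figures are illustrative assumptions, not official statistics.

def sober_fatality_rate(overall_rate: float, alcohol_share: float) -> float:
    """Fatality rate with alcohol-involved deaths removed from the numerator,
    assuming (simplistically) the mileage denominator stays the same."""
    return overall_rate * (1.0 - alcohol_share)

overall = 1.3        # deaths per 100M vehicle-miles (rough US ballpark)
alcohol_share = 0.30 # assumed share of deaths involving an impaired driver

bar_for_av = sober_fatality_rate(overall, alcohol_share)
print(f"average-driver bar: {overall:.2f} per 100M miles")
print(f"sober-driver bar:   {bar_for_av:.2f} per 100M miles")
```

A sketch only: a real comparison would also need matched road types, weather, and vehicle age, but it shows why "safer than average" is a weaker claim than it sounds.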
