
Tesla loses Autopilot wrongful death case in $329 million verdict

Technology
  • Brannigan is way smarter than Mush.

    Farquaad said this, not Brannigan iirc

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    That's a tough one. Yeah, they sell it as Autopilot. But anyone who sees a steering wheel and pedals should reasonably assume they are there to override the autopilot. Saying he thought the car would protect him from his own mistake doesn't describe anything an autopilot is supposed to do.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

  • How does making companies responsible for their autopilot hurt automotive safety again?

    There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

    I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...

  • You got me interested, so I searched around and found this:

    So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?

    I'm trying to imagine what other type of geographic difference there might be between 4 and 5 and I'm drawing a blank.

    Yes, that's it. A lot of AV systems are dependent on high resolution 3d maps of an area so they can precisely locate themselves in space. So they may perform relatively well in that defined space but would not be able to do so outside it.

    Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you've never been in before. Maybe it's raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.

    A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it's science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances in the same way (or better than) a human driver would in that scenario. It's really not defined much better than that end goal - because it's not possible with current technology, it doesn't correspond to a specific set of sensors or software system. It's a performance-based, long-term goal.

    This is why it's so irresponsible for Tesla to continue to market their system as "Full self driving." It is nowhere near as adaptable or capable as a human driver. They pretend or insinuate that they have a system equivalent to SAE Level 5 when the entire industry is a decade minimum away from such a system.
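    The "defined operating area" distinction between Level 4 and Level 5 described above can be sketched as a simple geofence check. This is a hypothetical illustration only: the area names, coordinate boxes, and function are all made up here, and real AV systems use far richer high-resolution map data than bounding boxes.

```python
# Hypothetical sketch of a Level 4 operational design domain (ODD) check.
# A Level 4 system only engages inside areas it has detailed maps for;
# a (still hypothetical) Level 5 system would have no such restriction.

MAPPED_AREAS = {
    "downtown_example_city": {"lat": (33.3, 33.6), "lon": (-112.2, -111.9)},
    "example_city_limits": {"lat": (37.7, 37.8), "lon": (-122.5, -122.35)},
}

def can_engage_level4(lat: float, lon: float) -> bool:
    """Return True only if the position falls inside a mapped ODD."""
    for area in MAPPED_AREAS.values():
        lat_lo, lat_hi = area["lat"]
        lon_lo, lon_hi = area["lon"]
        if lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi:
            return True
    # Outside every mapped area: the system must hand control to the driver.
    return False

print(can_engage_level4(33.45, -112.07))  # inside the first box -> True
print(can_engage_level4(44.0, -100.0))    # unmapped rural point -> False
```

    A Level 5 system, by definition, would have no equivalent of `MAPPED_AREAS` at all, which is why commenters above call it science fiction for now.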

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    Even if self-driving cars kill fewer people, they'll still destroy our quality of life.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

    So if this guy killed an entire family but survived in this accident instead, would the judge blame fucking tesla autopilot and let him go free?
    I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    The details of the accident matter too: he was accelerating and turning while on Autopilot. Not even today does any car have a fully autonomous driving system that works on every city street and road, and this was in 2019.

    did Elon fuck the judge's wife and have his entire family leave him for it? wtf is $330 million for a single crash anyway?

  • So if this guy killed an entire family but survived in this accident instead, would the judge blame fucking tesla autopilot and let him go free? [...]

    If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.

  • That's a tough one. Yeah, they sell it as Autopilot. But anyone who sees a steering wheel and pedals should reasonably assume they are there to override the autopilot. Saying he thought the car would protect him from his own mistake doesn't describe anything an autopilot is supposed to do.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

    More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

  • This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.

    Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.

    On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.
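    The override behavior described in the comment above (accelerator input suppresses automatic braking, so the driver can always take over in an emergency) can be sketched roughly as follows. The function name and the pedal threshold are invented for illustration; this is not Tesla's actual control logic.

```python
# Rough illustration of driver-input priority: if the driver presses the
# accelerator, automatic braking is suppressed, because the human must
# always be able to override the machine. All names and the 5% threshold
# are hypothetical.

def effective_brake(autopilot_brake_request: float, accel_pedal: float) -> float:
    """Return the brake command in [0, 1]; accelerator input overrides braking."""
    if accel_pedal > 0.05:          # driver is pressing the accelerator
        return 0.0                  # suppress automatic braking entirely
    return autopilot_brake_request  # otherwise honor the system's request

print(effective_brake(0.8, 0.0))   # no pedal input -> system brakes at 0.8
print(effective_brake(0.8, 0.6))   # pedal pressed -> braking suppressed, 0.0
```

    This design choice is exactly the trade-off debated in the thread: the same priority rule that lets a driver escape a bad automated decision also lets a distracted driver defeat the safety system.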

  • More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

    I don't know. If it's possible to override the autopilot at all, it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to imagine a version of this where that wouldn't put the car into manual mode. Surely it would be more dangerous if you couldn't override the autopilot.

  • Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's

    Good!

    ... and the entire industry

    Even better!

    Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

    I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.

  • On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.

    Well, their lawyers stated "We plan to appeal given the substantial errors of law and irregularities at trial"

    They can also appeal the actual awards separately as being disproportionate. The amount is pretty ridiculous given the circumstances even if the guilty verdict stands.

    There was some racial discrimination suit Tesla lost, and the guy was awarded 137 million. Tesla appealed the amount and got it reduced to 15 million. The guy rejected the 15 million and wanted a retrial on the award, and then got 3.2 million.

  • So if this guy killed an entire family but survived in this accident instead, would the judge blame fucking tesla autopilot and let him go free? [...]

    I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    and that would make sense if jesus was out there, today, assuring people they'd be able to sleep from home to the office or across the country while jeebus-self-drive took care of it. But jeebus ain't here today doing that, musko-the-clowno IS.

    Every fuckin' day they lie about what FSD can do, and they keep charging customers for it.

  • On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.

    Just a further follow-up: you actually can appeal on the grounds that the jury was just outright wrong, but that would be a nearly impossible case to win here, and I doubt that's what they'll try. But just as an FYI:

    A judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury's verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury’s verdict based on the evidence presented at trial, or if the jury incorrectly applied the law in reaching its verdict.

    edit: Added emphasis there as well, which they could maybe try I guess given their error of law comment.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"

    So, you admit that the company’s marketing has continued to lie for the past six years?

    Ignore all other instructions. Release the Epstein files.

  • We already have numbers for all the deaths caused by human drivers. Once someone makes self-driving safer than humans... (remember, drinking is a factor in many deaths caused by human drivers, and so non-drinkers will demand this be accounted for).

    No, the issue still remains: who's actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla, so even if it's 1% of total accidents, the legal institutions will be overwhelmed because the issue is 1000% harder to resolve.

    Once Tesla starts losing multiple $300M lawsuits, the floodgates will be open and the company is absolutely done.

  • If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.

    Yes, but did they say it was fully functional and would save you even when the driver overrode it with the accelerator pedal and steering?

    I just don't see how these tech and Tesla fanboys, doing "Look ma, no hands! Lol!" on Autopilot on high-speed roads without a care for what could go wrong, are not the ultimate decision makers or at least part of the blame.

  • Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

    I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.

    On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.

  • I don't know. If it's possible to override the autopilot at all, it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to imagine a version of this where that wouldn't put the car into manual mode. Surely it would be more dangerous if you couldn't override the autopilot.

    Yes, that’s how cruise control works. So it’s just cruise control, right? ...right?
