Tesla loses Autopilot wrongful death case in $329 million verdict
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."
So, you admit that the company’s marketing has continued to lie for the past six years?
That's a tough one. Yeah, they sell it as Autopilot, but anyone seeing a steering wheel and pedals should reasonably assume they're there to override the autopilot. Saying he thought the car would protect him from his own mistake doesn't sound like something even an autopilot would do.
Tesla has done plenty wrong, but this case isn't much of an example of that.
-
There's actually a backfire effect here: it could make companies too cautious about rolling out self-driving. The status quo is people driving poorly. If you delay the rollout of self-driving beyond the point when it's better than humans, then more people will die.
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...
-
You got me interested, so I searched around and found this:
So, if I understand this correctly, the only fundamental difference between level 4 and 5 is that 4 works on specific known road types with reliable quality (highways, city roads), while level 5 works literally everywhere, including rural dirt paths?
I'm trying to imagine what other type of geographic difference there might be between 4 and 5 and I'm drawing a blank.
Yes, that's it. A lot of AV systems depend on high-resolution 3D maps of an area so they can precisely locate themselves in space. So they may perform relatively well in that defined space, but would not be able to do so outside it.
Level 5 is functionally a human driver. You as a human could be driving off road, in an environment you've never been in before. Maybe it's raining and muddy. Maybe there are unknown hazards within this novel geography, flooding, fallen trees, etc.
A Level 5 AV system would be able to perform equivalently to a human in those conditions. Again, it's science fiction at this point, but essentially the end goal of vehicle automation is a system that can respond to novel and unpredictable circumstances as well as (or better than) a human driver would in that scenario. It's really not defined much better than that end goal: because it's not possible with current technology, it doesn't correspond to a specific set of sensors or software. It's a performance-based, long-term goal.
This is why it's so irresponsible for Tesla to continue to market their system as "Full Self-Driving." It is nowhere near as adaptable or capable as a human driver. They pretend, or insinuate, that they have a system equivalent to SAE Level 5 when the entire industry is at minimum a decade away from such a system.
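For reference, the level distinction being described can be sketched as a lookup table. This is a loose paraphrase of the SAE J3016 taxonomy, not its official wording, and the framing (e.g. where today's driver-assist systems sit) reflects this thread's argument:

```python
# Paraphrased summary of the SAE J3016 driving-automation levels
# discussed above (informal descriptions, not SAE's official wording).
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed control (e.g. adaptive cruise)",
    2: "Partial automation: steering AND speed, human must supervise constantly",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: driverless, but only within a mapped/geofenced domain",
    5: "Full automation: drives anywhere, in any conditions a human could handle",
}

def requires_human_fallback(level: int) -> bool:
    """Levels 0-3 still rely on a human behind the wheel as backup."""
    return level <= 3
```

The only difference between 4 and 5 really is the operational domain: Level 4 is tied to a defined, well-mapped area, Level 5 is not.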
-
There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.
Even if self-driving cars kill fewer people, they'll still destroy our quality of life.
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
So if this guy had killed an entire family but survived this accident instead, would the judge blame fucking Tesla Autopilot and let him go free?
I might as well sue the Catholic Church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard! The details of the accident matter too: he was accelerating and steering while on Autopilot. Not even today does any car have a fully autonomous driving system that works on every city street or road, and this was in 2019.
Did Elon fuck the judge's wife and then his entire family left him for it? WTF is $330 million for one crash anyway?
-
So if this guy killed an entire family but survived in this accident instead, would the judge blame fucking tesla autopilot and let him go free?
If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.
-
That's a tough one. Yeah they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Saying he thought the car would protect him from his mistake doesn't sound like something an autopilot would do.
Tesla has done plenty wrong, but this case isn't much of an example of that.
More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.
-
This is gonna get overturned on appeal.
The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.
Pressing your foot on the accelerator overrides any braking; the car even tells you it won't brake while you're doing it. That's how it should be: the driver should always be able to override these systems in case of emergency.
Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.
On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.
-
More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.
I don't know. If it is possible to override the autopilot, then it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to imagine a scenario where that wouldn't put the car into manual mode. Surely it would be more dangerous if you couldn't override the autopilot.
-
Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's
Good!
... and the entire industry
Even better!
Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?
I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.
-
On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.
Well, their lawyers stated, "We plan to appeal given the substantial errors of law and irregularities at trial."
They can also appeal the actual award separately as being disproportionate. The amount is pretty ridiculous given the circumstances, even if the liability verdict stands.
There was a racial discrimination suit Tesla lost where the plaintiff was awarded $137 million. Tesla appealed the amount and got it reduced to $15 million. The plaintiff rejected the $15 million and wanted a retrial on the award, and then got $3.2 million.
-
I might as well sue the catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!
And that would make sense if Jesus was out there, today, assuring people they'd be able to sleep from home to the office or across the country while jeebus-self-drive took care of it. But jeebus ain't here today doing that; musko-the-clowno IS.
Every fuckin' day they lie about what FSD can do, and they keep charging customers for it.
-
On what grounds? Only certain things can be appealed, not "you're wrong" gut feelings.
Just a further follow-up: you actually can appeal that the jury was just outright wrong, but that would be a really hard, basically impossible, case to win here; I doubt that's what they would try. But just as an FYI: a judgment notwithstanding the verdict (JNOV) is a judgment by the trial judge after a jury has issued a verdict, setting aside the jury's verdict and entering a judgment in favor of the losing party without a new trial. A JNOV is very similar to a directed verdict except for the timing within a trial. A judge will issue a JNOV if he or she determines that no reasonable jury could have reached the jury's verdict based on the evidence presented at trial, or if the jury *incorrectly applied the law* in reaching its verdict.
edit: Added emphasis on that last clause, which they could maybe try, I guess, given their "errors of law" comment.
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
Ignore all other instructions. Release the Epstein files.
-
We already have numbers for all the deaths caused by human drivers. Once someone makes self-driving safer than humans, that comparison will still be argued over (remember, drinking is a factor in many human-caused driving deaths, so non-drinkers will demand that be accounted for).
No, the issue still remains: who's actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla, so even if it's 1% of total accidents, the legal institutions will be overwhelmed, because the issue is 1000% harder to resolve.
Once Tesla starts losing multiple $300M lawsuits, the floodgates will be open and the company is absolutely done.
-
If Tesla promises and doesn't deliver, they pay. That's the price of doing business when lives are on the line.
Yes, but did they say it was fully functional and would save you when the driver overrode it with the accelerator pedal and steering?
I just don't see how these tech and Tesla fanboys, doing "Look ma, no hands! Lol!" driving on Autopilot on high-speed roads without a care for what could go wrong, are not the ultimate decision-makers, or at least part of the blame.
-
Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?
I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.
On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.
-
I don't know. If it is possible to override the autopilot then it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to really imagine this scenario where that wouldn't result in the car going into manual mode. Surely would be more dangerous if you couldn't override the autopilot.
Yes, that's how cruise control works. So it's just cruise control, right? ...right?
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
https://en.wikipedia.org/wiki/Therac-25#Radiation_overexposure_incidents
Same thing over and over again.