Tesla loses Autopilot wrongful death case in $329 million verdict
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."
So, you admit that the company’s marketing has continued to lie for the past six years?
Life-saving technology? BS. Their Autopilot is half-assed.
-
Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.
I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.
Here's my problem with all of the automation the manufacturers are adding to cars. Even the stuff below Autopilot level is potentially a problem - things like adaptive cruise come to mind.
If there's some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:
But the responsibility for safe driving is on the driver...
And how do we know there isn't some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems...ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can't fucking disable?
But the responsibility for safe driving is on the driver...
In other words, "We bear no responsibility!" So, I have to pay for these "features" and the manufacturer will deny any responsibility if one of them fails and causes a crash. It's always your fault as the driver, no matter what. The company rolls this shit out to us; we no longer have the choice to buy a new car without it, and they don't even trust it enough to stand behind it.
Maybe you'll get lucky and enough issues will happen that gov't regulators will look into it (not in the US any more, of course)...but probably not. You'll be blamed, and you'll pay higher insurance, and that will be that.
So now I have to worry not only about other drivers and my own driving, but also stay alert in case the car does something unexpected. Which has happened, when all this "smart" technology misunderstood a situation, like slamming on the brakes for a car in another lane. I've found I hate having to fight my own car.
Obviously, I very much dislike driving our newer car. It's primarily my wife's car, and I only drive it once or twice a week, fortunately.
-
Good that the car manufacturer is also being held accountable.
But...
In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.
That's on him. 100%
McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."
Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!
Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.
-
More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.
Never said they weren't wrong for lying. Just that this case seems a poor match for showing that.
-
There are other cars on the market with technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed their branding or wording, which is a lot of the problem here.
I can't say for sure whether they are responsible in this case because I don't know what the person driving assumed. But if they assumed that the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove they knew about the override of such features, then I'm not sure the court is wrong in this case. The fact that they haven't changed their wording or branding of Autopilot (particularly calling it that) is kind of damning here.
Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been trading on the pop culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can set any settings, etc. But those other car manufacturers also don't claim the car can drive itself.
You mention other cars overriding your input. The most common is auto braking when the car sees you are going to hit something. But my understanding is that it kicks in when it is already too late to avoid the crash. So it isn't something involved in decision-making about driving; it is just a safety feature only relevant in the case of a crash. Just as you don't ram another car because you have a seatbelt, your driving choices aren't affected by this feature's presence.
The other common one will try to remind you to stay in your lane. But it isn't trying to override you. It rumbles the wheel and turns it a bit in the direction you should go. If you resist at all, it stops. It is only meant for when you have let go of the wheel or fallen asleep.
So I don't know of anything that overrides driver input completely outside of being too late to avoid a crash.
-
This isn't really something you can be 'too cautious' about.
Hopefully we can at least agree that as of right now, they're not being cautious enough.
As an exercise to remove the bias from this, replace self driving cars with airbags. In some rare cases they might go off accidentally and do harm that wouldn't have occurred in their absence. But all cars have airbags. More and more with every generation. If you are so cautious about accidental detonations that you choose not to install them in your car, then you're being too cautious.
I can't agree that they're not being cautious enough. I didn't even read the article. I'm just arguing about the principle. And I don't have a clue what the right penalty would be. I would need to be an actuary with access to lots of data I don't have to figure out the right number to provide the right deterrent.
-
Fuck that, I'm not a beta tester for a company. What happened to having a good product and then releasing it? Not "oh, let's see what happens."
It's not that simple. Imagine you're dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can't have it because "we're not releasing it until we know it's good".
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
There’s no way this decision stands; it’s absolutely absurd. The guy dropped his phone and was looking down, reaching around for it, when he crashed. He wasn’t supervising Autopilot, as you are required to.
-
Life-saving technology? BS. Their Autopilot is half-assed.
Have you even read what happened? The driver dropped his phone and wasn’t watching the road; instead he was rummaging around on the ground looking for it, while having his foot on the accelerator, manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.
-
Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.
What claims did he make about autopilot that suggested otherwise?
Autopilot is not FSD.
-
Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.
The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.
Source please?
-
You mention other cars overriding your input. The most common is auto braking when the car sees you are going to hit something. But my understanding is that it kicks in when it is already too late to avoid the crash. So it isn't something involved in decision-making about driving; it is just a safety feature only relevant in the case of a crash. Just as you don't ram another car because you have a seatbelt, your driving choices aren't affected by this feature's presence.
The other common one will try to remind you to stay in your lane. But it isn't trying to override you. It rumbles the wheel and turns it a bit in the direction you should go. If you resist at all, it stops. It is only meant for when you have let go of the wheel or fallen asleep.
So I don't know of anything that overrides driver input completely outside of being too late to avoid a crash.
Some cars brake for you as soon as they think you're going to crash (if you have your foot on the accelerator, or even on the brake if the car doesn't believe you'll be able to stop in time). Fords especially will do this, usually in relation to adaptive cruise control and reverse brake assist. You can turn that setting off, I believe, but it is meant to prevent a crash or collision. In fact, Ford's BlueCruise assisted-driving feature was phantom braking to the point that there was a recall, because it was braking with nothing obstructing the road. I believe they also just updated it, in the 1.5 update this year, so that pressing the accelerator overrides BlueCruise without disengaging it.
But I was thinking you were correcting me about autopilot for planes and I was confused.
-
What claims did he make about autopilot that suggested otherwise?
Autopilot is not FSD.
https://electrek.co/2016/07/05/elon-musk-tesla-autopilot-save-life/
https://electrek.co/2018/04/13/tesla-autopilot-never-perfect-10x-safer-than-human-elon-musk/
Considering Elon's been using the two interchangeably, and that "Autopilot" gives the lay person a false equivalence to what actual pilots use in planes, can you really blame them?
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. [...]"
So, you admit that the company’s marketing has continued to lie for the past six years?
Whether or not it's the guy's fault, I'm just glad Elon is losing money.
-
What claims did he make about autopilot that suggested otherwise?
Autopilot is not FSD.
Like everything he has ever said about it? lol wtf is this comment.
-
https://electrek.co/2016/07/05/elon-musk-tesla-autopilot-save-life/
https://electrek.co/2018/04/13/tesla-autopilot-never-perfect-10x-safer-than-human-elon-musk/
Considering Elon's been using the two interchangeably, and that "Autopilot" gives the lay person a false equivalence to what actual pilots use in planes, can you really blame them?
Thanks for responding to this with links so that I didn't have to.
-
Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.
Absolutely. I hope he and the company burn in hell, but I do not want to start giving drivers who kill people a free pass to say "well, it was the car's fault!"
"Autopilot", especially in Tesla cars, is beta software at best, and this feature should never have been allowed on public roads. In that sense, the transportation ministry that allowed it also has blood on its hands.
-
Have you even read what happened? The driver dropped his phone and wasn’t watching the road; instead he was rummaging around on the ground looking for it, while having his foot on the accelerator, manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.
Except Autopilot is supposed to disengage when hands are taken off the steering wheel - it failed.
-