Tesla loses Autopilot wrongful death case in $329 million verdict

  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    I don't disagree, but I believe the suit was over how Tesla misrepresented assistive technology as fully autonomous, as the name Autopilot implies.

  • Just a small correction - traditional cruise control in cars only maintains speed, whereas autopilot in planes maintains speed, altitude, and heading, which is exactly why Tesla calling their system "Autopilot" is such dangerous marketing that creates unrealistic expectations for drivers.

    I'm not sure what you're correcting. The Autopilot feature has adaptive cruise control, lane keeping assist, and auto steering.

    Adaptive cruise control will brake to maintain a distance from the vehicle in front of it but otherwise maintain the set speed; lane keeping assist will keep the vehicle in its lane/prevent it from drifting out of its lane; and combined with auto steering it will keep the car centered in the lane.
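
    To make that division of labor concrete, the behavior described above boils down to something like the following toy Python sketch. All names, gains, and thresholds here are invented for illustration; this is not any manufacturer's actual code.

    ```python
    def acc_target_speed(set_speed, lead_distance, lead_speed, min_gap=40.0):
        """Toy adaptive cruise control: hold the driver's set speed unless
        a lead vehicle is closer than the desired gap, then slow to match it."""
        if lead_distance is None or lead_distance > min_gap:
            return set_speed               # nothing ahead: plain cruise control
        return min(set_speed, lead_speed)  # car ahead: don't outrun it

    def lane_center_steering(lane_offset_m, gain=0.5):
        """Toy lane centering: steer back toward the lane center in
        proportion to how far the car has drifted from it."""
        return -gain * lane_offset_m
    ```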

    I specifically explained that a plane's autopilot does those things (maintains speed, altitude, and heading), and that people don't know this is all it does. It doesn't by itself avoid obstacles or account for weather, etc. It'd fly right into another plane if one were occupying that airspace. It won't react to weather events like wind shear (which can cause a plane to lose altitude extremely quickly) or a hurricane. If there's an engine problem and an engine loses power, it won't attempt a restart. It doesn't brake. It can't land a plane.
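
    For contrast, a bare-bones autopilot really is just a handful of hold loops chasing setpoints. Here's a hypothetical Python sketch (invented gains, nothing like real avionics code); note everything that's absent - no traffic, no weather, no engine health:

    ```python
    def autopilot_step(state, target, k_spd=0.1, k_alt=0.05, k_hdg=0.8):
        """Toy autopilot: three independent hold loops. It only chases its
        speed/altitude/heading setpoints - it has no notion of obstacles."""
        throttle = k_spd * (target["speed"] - state["speed"])
        pitch = k_alt * (target["altitude"] - state["altitude"])
        # Turn the short way around the compass toward the target heading.
        hdg_err = (target["heading"] - state["heading"] + 180) % 360 - 180
        bank = k_hdg * hdg_err
        return {"throttle": throttle, "pitch": pitch, "bank": bank}
    ```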

    But Musk made claims that Tesla's Autopilot would drive the vehicle for you without human intervention, and people assume that autopilot (in the pop culture sense) does a lot more than it actually does. This is what I'm trying to point out.

  • The original comment is perpetuating the lie, intentional or not. It relies on fundamentally flawed soundbites that are precisely crafted as propaganda, not to be informative or truthful at all.

    Right off the bat they say "in principle," which presumes the baseline lie that "full self driving" has been achieved. Then they strengthen their argument by reinforcing the idea that it's functionally equivalent to humans (i.e., generalized intelligence). Then they cap it off with "no known flaw." Pure lies.

    Of course they've hedged by implying it's an opinion, but they strongly suggest it's the most correct one anyway.

    "I'm unsure of how much has changed"

    This demonstrates exactly how effective the propaganda is. They set up scenarios where nobody honest will refute their bullshit with certainty, even though we know no existing system is on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or whatever. But that's fundamentally not what they are telling a layman audience. They're lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. This is not true. Period. There is no existing system that does this, and there will not be in the foreseeable future.

    The reality is that discussion around this technology is more about this kind of propaganda than the technology itself. If it weren't, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. That leads to uncertainty and confusion, which leads to preventable deaths.

    I think the original commenter simply doesn't know how wrong he is

  • I don't disagree, but I believe the suit was over how Tesla misrepresented assistive technology as fully autonomous, as the name Autopilot implies.

    Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.

    I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    Well, if only Tesla hadn't invested tens of millions in marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers never read the fine print. They buy and use shit off of vibes. False marketing can and does kill.

  • Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.

    I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

    Agree with you here. Your point reminds me of the case below.
    The TL;DR is that the pilots were using their laptops to look at schedules, IIRC, and overflew their destination. It's long been speculated that they were watching a movie.

  • I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...

    Part of the reason air travel is as safe as it is is that governments held both airlines and manufacturers accountable for plane crashes and other air travel incidents, especially those leading to death or expensive property damage. You have to have significant training and flight hours to be a commercial pilot.

    In the cases where Boeing has been found (through significant investigation) to be liable for death or injury, they have been held accountable. That's literally why the 737 MAX fleet was grounded worldwide, and literally why they were forced to add further safety measures after the door plug failure, which was due to their negligence as a manufacturer.

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    Life-saving technology? BS. Their Autopilot is half-assed.

  • Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.

    I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

    Here's my problem with all of the automation manufacturers are adding to cars: even stuff below the Autopilot level is potentially a problem - things like adaptive cruise control come to mind.

    If there's some kind of bug in that adaptive cruise control that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:

    But the responsibility for safe driving is on the driver...

    And how do we know there isn't some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems...ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can't fucking disable?

    But the responsibility for safe driving is on the driver...

    In other words, "We bear no responsibility!" So I have to pay for these "features," and the manufacturer will deny any responsibility if one of them fails and causes a crash. It's always your fault as the driver, no matter what. The company rolls this shit out to us; we no longer have the choice to buy a new car without it, and they don't even trust it enough to stand behind it.

    Maybe you'll get lucky and enough issues will happen that gov't regulators will look into it (not in the US any more, of course)...but probably not. You'll be blamed, and you'll pay higher insurance, and that will be that.

    So now I have to worry not only about other drivers and my own driving, but I also have to be alert to the car doing something unexpected as well. Which has happened, when all this "smart" technology has misunderstood a situation, like slamming on the brakes for a car in another lane. I've found I hate having to fight my own car.

    Obviously, I very much dislike driving our newer car. It's primarily my wife's car, and I only drive it once or twice a week, fortunately.

  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.

  • More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

    Never said they weren't wrong for lying. Just that this case seems a poor match for showing that.

  • There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

    I can't say for sure whether they are responsible in this case, because I don't know what the person driving assumed. But if they assumed that the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove they knew those features could be overridden, then I'm not sure the court is wrong here. The fact that Tesla hasn't changed its wording or branding of Autopilot (particularly calling it that) is kind of damning.

    Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been trading on the pop culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can change any settings. But those other manufacturers also don't claim the car can drive itself.

    You mention other cars overriding your input. The most common case is automatic braking when the car sees you are about to hit something. But my understanding is that it kicks in when it is already too late to avoid the crash. So it isn't involved in decision-making about driving; it's just a safety feature that's only relevant once a crash is imminent. Just as you don't ram another car because you have a seatbelt, your driving choices aren't affected by this feature's presence.
    The other common one will try to remind you to stay in your lane. But it isn't trying to override you. It rumbles the wheel and turns it a bit in the direction you should go; if you resist at all, it stops. It is only meant for when you have let go of the wheel or fallen asleep.
    So I don't know of anything that overrides driver input completely, outside of it already being too late to avoid a crash.
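
    Roughly, the difference between a last-resort intervention and an actual driving decision can be sketched like this (a toy Python illustration; every threshold here is invented):

    ```python
    def aeb_should_brake(distance_m, closing_speed_mps, ttc_threshold_s=1.0):
        """Toy automatic emergency braking: fires only once the time-to-
        collision drops below a last-resort threshold, i.e. when it's
        effectively too late for the driver to avoid the crash."""
        if closing_speed_mps <= 0:
            return False  # not closing on anything
        return distance_m / closing_speed_mps < ttc_threshold_s

    def lane_assist_torque(drift_m, driver_torque_nm, max_assist_nm=0.2):
        """Toy lane-keep assist: a gentle corrective nudge that yields
        completely the moment the driver applies their own steering input."""
        if abs(driver_torque_nm) > 0.05:  # driver is steering: back off
            return 0.0
        nudge = -0.3 * drift_m
        return max(-max_assist_nm, min(max_assist_nm, nudge))
    ```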

  • This isn't really something you can be 'too cautious' about.

    Hopefully we can at least agree that as of right now, they're not being cautious enough.

    As an exercise to remove the bias from this, replace self driving cars with airbags. In some rare cases they might go off accidentally and do harm that wouldn't have occurred in their absence. But all cars have airbags. More and more with every generation. If you are so cautious about accidental detonations that you choose not to install them in your car, then you're being too cautious.

    I can't agree that they're not being cautious enough. I didn't even read the article. I'm just arguing about the principle. And I don't have a clue what the right penalty would be. I would need to be an actuary with access to lots of data I don't have to figure out the right number to provide the right deterrent.

  • Fuck that, I'm not a beta tester for a company. What happened to having a good product and then releasing it, instead of "let's see what happens"?

    It's not that simple. Imagine you're dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can't have it because "we're not releasing it until we know it's good".

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    There's no way this decision stands; it's absolutely absurd. The guy dropped his phone and was looking down, reaching around for it, when he crashed. He wasn't supervising Autopilot like you are required to.

  • Life-saving technology? BS. Their Autopilot is half-assed.

    Have you even read what happened? The driver dropped his phone and wasn't watching the road; instead he was rummaging around on the floor looking for it while keeping his foot on the accelerator, manually accelerating. Autopilot was supposedly overridden by the manual acceleration.

  • Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.

    What claims did he make about autopilot that suggested otherwise?

    Autopilot is not FSD.

  • Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

    The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

    Source please?

  • You mention other cars overriding your input. The most common case is automatic braking when the car sees you are about to hit something. But my understanding is that it kicks in when it is already too late to avoid the crash. So it isn't involved in decision-making about driving; it's just a safety feature that's only relevant once a crash is imminent. Just as you don't ram another car because you have a seatbelt, your driving choices aren't affected by this feature's presence.
    The other common one will try to remind you to stay in your lane. But it isn't trying to override you. It rumbles the wheel and turns it a bit in the direction you should go; if you resist at all, it stops. It is only meant for when you have let go of the wheel or fallen asleep.
    So I don't know of anything that overrides driver input completely, outside of it already being too late to avoid a crash.

    Some cars brake for you as soon as they think you're going to crash (if you have your foot on the accelerator, or even on the brake if the car doesn't believe you'll be able to stop in time). Fords especially will do this, usually in relation to adaptive cruise control and reverse brake assist. You can turn that setting off, I believe, but it is meant to prevent a crash or collision. In fact, Ford's BlueCruise assisted driving feature was phantom braking to the point that there was a recall, because it was braking with nothing obstructing the road. I believe they also just updated it, in the 1.5 update this year, so that pressing the accelerator will override BlueCruise without disengaging it.
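
    That "override without disengaging" versus "disengage on override" distinction is a small but consequential design choice. Here's a hypothetical Python sketch of the two policies (invented names and values, not Ford's or Tesla's actual logic):

    ```python
    def resolve_throttle(engaged, pedal, cruise_cmd, disengage_on_override):
        """Toy comparison of two override policies. Returns (throttle, engaged).
        With disengage_on_override=True, touching the accelerator kicks the
        system off entirely; otherwise the driver's input temporarily wins
        while the system stays armed in the background."""
        if not engaged:
            return pedal, False
        if pedal > 0.0:
            if disengage_on_override:
                return pedal, False  # system drops out; driver is on their own
            return max(pedal, cruise_cmd), True  # driver wins, system stays armed
        return cruise_cmd, True
    ```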

    But I thought you were correcting me about autopilot in planes, which is why I was confused.