
Tesla loses Autopilot wrongful death case in $329 million verdict

  • Yes, that’s how cruise control works. So it’s just cruise control, right? …right?

    Normally, cruise control isn't turned off by acceleration. It's turned off by braking.

  • On the face of it, I agree. But 12 jurors who heard the whole story, probably for days or weeks, disagree with that.

    Maybe the 12 jurors just really hate Felon Husk and/or Tesla's lawyers.

  • I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...

    Whoa there, pardner. Boeing has people murdered when they go against the company. Tesla only kills customers (so far, at least).

  • Well, the Obama administration had published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did upon taking office for his first term. I was working in the AV industry at the time.

    That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there's a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than for running over people.

    But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. They also continued to market the technology - first as "Autopilot" and later as "Full Self Driving" - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 Automation in the SAE framework, is science fiction at this point, the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning on certain types of road infrastructure.
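    To make the framework mentioned above concrete, here's a rough paraphrase of the SAE levels as a lookup table — a sketch, not the official J3016 wording:

```python
# Rough paraphrase of the SAE J3016 driving-automation levels
# referenced above (illustrative summaries, not the official text).
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed (e.g. cruise control)",
    2: "Partial automation: steering AND speed, driver must supervise",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: no human needed within a limited domain (geofence, road type)",
    5: "Full automation: drives anywhere a capable human driver could",
}

def is_science_fiction(level: int) -> bool:
    """Per the comment above: Level 5 remains out of reach today,
    and other AV companies are still striving for Level 4."""
    return level >= 5
```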

    Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn't. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

    I was working in the AV industry at the time.

    How is your working in the audio/video industry relevant? ...or maybe you mean adult videos?

  • I’ve never had one that turns it off if I accelerate.

    They’ve all shut off if I tapped the brakes though.

    Yep, can confirm it works that way for my car too. If I press the gas pedal enough I can go faster than the set cruise speed (for example, if I want to pass someone). If I lightly tap the brakes, it turns off almost immediately.
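    The behavior described in this thread can be summed up in a toy model — an invented function for illustration, not any real car's firmware:

```python
def cruise_control_state(engaged, brake_pressed, gas_pressed,
                         speed, set_speed):
    """Toy model of conventional cruise control as described above:
    braking cancels it outright, while the accelerator only
    temporarily overrides it (e.g. to pass someone)."""
    if not engaged:
        return "off"
    if brake_pressed:
        return "cancelled"           # tapping the brakes disengages cruise
    if gas_pressed and speed > set_speed:
        return "driver override"     # cruise stays armed; driver goes faster
    return "holding set speed"
```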

  • A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

    So, you admit that the company’s marketing has continued to lie for the past six years?

    Life-saving technology... saving lives from an immature, flawed technology you created and haven't developed or tested enough? Hmm.

  • Even when the evidence is as clear as day, the company has somehow found a way to bully cases into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    Fuck that, I'm not a beta tester for a company. What happened to having a good product and then releasing it? Not "oh, let's see what happens."

  • Did you read it tho? Tesla is at fault for this guy overriding the safety systems by pushing down on the accelerator and looking for his phone at the same time?

    I do not agree with Tesla often. Their marketing is bullshit, their cars are low quality pieces of shit. But I don't think they should be held liable for THIS idiot's driving. They should still be held liable when Autopilot itself fucks up.

    The problem is how Musk and Tesla have sold their self driving and full self driving and whatever name they call the next one.

  • I was working in the AV industry at the time.

    How is your working in the audio/video industry relevant? ...or maybe you mean adult videos?

    Or automotive vision.

  • Not to defend Tesla here, but how does the technology become "good and well ready" for road testing if you're not allowed to test it on the road? There are a million different driving environments in the US, so it'd be impossible to test all these scenarios without a real-world environment.

    Cars with humans behind the wheel, paying attention to correct the machine. Not this "let's remove humans as quickly as possible" BS that we have now. I know they don't like the cost.

  • All they really need to do is make self-driving cars safer than your average human driver.

    Which they have not done and won't. You have to do this in every condition. I wonder why they always test this shit out in Texas and California?

  • I feel like calling it Autopilot is already risking liability.

    From an aviation point of view, Autopilot is pretty accurate to the original aviation reference. The original autopilot, released in 1912, would simply hold an aircraft at a specified heading and altitude without human input, operating the control surfaces to keep it on its directed path. However, if you were at an altitude that would let you fly into a mountain, the autopilot would do exactly that. The current Tesla Autopilot is pretty close to that level of functionality, with the added feature of maintaining a set speed. Note that modern aviation autopilots are much more functional: on specific models they can even take off and land the airplane.
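    That 1912-style hold is essentially just a feedback loop. A toy proportional controller, with all names and numbers invented for illustration:

```python
def altitude_hold(current_alt, target_alt, gain=0.1):
    """Toy proportional controller: returns an elevator command that
    nudges the aircraft toward the target altitude. Like the 1912
    original, it knows nothing about terrain -- if target_alt is below
    a mountain ridge, it will happily fly you into the ridge."""
    error = target_alt - current_alt
    # Clamp the command to a plausible control-surface range.
    return max(-1.0, min(1.0, gain * error))
```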

    Full Self Driving is just audacious. There’s a reason other companies with similar technology have gone with things like driving assistance. This has probably had lawyers at Tesla sweating bullets for years.

    I agree. I think Musk always intended FSD to live up to the name, and perhaps named it that aspirationally, which is all well and good. But most consumers don't share that mindset: if you call it that right now, they assume it has that functionality when they buy it today, which it doesn't. I agree with you that it was a legal liability waiting to happen.

    So you're comparing, let's say, 2020 technology to the 1912 version of autopilot and not the kind from the 2020s that is much more advanced. Yeah, what BS.

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    It's hard to prove that point, though. Rolling out self driving may just make car usage go up and negate rate decreases by increasing overall usage.

  • Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

    The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

    Not to mention tone-deaf. Maybe you shouldn't talk about life-saving technology when your technology anti-saved a life....

    And that's ignoring the fact that they're using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

  • I don't know. If it is possible to override the autopilot then it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to really imagine this scenario where that wouldn't result in the car going into manual mode. Surely would be more dangerous if you couldn't override the autopilot.

    We can bet on a lot, but when you're betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.

  • That's a tough one. Yeah, they sell it as Autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Protecting him from his own mistakes isn't something an autopilot does.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

    Yeah, the problem is that the US has no consumer protections, and somehow this court is trying to make up for that. But it shouldn't happen in court cases like this one, where the driver was clearly not fit to drive a car.

  • Or automotive vision.

    Thank you. I seriously didn't understand what the field was.

  • And that is the point: Tesla's "AI" performs nowhere near human levels. Actual full self-driving is Level 5 on the SAE scale, and Tesla's systems are around Level 2 out of those 5.

    Tesla has claimed to have full self-driving for about a decade now, and it has been and continues to be a complete lie. Musk claimed long ago that he could drive a Tesla autonomously from LA to NY, while in reality it has trouble leaving the first parking lot.

    I'm unsure how much has changed there, but since Elmo Musk spends more time lying about everything than actually improving his products, I would not hold my breath.

    The original comment is perpetuating the lie, intentional or not. It relies on fundamentally flawed soundbites that are precisely crafted as propaganda, not to be informative or truthful at all.

    Right off the bat they're saying "in principle", which presumes the baseline lie that "full self driving" has been achieved. Then they strengthen their argument by reinforcing the idea that it's functionally equivalent to humans (i.e., generalized intelligence). Then they cap it off with "no known flaw". Pure lies.

    Of course they've hedged by implying it's an opinion, but they strongly suggest it's the most correct one anyway.

    I’m unsure how much has changed

    This demonstrates exactly how effective the propaganda is. They set up scenarios where nobody honest will refute their bullshit with certainty, even though we know there is no existing system on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or other. But that's fundamentally not what they are telling the lay audience. They're lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. This is not true. Period. There is no existing system that does this, and there will not be in the foreseeable future.

    The fact of the matter is that technological discussion is more about this kind of propaganda than about the technology itself. If that weren't the case, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. That leads to uncertainty and confusion, which leads to preventable deaths.

  • This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.

    Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.

    While Tesla said that McGee was solely responsible as the driver of the car, McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake," a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than it is in reality.

    Here’s the thing: Tesla’s marketing of Autopilot was much different from the reality. Sure, the fine print might have said that having your foot on the gas would shut down Autopilot, but the marketing made Autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

    My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

    Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.