
Tesla loses Autopilot wrongful death case in $329 million verdict

Technology
  • So you're comparing, let's say, 2020 technology to the 1915 version of autopilot, and not the kind in the 2020s that is much more advanced. Yeah, what BS.

    Because it still basically does what they said. The only new addition to the autopilot system, besides maintaining speed, heading, and altitude, is the ability to set and follow a GPS heading and waypoints (for the purposes of this conversation). It will absolutely still fly into a mountain if not for other collision avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know a mountain is there. They don't move around much in short periods of time. Cars and pedestrians are another story entirely.

    There's a reason we still have air traffic controllers. Even then, pilots and air traffic control aren't infallible, and they have way more systems to make flying safe than the average car (yes, even the average Tesla).
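The aviation autopilot the comment above describes can be sketched as a toy model (purely illustrative, not real avionics code, and every name here is made up): it steers toward the next GPS waypoint and holds a set altitude, and has no terrain input at all.

```python
import math

def autopilot_command(pos, waypoint, set_altitude_ft):
    """Toy aviation autopilot: steer toward the next GPS waypoint and hold
    a fixed altitude. Illustrative only. Note there is no terrain input
    anywhere, which is why separate collision-avoidance systems are needed
    to keep it from flying into a mountain."""
    dx = waypoint[0] - pos[0]  # east offset
    dy = waypoint[1] - pos[1]  # north offset
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # compass bearing to waypoint
    return {"heading_deg": bearing, "altitude_ft": set_altitude_ft}
```

Whatever the elevation of the ground below, the commanded altitude never changes; avoidance has to come from other systems.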

  • What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether to call that stopped or just paused, but it is certainly not ignoring your acceleration.

    Well, yeah, you can call it “paused” if you want to. The cruise control definitely stays on though and resumes the set speed when you stop accelerating. It completely disengages when you brake though, so I’ve never thought of it as turning off when I accelerate, only when braking.
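The behavior these two comments describe (the set speed resumes after the accelerator is released, but braking disengages the system entirely) can be sketched as a toy state machine. This is a hypothetical model of conventional cruise control semantics, not any manufacturer's actual implementation.

```python
class CruiseControl:
    """Toy model of conventional cruise control semantics (illustrative only)."""

    def __init__(self):
        self.engaged = False
        self.set_speed = None

    def engage(self, speed):
        self.engaged = True
        self.set_speed = speed

    def target_speed(self, accelerator_pressed, brake_pressed, driver_speed):
        # Brake: the system disengages entirely and stays off.
        if brake_pressed:
            self.engaged = False
            self.set_speed = None
        if not self.engaged:
            return driver_speed
        # Accelerator: driver input temporarily overrides the set speed,
        # but the system stays engaged ("paused") and resumes afterwards.
        if accelerator_pressed:
            return max(driver_speed, self.set_speed)
        return self.set_speed
```

In this sketch, accelerating past the set speed never turns the system off; only the brake does, which is the distinction the thread is arguing about.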

  • No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up but return to the cruise control set speed when you release the accelerator. And further, Tesla doesn't call it cruise control and the founder of Tesla has been pretty heavily misleading about what the system is and what it does. So.

    Yeah, sure.

    You sound like one of those people who are the reason why we find the following warning on microwave ovens:

    WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.

    And on plastic bags:

    WARNING: DO NOT PLACE OVER HEAD.

    We both know that this is not what it's for. And it (model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.

    (Adaptive with lane assist and collision detection) Cruise control/autopilot on, foot on accelerator, no eyes on the road, no hands on the steering wheel. That's malice. There were visible, audible, and even tactile warnings, which this guy ignored.

    No current-day vehicle (or anything from 2019) says in its manual that this is intended use. As a matter of fact, all of them warn you not to do that.

    And I get that you hate Tesla/Musk, don't we all. But in this case only 1 person is responsible. The asshole driving it.


  • Nope. I'm correcting you because apparently most people don't even know how their cruise control works. But feel however you feel.

  • That's not a gut feeling. That's how every cruise control has worked since it was invented in the 70s. You press the brake or the accelerator? Cruise control (and autopilot) = off.

    That's not a gut feeling, that's what's stated in the manual.

    That's not how cruise control works, and I have never seen cruise control marketed in such a way that would make anyone believe it was smart enough to stop a car crash.

  • Farquaad said this, not Brannigan iirc

    I'm pretty sure it was both.

  • Look, we've only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s

  • So if this guy had killed an entire family but survived in this accident instead, would the judge blame fucking Tesla autopilot and let him go free?
    I might as well sue the Catholic Church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    The details of the accident also tell of him accelerating and turning while on autopilot. Not even today does any car have a fully autonomous autopilot driving system that works in all cities or on all roads, and this was in 2019.

    Did Elon fuck the judge's wife and have his entire family leave him for it? wtf is $330 million for a wrongful death crash anyway?

    wtf is $330 million for a wrongful death crash anyway?

    Maybe because someone died.

  • I mean, that's probably strictly true.

    There will always be accidents with tech, or anything. No matter how much planning, foresight, etc. goes into a product or service, humans cannot account for every scenario. Death is inevitable to some degree. That being said:

    Tesla point blank launched a half-assed product/project that just did not fully operate as specified. I'm all for self-driving vehicles, even through the bad stuff; even if it happened to me I'd still be for it. Given the early stage though, they should have focused so much more on their "rolling release updates" than they have.

    Of course things will need updating, and of course accidents will happen. But it's how they respond to them that makes them look evil vs. good. Their response has been lackluster. The market seems to think it's not a major issue though. There are more Teslas on the roads now than ever.

  • Release the unredacted Epstein files. The Epstein files didn't redact themselves.

    We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.

  • That is an issue.

    I just realized that I didn't finish the thought. Once self-driving is statistically safer, we will ban human drivers. In some places it will be by law, in some by the subtler pressure of insurance costs, in some by something else.

    We need to figure out liability, of course. I have ideas, but nobody will listen, so no point in writing them down.

    One challenge here is that we generally value human life pretty highly, at least speaking from a legal compensation point of view. So you can't sue Joe the drunk driver for killing your husband for 300 million, but you can do that to Tesla.

    In authoritarian states like China, maybe society can be forced into accepting a "for the greater good" sort of mentality, but it's not going to happen in the West imo.

  • There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

    I can't say for sure whether they are responsible in this case, because I don't know what the person driving assumed. But if they assumed that the "safety features" (in particular autopilot) would mitigate their recklessness, and Tesla can't prove they knew about the override of such features, then I'm not sure the court is wrong in this case. The fact that they haven't changed their wording or branding of autopilot (particularly calling it that) is kind of damning here.

    Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been using the pop culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can set any settings etc. But those other car manufacturers also don't claim the car can drive itself.

    To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

    I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

    What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

  • Surprisingly great outcome, and what a spot-on summary from the lead attorney:

    "Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans," said Brett Schreiber, lead attorney for the plaintiffs. "Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way. Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives," Schreiber said.

    We need more people like him in the world.

    The bullshit artists have had free rein over useful idiots for too long.

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self-driving. The status quo is people driving poorly. If you delay the rollout of self-driving beyond the point when it's better than people, then more people will die.

    This isn't really something you can be 'too cautious' about.

    Hopefully we can at least agree that as of right now, they're not being cautious enough.

  • While Tesla said that McGee was solely responsible, as the driver of the car, McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake," a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than it is in reality.

    Here’s the thing, Tesla’s marketing of autopilot was much different than the reality. Sure, the fine print might have said having your foot on the gas would shut down autopilot, but the marketing made autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

    My car, for instance, does not have self driving, but it will still brake if it detects I am going to hit something. Even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features.

    Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.

    Sure, the fine print might have said having your foot on the gas would shut down autopilot

    The car tells you it won't brake WHILE you do it.

    This isn't a fine print thing, it's an active warning that you are overriding it. You must be able to override it; it's a critical safety feature. You have to be able to override it to avoid any potential mistake it makes (critical or not). While a Level 2 system is active, human input > Level 2 input.

    It's there every time you do it. It might have looked a little different in 2019, but here's an example from the internet.

    (edit: clarity + overriding with the accelerator is also explained to every user before they can enable autopilot in an on screen tutorial of basic functionality)
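The priority rule described above ("human input > Level 2 input") can be sketched as a toy arbitration function. This is a hypothetical illustration of how a Level 2 driver-assist system defers to the driver, not Tesla's actual logic, and all names here are made up.

```python
def arbitrate(driver_throttle, assist_throttle, warn):
    """Toy Level 2 input arbitration (illustrative only): any explicit
    driver input wins over the driver-assist system's command. The system
    may warn, but it defers, including not braking automatically while
    the driver holds the accelerator."""
    if driver_throttle is not None:
        warn("Driver override active: system will not brake.")
        return driver_throttle
    return assist_throttle
```

The design choice being debated in the thread is exactly this one: the override is mandatory so the driver can correct the system's mistakes, and the warning is the system's only recourse while it is overridden.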

  • "Today’s verdict is wrong"
    I think a certain corporation needs to be reminded to have some humility toward the courts
    Corporations should not expect mercy when they say things a human would never get away with

    It's all about giving something for useful idiots to latch on to.

    These people know most of us can't think for ourselves, so they take full advantage of it.


  • Your post reeks of ignorance.

    I don't think you have the capability to understand what's going on.

  • I'm kinda torn on this, in principle, not in this specific case. If your AI performs on par with an average human and there is no known flaw at fault, I don't think you should be at fault either.

    If the company is penalized for being at fault, then they will have reasons to try better in the future.

    I don't even give a flying fuck about how autopilot compares to the average driver. Tesla has the resources to make its technology better, so we as customers should all hold them to the highest possible standard. Anything less just outs you as a useful idiot; you're willing to accept less so someone richer than you can have more.

  • How about fucking not claiming it's FSD, just having ACC and lane keep, and then collecting data and training on that? Also testing on a closed circuit.

    Autopilot is ACC which is what the case was about here.

  • You are defending Tesla and being disingenuous about it.

    The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.

    You sound like a psychopath.

    The hyperbole is ridiculous here and it makes you sound like a psychopath.