
Tesla loses Autopilot wrongful death case in $329 million verdict

Technology
  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    It's hard to prove that point, though. Rolling out self-driving may just make car usage go up, and the increase in overall usage could negate the drop in the fatality rate.
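
    A toy calculation makes the parent's point concrete (a minimal sketch; every number below is an illustrative assumption, not real data):

    ```python
    # Toy model: total deaths = fatality rate per mile * miles driven.
    # All figures are made-up assumptions for illustration only.
    human_rate = 1.3e-8      # assumed deaths per mile, human drivers
    robot_rate = 0.65e-8     # assume self-driving halves the per-mile rate

    miles_now = 3.2e12       # assumed annual miles driven today
    miles_induced = 6.4e12   # assume cheap autonomy doubles total driving

    print(f"human drivers: {human_rate * miles_now:,.0f} deaths/year")
    print(f"self-driving:  {robot_rate * miles_induced:,.0f} deaths/year")
    # With these numbers, the halved rate is exactly cancelled by doubled usage.
    ```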

  • Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

    The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

    Not to mention tone-deaf. Maybe you shouldn't talk about life-saving technology when your technology anti-saved a life....

    And that's ignoring the fact that they're using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

  • I don't know. If it is possible to override the autopilot, then it's a pretty good bet that putting your foot on the accelerator would do it. It's hard to imagine a scenario where that wouldn't put the car into manual mode. Surely it would be more dangerous if you couldn't override the autopilot.

    We can bet on a lot, but when you're betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.

  • That's a tough one. Yeah, they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Expecting the car to protect him from his own mistake isn't something an autopilot does.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

    Yeah, the problem is that the US has weak consumer protections, and this court is somehow trying to make up for that; but it shouldn't happen in cases like this, where the driver was clearly not fit to drive a car.

  • Or automotive vision.

    Thank you. I seriously didn't understand what the field was.

  • And that is the point: Tesla's "AI" performs nowhere near human levels. Actual full self-driving is measured on the SAE scale of levels 0 through 5, and Tesla's system sits at around level 2 (the levels are sketched below).

    Tesla has claimed to have full self-driving for about a decade now, and it has been and continues to be a complete lie. Musk claimed long ago that a Tesla could drive autonomously from LA to NY, while in reality it has trouble leaving the first parking lot.

    I'm unsure how much has changed there, but since Elmo Musk spends more time lying about everything than actually improving his products, I would not hold my breath.
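
    For reference, here's a minimal sketch of the SAE J3016 scale being referred to (the class and comments are mine; the level numbering is the standard's):

    ```python
    from enum import IntEnum

    class SAEAutomationLevel(IntEnum):
        """SAE J3016 driving-automation levels."""
        NO_AUTOMATION = 0           # human does everything
        DRIVER_ASSISTANCE = 1       # steering OR speed assist (e.g. basic cruise control)
        PARTIAL_AUTOMATION = 2      # steering AND speed assist; driver must supervise
        CONDITIONAL_AUTOMATION = 3  # self-drives in limited conditions; driver on standby
        HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
        FULL_AUTOMATION = 5         # no driver needed anywhere

    # Tesla's Autopilot/FSD is generally classed as level 2 of this scale.
    assert SAEAutomationLevel.PARTIAL_AUTOMATION == 2
    ```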

    The original comment is perpetuating the lie, intentional or not. It relies on fundamentally flawed soundbites that are precisely crafted as propaganda, not to be informative or truthful at all.

    Right off the bat they're saying "in principle", which presumes the baseline lie that "full self-driving" has been achieved. Then they strengthen their argument by reinforcing the idea that it's functionally equivalent to humans (i.e. generalized intelligence). Then they cap it off with "no known flaw". Pure lies.

    Of course, they've hedged by implying it's an opinion, while strongly suggesting it's the most correct one anyway.

    I'm unsure how much has changed

    This demonstrates exactly how effective the propaganda is. They set up scenarios where nobody honest will refute their bullshit with certainty, even though we know that no existing system is on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or other. But that's fundamentally not what they are telling the lay audience. They're lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. That is not true. Period. No existing system does this, and there will not be one in the foreseeable future.

    The fact of the matter is that discussion of this technology is more about this kind of propaganda than about the technology itself. If that weren't the case, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. The spin leads to uncertainty and confusion, which leads to preventable deaths.

  • This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on the accelerator overrides any braking; the car even tells you it won't brake while you're doing it. That's how it should be: the driver should always be able to override these things in an emergency (a rough sketch of that precedence follows below).

    Maybe if he hadn't done that (edit: held the accelerator down), the verdict would stick.
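
    A minimal sketch of the pedal-precedence policy described above; all names and thresholds are hypothetical, not Tesla's actual code:

    ```python
    def warn_driver(message: str) -> None:
        print(f"[DASH WARNING] {message}")  # stand-in for a real dashboard alert

    def plan_braking(autopilot_wants_brake: bool, accel_pedal: float) -> bool:
        """Return True if automatic braking should be applied.

        accel_pedal: accelerator position, 0.0 (released) to 1.0 (floored).
        Policy from the comment: a pressed accelerator suppresses automatic
        braking, and the driver is warned about it.
        """
        ACCEL_PRESSED = 0.05  # hypothetical dead-zone threshold
        if accel_pedal > ACCEL_PRESSED:
            warn_driver("Accelerator pressed: automatic braking suppressed")
            return False      # driver input wins
        return autopilot_wants_brake

    print(plan_braking(autopilot_wants_brake=True, accel_pedal=0.4))  # -> False
    ```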

    While Tesla said that McGee was solely responsible as the driver of the car, McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake," a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint an impression of a brand that is much safer than it is in reality.

    Here’s the thing: Tesla’s marketing of Autopilot was very different from the reality. Sure, the fine print might have said that having your foot on the gas would shut down Autopilot, but the marketing made Autopilot sound much more powerful. This guy put his trust in how the vehicle was marketed, and somebody died as a result.

    My car, for instance, does not have self-driving, but it will still brake if it detects I am going to hit something, even when my foot is on the gas. It is not unreasonable to think a car marketed the way Tesla was marketed would have similar features (the contrast is sketched below).

    Lastly, Tesla’s valuation as a company was based on this same marketing, not the fine print. So not only did the marketing put people in danger, but Tesla profited massively from it. They should be held responsible for this.
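
    And a contrasting sketch of the automatic-emergency-braking behavior described above, where a detected imminent collision outranks the accelerator (again hypothetical names, not any manufacturer's real code):

    ```python
    def plan_braking_aeb(collision_imminent: bool,
                         autopilot_wants_brake: bool,
                         accel_pedal: float) -> bool:
        """AEB-style policy: an imminent collision always triggers braking,
        even with the driver's foot on the accelerator."""
        if collision_imminent:
            return True                 # safety system outranks driver input
        if accel_pedal > 0.05:          # hypothetical dead-zone threshold
            return False                # otherwise driver input wins
        return autopilot_wants_brake
    ```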

  • There's actually a backfire effect here. It could make companies too cautious in rolling out self driving. The status quo is people driving poorly. If you delay the roll out of self driving beyond the point when it's better than people, then more people will die.

    The status quo is people driving poorly.

    It's not people driving poorly so much as it is horrible city planning, poor traffic design and, perhaps most importantly, not requiring people to be educated enough before receiving a driver's license.

    This is an issue seen almost exclusively in underdeveloped countries. In Europe, road accidents are comparatively rare. Nobody here even considers self-driving cars a solution to anything, because there's nothing to solve.

    This is nothing but Tesla (et al.) selling a "solution" to an artificially created problem; it will not solve anything, merely address the symptoms.

  • No, the issue still remains: who's actually responsible? With human drivers we always have someone to take the blame, but with robots? Who's at fault when a self-driving car kills someone? The passenger? Tesla? Someone has to be sued, and it'll be Tesla; so even if it's 1% of total accidents, the legal institutions will be overwhelmed, because the issue is 1000% harder to resolve.

    Once Tesla starts losing multiple $300M lawsuits, the floodgates will open and the company is absolutely done.

    That is an issue.

    I just realized that I didn't finish the thought. Once self-driving is statistically safer, we will ban human drivers. In some places it will be by law, in some by the more subtle pressure of insurance costs, in some by something else.

    We need to figure out liability, of course. I have ideas, but nobody will listen, so there's no point in writing them down.

  • That's a tough one. Yeah, they sell it as autopilot. But anyone seeing a steering wheel and pedals should reasonably assume that they are there to override the autopilot. Expecting the car to protect him from his own mistake isn't something an autopilot does.
    Tesla has done plenty wrong, but this case isn't much of an example of that.

    There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed their branding or wording, which is a lot of the problem here.

    I can't say for sure whether they are responsible in this case, because I don't know what the driver assumed. But if they assumed that the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove they knew those features could be overridden, then I'm not sure the court is wrong here. The fact that they haven't changed their wording or branding of Autopilot (particularly calling it that) is kind of damning.

    In planes, autopilot maintains speed, altitude, and heading or flight path. But the average person doesn't know or understand that; Tesla has been trading on the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars warn you about what their "assisted driving" systems actually do, and those warnings pop up every time you engage them, before you can change any settings. But those other manufacturers also don't claim the car can drive itself.
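
    A minimal sketch of what "maintains speed, altitude, and heading" amounts to: simple proportional hold loops, with no notion of obstacles (gains and names are made up for illustration):

    ```python
    def hold(setpoint: float, measured: float, gain: float) -> float:
        """Proportional controller: correction pushing toward the setpoint."""
        return gain * (setpoint - measured)

    def autopilot_step(target: dict, state: dict) -> dict:
        """One control tick of a toy aircraft autopilot.

        Note what is absent: nothing here looks outside the aircraft,
        so it will happily hold heading straight into a mountain.
        """
        return {
            "throttle": hold(target["speed"],    state["speed"],    gain=0.10),
            "pitch":    hold(target["altitude"], state["altitude"], gain=0.01),
            "roll":     hold(target["heading"],  state["heading"],  gain=0.50),
        }
    ```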

  • I’ve never had one that turns it off if I accelerate.

    They’ve all shut off if I tapped the brakes though.

    What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether we call that stopped or just paused, but it is certainly not ignoring your acceleration.

  • That's not a gut feeling. That's how every cruise control since it was invented in the 70s has worked. You press the brake or the accelerator? Cruise control (and autopilot) = off.

    That's not a gut feeling; that's what's stated in the manual.

    No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up, but it returns to the cruise-control set speed when you release the accelerator. And further, Tesla doesn't call it cruise control, and Tesla's CEO has been pretty heavily misleading about what the system is and what it does. So.
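
    A minimal sketch of the behavior described above as a three-state machine; this reflects the common design in many cars, not any specific manufacturer's implementation:

    ```python
    from enum import Enum, auto

    class CruiseState(Enum):
        OFF = auto()         # fully disengaged
        ACTIVE = auto()      # holding the set speed
        OVERRIDDEN = auto()  # driver accelerating past the set speed

    def step(state: CruiseState, brake: bool, accel: bool) -> CruiseState:
        """Brake cancels cruise control; the accelerator only pauses it."""
        if state is CruiseState.OFF:
            return state
        if brake:
            return CruiseState.OFF         # braking fully disengages
        if accel:
            return CruiseState.OVERRIDDEN  # temporarily exceed the set speed
        return CruiseState.ACTIVE          # pedal released: resume set speed

    assert step(CruiseState.OVERRIDDEN, brake=False, accel=False) is CruiseState.ACTIVE
    ```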

  • The problem is how Musk and Tesla have sold their self-driving and full self-driving, and whatever name they call the next one.

    There should be a class-action lawsuit by Tesla owners, with damages in the tens of billions rather than millions, tbh. I'm just saying that this particular case can't be seen as Tesla's fault by anyone being objective.

  • So you're comparing, let's say, 2020 technology to the 1915 version of autopilot and not the kind from the 2020s that is much more advanced. Yeah, what BS.

    Because it still basically does what they said. For the purposes of this conversation, the only major addition to autopilot beyond maintaining speed, heading, and altitude is the ability to set a GPS heading and waypoints. It will absolutely still fly into a mountain if not for other collision-avoidance systems. Your average 737 or A320 is not going to spontaneously change course just because the elevation of the ground below it changed. But you can program other systems in the plane to avoid a specific flight path because there is a known hazard. I want you to understand that we know the mountain is there; mountains don't move around much in short periods of time. Cars and pedestrians are another story entirely.

    There's a reason we still have air traffic controllers, and even then pilots and controllers aren't infallible; and they have far more systems to make flying safe than the average car does (yes, even the average Tesla).
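
    To make the GPS-waypoint point concrete, here's a toy waypoint follower; its only inputs are coordinates, so terrain never enters the calculation (purely illustrative, not avionics code):

    ```python
    import math

    def bearing_to(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
        """Initial great-circle bearing in degrees from point 1 to point 2."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360

    def next_heading(position: tuple, waypoints: list) -> float | None:
        """Head toward the next waypoint. Terrain, traffic, and weather
        are simply not inputs to this computation."""
        if not waypoints:
            return None
        lat, lon = position
        wp_lat, wp_lon = waypoints[0]
        return bearing_to(lat, lon, wp_lat, wp_lon)
    ```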

  • What happens when you hit the gas while in cruise control? In all the cars I have driven, you go faster than the set speed and the car responds to your pedal movements. I guess we can debate whether we call that stopped or just paused, but it is certainly not ignoring your acceleration.

    Well, yeah, you can call it “paused” if you want to. The cruise control definitely stays on, though, and resumes the set speed when you stop accelerating. It completely disengages when you brake, so I’ve never thought of it as turning off when I accelerate, only when braking.

  • No. Press the brake and it turns off. Press the accelerator in lots of cars and it will speed up, but it returns to the cruise-control set speed when you release the accelerator. And further, Tesla doesn't call it cruise control, and Tesla's CEO has been pretty heavily misleading about what the system is and what it does. So.

    Yeah, sure.

    You sound like one of those people who are the reason why we find the following warning on microwave ovens:

    WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.

    And on plastic bags:

    WARNING: DO NOT PLACE OVER HEAD.

    We both know that this is not what it's for. And it (the Model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.

    (Adaptive cruise control with lane assist and collision detection.) Cruise control/autopilot on, foot on the accelerator, no eyes on the road, no hands on the steering wheel. That's malice. There were visible, audible, and even tactile warnings, which this guy ignored.

    No current-day vehicle (or one from 2019) says in its manual that this is intended use. As a matter of fact, all of them warn you not to do that.

    And I get that you hate Tesla/Musk, don't we all. But in this case only one person is responsible: the asshole driving it.

  • Yeah, sure.

    You sound like one of those people who are the reason why we find the following warning on microwave ovens:

    WARNING: DO NOT TRY TO DRY PETS IN THIS DEVICE.

    And on plastic bags:

    WARNING: DO NOT PLACE OVER HEAD.

    We both know that this is not what it's for. And it (the Model S) has never been cleared ANYWHERE ON THIS GLOBE as an autonomous vehicle.

    (Adaptive cruise control with lane assist and collision detection.) Cruise control/autopilot on, foot on the accelerator, no eyes on the road, no hands on the steering wheel. That's malice. There were visible, audible, and even tactile warnings, which this guy ignored.

    No current-day vehicle (or one from 2019) says in its manual that this is intended use. As a matter of fact, all of them warn you not to do that.

    And I get that you hate Tesla/Musk, don't we all. But in this case only one person is responsible: the asshole driving it.

    Nope. I'm correcting you because apparently most people don't even know how their cruise control works. But feel however you feel.

  • That's not a gut feeling. That's how every cruise control since it was invented in the 70s has worked. You press the brake or the accelerator? Cruise control (and autopilot) = off.

    That's not a gut feeling; that's what's stated in the manual.

    That's not how cruise control works, and I have never seen cruise control marketed in a way that would make anyone believe it was smart enough to stop a car crash.

  • Farquaad said this, not Brannigan iirc

    I'm pretty sure it was both.

  • Look, we've only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s
