Tesla loses Autopilot wrongful death case in $329 million verdict
-
So if this guy had killed an entire family but survived the accident instead, would the judge blame fucking Tesla Autopilot and let him go free?
I might as well sue the Catholic Church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard! And the details of the accident have him accelerating and turning while on Autopilot. Not even today does any car have a fully autonomous driving system that works on all cities and roads, and this was in 2019.
Did Elon fuck the judge's wife and his entire family leave him for it? WTF is $330 million for a wrongful death crash anyway?
Your post reeks of ignorance.
I don't think you have the capability to understand what's going on.
-
I'm kinda torn on this - in principle, not this specific case. If your AI performs on par with an average human and no known flaw is at fault, I don't think you should be at fault either.
If the company is penalized for being at fault, then they will have an incentive to do better in the future.
I don't even give a flying fuck about how autopilot compares to the average driver. Tesla has the resources to make its technology better, so we as customers should all hold them to the highest possible standard. Anything less just outs you as a useful idiot; you're willing to accept less so someone richer than you can have more.
-
How about not fucking claiming it's FSD, just having ACC and lane keep, and then collecting data and training on that? Also, test on a closed circuit.
Autopilot is ACC, which is what the case was about here.
-
You are defending Tesla and being disingenuous about it.
The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real-world conditions in order to not kill people.
You sound like a psychopath.
The hyperbole is ridiculous here and it makes you sound like a psychopath.
-
We can bet on a lot, but when you're betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.
I don't know what you're trying to say.
Do you think it shouldn't be possible to override autopilot?
-
To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.
I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.
What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.
I agree. I hate auto braking features. I'm not a fan of cruise control. I very much dislike adaptive cruise control, lane keeping assist, reverse braking, driving assist, and one-pedal mode. I drive a stick shift car from the early 2000s for this reason. Just enough tech to be useful. Not enough tech to get in the way of me being in control of the car.
But there are definitely some cruise controls out there, even from before all the stuff with sensors and such hit the market, that don't work the way lots of people in this thread seem to think. Braking absolutely will cancel the set cruise control, but it doesn't turn it off. Accelerating in some cars also doesn't cancel the cruise control; it lets you override it to accelerate, but the car will go back to the set cruise speed when you take your foot off the accelerator.
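To make that concrete, here's a rough sketch of that override logic in Python. The class and method names are mine and real implementations vary by manufacturer; this is just the behavior I described, written out:

```python
# Rough sketch of conventional cruise control override behavior.
# Names and structure are illustrative, not any vehicle's firmware.

class CruiseControl:
    def __init__(self):
        self.set_speed = None   # remembered speed, e.g. mph
        self.engaged = False    # actively holding the set speed?

    def set(self, speed):
        self.set_speed = speed
        self.engaged = True

    def brake_pressed(self):
        # Braking cancels the active hold, but the system stays on
        # and remembers the speed, so "resume" can re-engage it.
        self.engaged = False

    def resume(self):
        if self.set_speed is not None:
            self.engaged = True

    def commanded_speed(self, accelerator_pedal):
        # Pressing the accelerator doesn't cancel anything: the driver
        # temporarily overrides, and the car drops back to the set
        # speed once the pedal is released.
        if not self.engaged:
            return None          # cancelled: driver has full control
        if accelerator_pedal > 0:
            return None          # temporary driver override
        return self.set_speed

cc = CruiseControl()
cc.set(65)
assert cc.commanded_speed(accelerator_pedal=0.3) is None  # overriding
assert cc.commanded_speed(accelerator_pedal=0.0) == 65    # back to set speed
cc.brake_pressed()
assert cc.commanded_speed(accelerator_pedal=0.0) is None  # cancelled, not off
cc.resume()
assert cc.commanded_speed(accelerator_pedal=0.0) == 65    # resumed
```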
I absolutely recognize that not being able to override the controls has a significant potential to be deadly. All I'm saying is there's lots of drivers who probably shouldn't be on the road who these tools are designed for and they don't understand even the basics of how they work. They think the stuff is a cool gimmick. It makes them overconfident. And when you couple that with the outright lies that Musk has spewed continuously about these products and features, you should be able to see just why Tesla should be held accountable when the public trusts the company's claims and people die or get seriously injured as a result.
I've driven a lot of vehicles with features I absolutely hated. Ones that took agency away from the driver in ways I felt were extremely dangerous. On the other hand, I have had people just merge into me like I wasn't there. On several occasions. It happens to me at least every month or so. I've had people almost hit me from behind because they were driving distracted. I've literally watched people back into their own fences. Watched people wreck because they lost control of their vehicle or weren't paying attention. Supposedly these "features" are meant to prevent or mitigate the risks of that. And people believe they are more capable of mitigating that risk than they are, due to marketing and outright ridiculous claims from tech enthusiasts who promote these brands.
If I know anything, I know that you can't necessarily make people read the warning label. And it becomes harder to override what they believe if you lied to them first and then tried to tell them the truth later.
-
https://en.wikipedia.org/wiki/Therac-25#Radiation_overexposure_incidents Same thing, over and over again.
Technicians entering data too fast caused error 54. Come on... their software was running bad code to check its own form fields. This is like letting a web form cut off your arm.
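If you want to see the shape of that class of bug, here's a toy in Python. To be clear, the real Therac-25 software was PDP-11 assembly and this is not its code; it's just an illustration of a safety check racing a fast operator edit, with invented names:

```python
# Toy illustration of the kind of race condition described in the
# Therac-25 reports: a fast operator edit lands while the safety check
# is still running, so the "checked" flag ends up guarding stale data.
# All names here are invented for illustration.

import threading
import time

params = {"mode": "xray", "checked": False}

def safety_check():
    snapshot = params["mode"]     # check starts against this value
    time.sleep(0.05)              # simulated verification latency
    params["checked"] = True      # flag set, but only for the snapshot
    print(f"check passed for mode={snapshot!r}")

t = threading.Thread(target=safety_check)
t.start()
params["mode"] = "electron"       # fast edit slips in mid-check
t.join()

if params["checked"]:
    # The system proceeds with a mode the check never looked at.
    print(f"proceeding with mode={params['mode']!r} (never re-checked)")
```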
Scary.
-
There are other cars on the market with technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.
I can't say for sure whether they are responsible in this case, because I don't know what the person driving actually assumed. But if they assumed that the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove the driver knew his inputs would override such features, then I'm not sure the court is wrong in this case. The fact that Tesla hasn't changed its wording or branding of Autopilot (particularly calling it that) is kind of damning here.
Autopilot maintains speed, altitude (added in an edit), and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been using the pop culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can change any settings, etc. But those other car manufacturers also don't claim the car can drive itself.
Just a small correction - traditional cruise control in cars only maintains speed, whereas autopilot in planes does maintain speed, altitude, and heading, which is exactly why Tesla calling their system "Autopilot" is such dangerous marketing that creates unrealistic expectations for drivers.
-
I think that's a bad idea, both legally and ethically. Vehicles cause tens of thousands of deaths - not to mention injuries - per year in North America. You're proposing that a company that can meet that standard is absolved of liability? Meet, not improve.
In that case, you've given these companies license to literally make money off of removing responsibility for those deaths. The driver's not responsible, and neither is the company. That seems pretty terrible to me, and I'm sure to the loved ones of anyone who has been killed in a vehicle collision.
Yeah, but you can just set targets and penalize companies for missing them - number of accidents per year, for example. Even assuming autonomous vehicles only ever become as good as the average driver, that already means a substantial improvement over where things are at. For me, that's the point where I'd start to phase out manually operated vehicles. I believe they will get significantly better than that eventually.
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."
So, you admit that the company’s marketing has continued to lie for the past six years?
Good that the car manufacturer is also being held accountable.
But...
In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed, and her partner Dillon Angulo was left with a severe head injury.
That's on him. 100%
McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."
Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!
-
Autopilot is ACC which is what the case was about here.
If they called fucking ACC "autopilot," they deserve to rot in hell. What the actual fuck.
That is such misleading naming.
-
Good that the car manufacturer is also being held accountable.
But...
In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed, and her partner Dillon Angulo was left with a severe head injury.
That's on him. 100%
McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."
Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!
I don't disagree, but I believe the suit was over how Tesla misrepresented assistive technology as fully autonomous, as the name Autopilot implies.
-
Just a small correction - traditional cruise control in cars only maintains speed, whereas autopilot in planes does maintain speed, altitude, and heading, which is exactly why Tesla calling their system "Autopilot" is such dangerous marketing that creates unrealistic expectations for drivers.
I'm not sure what you're correcting. The Autopilot feature has adaptive cruise control, lane keeping assist, and auto steering.
Adaptive cruise control will brake to maintain a distance from the vehicle in front of it but otherwise maintain the set speed; lane keeping assist will keep the vehicle in its lane and prevent it from drifting, and combined with auto steering it will keep the car centered in the lane.
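A crude sketch of that adaptive part, since people conflate it with self-driving. The time gap and numbers here are made-up illustration, not Tesla's actual control law:

```python
# Simplified sketch of adaptive cruise control target selection:
# hold the driver's set speed unless a lead vehicle forces a lower
# speed to keep a time gap. Numbers are illustrative only.

def acc_target_speed(set_speed, lead_distance, lead_speed, time_gap=2.0):
    """Return the speed (m/s) the controller should aim for this cycle."""
    if lead_distance is None:
        return set_speed  # no vehicle ahead: plain cruise control
    # Speed that closes toward a distance of time_gap seconds behind
    # the lead vehicle, dropping below its speed while we're too close.
    gap_limited = lead_speed + (lead_distance - lead_speed * time_gap) / time_gap
    return min(set_speed, max(0.0, gap_limited))

print(acc_target_speed(30.0, None, 0.0))   # 30.0: open road, hold set speed
print(acc_target_speed(30.0, 40.0, 25.0))  # 20.0: too close, slow below lead
print(acc_target_speed(30.0, 90.0, 25.0))  # 30.0: big gap, hold set speed
```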
I specifically explained that a plane's autopilot does those things (maintain speed, altitude, and heading), and that people don't know this is all it does. It doesn't by itself avoid obstacles or account for weather, etc. It'd fly right into another plane if one was occupying that airspace. It won't react to weather events like wind shear (which can cause the plane to lose altitude extremely quickly), or a hurricane. If there's an engine problem and an engine loses power? It won't attempt a restart. It doesn't brake. It can't land a plane.
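Put another way, the core of what I'm describing fits in a few lines: three hold loops and nothing else. This is a deliberately crude sketch with invented gains, not real avionics code:

```python
# Crude model of an aircraft autopilot as three independent hold loops
# (speed, altitude, heading). Note what is absent: no traffic inputs,
# no weather response, no engine management, no braking or landing.
# Gains and values are invented for illustration.

def hold(target, current, gain=0.1):
    """Simple proportional correction toward a target value."""
    return gain * (target - current)

def autopilot_step(state, targets):
    # The entire "intelligence": three scalar corrections.
    return {
        "throttle_adj": hold(targets["speed"],    state["speed"]),
        "pitch_adj":    hold(targets["altitude"], state["altitude"]),
        "heading_adj":  hold(targets["heading"],  state["heading"]),
    }

print(autopilot_step(
    {"speed": 230.0, "altitude": 9800.0, "heading": 85.0},
    {"speed": 240.0, "altitude": 10000.0, "heading": 90.0},
))
```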
But Musk made claims that Tesla's Autopilot would drive the vehicle for you without human intervention. And people assume that autopilot (in the pop culture sense) does a lot more than it actually does. This is what I'm trying to point out.
-
The original comment is perpetuating the lie, intentional or not. It relies on fundamentally flawed soundbites that are precisely crafted as propaganda, not to be informative or truthful at all.
Right off the bat they say "in principle," which presumes the baseline lie that "full self driving" has been achieved. Then they strengthen their argument by reinforcing the idea that it's functionally equivalent to a human (i.e., generalized intelligence). Then they cap it off with "no known flaw." Pure lies.
Of course they've hedged by framing it as opinion, while strongly suggesting it's the most correct one anyway.
I'm unsure of how much has changed.
This demonstrates exactly how effective the propaganda is. They set up scenarios where nobody honest will refute their bullshit with certainty, even though we know there is no existing system on par with human drivers. Sure, they can massage data to say that under certain conditions an automated driving system performed similarly by some metric or whatever. But that's fundamentally not what they are telling the lay audience. They're lying in order to lead the average person to believe they can trust their car to drive them as if they were a passenger and another human were behind the wheel. This is not true. Period. There is no existing system that does this, and there will not be in the foreseeable future.
The meta here is that technological discussion is more about this kind of propaganda than about the technology itself. If that weren't the case, more people would be hearing about the actual technology and its real limitations, not all the spin-doctoring. That leads to uncertainty and confusion, which leads to preventable deaths.
I think the original commenter simply doesn't know how wrong he is.
-
i dont disagree; but i believe the suit was over how tesla misrepresented assistive technology as fully autonomous as the name autopilot implies
Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is just engaging autopilot.
I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.
-
Good that the car manufacturer is also being held accountable.
But...
In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed, and her partner Dillon Angulo was left with a severe head injury.
That's on him. 100%
McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."
Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!
Well, if only Tesla hadn't invested tens of millions into marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers don't read the fine print, ever. They buy and use shit off of vibes. False marketing can and does kill.
-
Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is just engaging autopilot.
I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.
Agree with you here. Your point reminds me of the case below.
The TL;DR is that the pilots were using their laptops to look at schedules, IIRC, and overflew their destination. It's long been speculated that they were watching a movie.
I wonder if a lawyer will ever try to apply this as precedent against Boeing or similar...
Part of the reason that air travel is as safe as it is is that governments held both airlines and manufacturers accountable for plane crashes and other air travel incidents, especially those leading to death or expensive property damage. You have to have significant training and flight hours to be a commercial pilot.
In the cases where Boeing has been found (through significant investigation) to be liable for death or injury, it has been held accountable. That's literally why the 737 MAX 8s were grounded worldwide, and literally why Boeing was forced to add further safety measures after the door plug failure, which was due to its negligence as a manufacturer.
Systemic failures led to a door plug flying off a Boeing 737 Max, NTSB says
The National Transportation Safety Board chairwoman says heroic actions by the crew aboard an Alaska Airlines flight ensured everyone survived last year when the door plug panel blew out of the plane.
AP News (apnews.com)
What Boeing’s Door-Plug Debacle Says About the Future of Aviation Safety
For a flight to be imperiled by a simple and preventable manufacturing or maintenance error is an anomaly with ominous implications.
The MIT Press Reader (thereader.mitpress.mit.edu)
Boeing Charged with 737 Max Fraud Conspiracy and Agrees to Pay over $2.5 Billion
(www.justice.gov)
-
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."
So, you admit that the company’s marketing has continued to lie for the past six years?
Life-saving technology? BS. Their Autopilot is half-assed.
-
Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is just engaging autopilot.
I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.
Here's my problem with all of the automation the manufacturers are adding to cars. It doesn't even take Autopilot-level stuff to be a potential problem - things like adaptive cruise come to mind.
If there's some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:
But the responsibility for safe driving is on the driver...
And how do we know there isn't some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems...ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can't fucking disable?
But the responsibility for safe driving is on the driver...
In other words, "We bear no responsibility!" So I have to pay for these "features," and the manufacturer will deny any responsibility if one of them fails and causes a crash. It's always your fault as the driver, no matter what. The companies roll this shit out to us; we no longer have the option to buy a new car without it, and they don't even trust it enough to stand behind it.
Maybe you'll get lucky and enough issues will happen that gov't regulators will look into it (not in the US anymore, of course)... but probably not. You'll be blamed, you'll pay higher insurance, and that will be that.
So now I have to worry not only about other drivers and my own driving, but I also have to be alert that the car will do something unexpected as well. Which has happened, when all this "smart" technology has misunderstood a situation, like slamming on the brakes for a car in another lane. I've found I hate having to fight my own car.
Obviously, I very much dislike driving our newer car. It's primarily my wife's car, and I only drive it once or twice a week, fortunately.