Skip to content

Tesla loses Autopilot wrongful death case in $329 million verdict

Technology
  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake,"

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    You're missing the point. Calling a driving assistant an autopilot is the problem here. If a company is allowed to sell a vehicle and it says it has autopilot, it would be reasonable to assume that it does and that it works properly - unless you have a basic understanding of how car "autopilots" are implemented, which is more than you can expect from a layman.

  • Have you even read what happened? The driver dropped his phone and wasn’t watching the road but instead was rummaging around on the ground looking for his phone, while having his foot on the accelerator manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.

    That text you italicized so proudly, is what Tesla CLAIMS happened. Did you know Tesla repeatedly told the court that they did not have the video and data that had been captured seconds before the crash, until a forensics expert hired by the PLAINTIFFS found the data, showing Tesla had it the entire time?

    Gee, why would Tesla try to hide that data if it showed the driver engaged the accelerator? Why did the plaintiffs have to go to extreme efforts to get that data?

    A jury of 12 saw that evidence, you didn't, but you believe Elon the habitual liar so hey, keep on glazin'.

  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake,"

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    I dig blaming the people who wind up believing deceptive marketing practices, instead of blaming the people doing the deceiving.

    Look up the dictionary definition of autopilot: a mechanical, electrical or hydraulic system used to guide a vehicle without assistance from a human being. FULL SELF DRIVING, yeah, why would that wording lead people to believe the car was, you know, fully self-driving?

    Combine that with year after year of Elon Musk constantly stating in public that the car either already drives itself, or will be capable of doing so just around the corner, by the end of next year, over and over and over and

    Elon lied constantly to keep the stock price up, and people have died for believing those lies.

  • Have you even read what happened? The driver dropped his phone and wasn’t watching the road but instead was rummaging around on the ground looking for his phone, while having his foot on the accelerator manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.

    You believe everything Elon says.

  • I will repeat, regardless of what the (erroneous) claims are by Tesla, a driver is still responsible.

    This is like those automated bill payment systems. Sure, they are automated, and the company promotes it as "easy" and "convenient", but you're still responsible if those bills don't get paid for whatever reason.

    From another report:

    While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

    Isn't using a phone while being the driver of a vehicle illegal? And what the hell was up with highway speeds near an intersection??? This dude can blame autopilot, but goddamn, he was completely negligent. It's like there were two idiots driving the same vehicle that day.

    Yes, of course the driver is at fault for being an idiot. And sadly, a shitton of drivers are idiots. Ignoring this fact is practically ignoring reality. You shouldn't be allowed to do false marketing as a company exactly because idiots will fall for it.

  • There’s no way this decision stands, it’s absolutely absurd. The guy dropped his phone and was looking down reaching around looking for it when he crashed. He wasn’t supervising autopilot, like you are required to.

    Dude, slow down, if you keep glazing Elon this hard, it's gonna start getting frothy.

    I guess the lesson is, if your car doesn't provide a system that can be used to guide the vehicle WITHOUT ASSISTANCE FROM A HUMAN BEING, then don't be an idiot and call it "AUTOPILOT"

  • Whether or not it's the guy's fault, I'm just glad Elon is losing money.

    Hope he has to sell twatter at some point. Not that any good would come from that, but just the thought of him finally eating some shit makes me giggle.

  • That text you italicized so proudly, is what Tesla CLAIMS happened. Did you know Tesla repeatedly told the court that they did not have the video and data that had been captured seconds before the crash, until a forensics expert hired by the PLAINTIFFS found the data, showing Tesla had it the entire time?

    Gee, why would Tesla try to hide that data if it showed the driver engaged the accelerator? Why did the plaintiffs have to go to extreme efforts to get that data?

    A jury of 12 saw that evidence, you didn't, but you believe Elon the habitual liar so hey, keep on glazin'.

    Please read the article. I hate when people upvote bullshit just because it says things they like to hear. I dislike Elon Musk as much as anyone else, but the jury's findings were this:

    • The driver is ⅔ responsible for the crash because of his negligent driving.
    • The fact that the driver did in fact keep his foot on the accelerator was accepted by the jury.
    • The jury accepted that the driver was reaching for his cell phone at the time of the crash.
    • Evidence in court showed that the speed of the car was about 100 km/h. Keep in mind that this incident occurred in the Florida Keys where there are no high-speed expressways. I couldn't find info on where exactly this happened, but the main road in the area is US Route 1, which close to the mainland is a large four-lane road with occasional intersections, but narrows into a two-lane road for most of the distance.
    • The jury found Tesla ⅓ liable because it deemed that Tesla had sold a defective product. For international readers: in the US, a company that sells a product that is defective during normal use is strictly liable for resulting damages. (There's a quick worked sketch of how such a fault split apportions damages right after this list.)
    • Obviously Tesla plans to appeal but it is normal for everyone to appeal in these sorts of cases. Many appeals get shot down by the appellate court.
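
    For readers wondering how a fault split like that turns into dollars, here's a minimal sketch of comparative-fault apportionment. The dollar figure and the `apportion` helper are hypothetical, purely to illustrate a ⅔ / ⅓ split; they are not the actual figures from this verdict, and headline verdict totals can also include punitive damages, which are assessed separately rather than apportioned this way.

    ```python
    # Hypothetical illustration of comparative-fault apportionment.
    # The $90M compensatory figure below is made up; it is NOT the award
    # in the Tesla case, just a round number to show the arithmetic.

    def apportion(compensatory, fault_shares):
        """Split a compensatory award among parties in proportion to their share of fault."""
        assert abs(sum(fault_shares.values()) - 1.0) < 1e-9, "fault shares must sum to 1"
        return {party: round(compensatory * share, 2) for party, share in fault_shares.items()}

    print(apportion(90_000_000, {"driver": 2 / 3, "manufacturer": 1 / 3}))
    # -> {'driver': 60000000.0, 'manufacturer': 30000000.0}
    ```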
  • Good that the car manufacturer is also being held accountable.

    But...

    In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

    That's on him. 100%

    McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake,"

    Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

    It is assistive technology, but that is not how tesla has been marketing it. They even sell a product called full self driving, while it's not that at all.

  • Whether or not it's the guy's fault, I'm just glad Elon is losing money.

    Unfortunately, for companies like this, that would be just another business expense to keep things running.

  • Unfortunately, for companies like this, that would be just another business expense to keep things running.

    $329mm is a little more than a standard cost of doing business fine. That's substantially more than 80% of these companies get fined for causing huge amounts of damage.

  • This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.

    Maybe if he hadn't done that (edit: held the accelerator down) it'd stick.

    I think the bigger issue is that Tesla might be diminishing drivers' sense of their responsibility for the vehicle with its marketing/presentation of Autopilot.

    I say that knowing very little about what it's like to use Autopilot, but if there are changes that could be made that would result in fewer deaths, then maybe the guy's lawyer has a point.

  • Absolutely. I hope he and the company burn in hell, but I do not want to start giving drivers who kill people a free pass to say "well, it was the car's fault!"

    "Autopilot", especially in Tesla cars, is beta software at best, and this feature should never have been allowed to be used on public roads. In that sense, the transportation ministry that's allowed it also has blood on their hands.

    Woo, both parties are terrible, irresponsible, and should be held accountable

  • It's not that simple. Imagine you're dying of a rare terminal disease. A pharma company is developing a new drug for it. Obviously you want it. But they tell you you can't have it because "we're not releasing it until we know it's good".

    This is, or was (thanks RFK for handing the industry a blank check), how pharma development works. You don't even get to do human trials until you're pretty damn sure it's not going to kill anyone. "Experimental medicine" stuff you read about is still medicine that's been in development for YEARS, and gone through animal, cellular, and various other trials.

  • Hope he has to sell twatter at some point. Not that any good would come from that, but just the thought of him finally eating some shit makes me giggle.

    Indeed, just the feeling of loss crossing his path would taste sweet for us peasants.

  • Which they have not and won't do. You have to do this in every condition. I wonder why they always test this shit out in Texas and California?

    I guess they just didn't want to admit that snow defeats both lidar and vision cameras. Plus the fact that snow covers lane markers, Street signs, and car sensors. People can adjust to these conditions, especially when driving locally. No self driving system can function without input.

  • Have you even read what happened? The driver dropped his phone and wasn’t watching the road but instead was rummaging around on the ground looking for his phone, while having his foot on the accelerator manually accelerating. Autopilot was supposedly turned off because of the manual acceleration.

    FreeDumbAdvocate serving Elon for free.

  • I think the bigger issue is that Tesla might be diminishing drivers' sense of their responsibility for the vehicle with its marketing/presentation of Autopilot.

    I say that knowing very little about what it's like to use Autopilot, but if there are changes that could be made that would result in fewer deaths, then maybe the guy's lawyer has a point.

    You gotta remember we're also back in 2019. Most of the talk back then was about what it was going to be able to do when FSD was ready, but no one got access to it until 2020, and that was a very small invite-only group, and it lasted like that for years. I'd say the potential for confusion today is immensely greater.

    I have used AP back then, and it was good, but it clearly made lots of little mistakes, and needed constant little adjustments. If you were paying attention, they were all easy to manage and you even got to know when to expect problems and take corrective action in advance.

    My big beef with this case is that he kept his foot on the accelerator, and the car tells you while you do this that it won't brake. Having your foot on the accelerator is a common practice, as AP can be slow to start, or you need to pass someone, etc., so it's really unfathomable to think that the first time this guy ever did this was when he decided to try and pick up his dropped phone and thought, I should keep my foot on the accelerator while doing this! No amount of marketing should be able to override "Autopilot will not brake. Accelerator pedal pressed" type active warnings, with the screen pulsating some color at him. He knew about those warnings, without any doubt in my mind. He chose to ignore them. What more could you write in a small space to warn people it will not brake?

    That being said - the NHTSA has found that Tesla's monitoring system was lacking, and Tesla has had to improve on it in recent times. People would attach oranges to the steering wheel back then to defeat the nag to pay attention, but this goes well beyond that IMO. Even the current system won't immediately shut down if you decide not to pay attention for some reason; it takes time before it pulls itself over, but you might get a strike against future use that will prevent you from using it again.

    Had his foot not been on the accelerator, this would have been a very different case even if the accident had still occurred (which is also still possible).

  • This is, or was (thanks RFK for handing the industry a blank check), how pharma development works. You don't even get to do human trials until you're pretty damn sure it's not going to kill anyone. "Experimental medicine" stuff you read about is still medicine that's been in development for YEARS, and gone through animal, cellular, and various other trials.

    Actually we have "right to try" laws for the scenario I described.

    But the FDA could use some serious reform. Under the system we have, an FDA approval lumps together the determinations of whether a drug is safe, effective, and worth paying for. A more libertarian system would let people spend their own money on drugs that are safe even if the FDA's particular research didn't find them effective. And it wouldn't waste taxpayer money on drugs that are effective but exorbitantly expensive relative to their minimal effectiveness. But if a wealthy person wants to spend their own money, thereby subsidizing pharmaceuticals for the rest of us, that's great in my opinion.
