Tesla withheld data, lied, and misdirected police and plaintiffs to avoid blame in Autopilot crash

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week.

    The automaker was undeniably covering up for Autopilot.

    Last week, a jury found Tesla partially liable for a wrongful death involving a crash on Autopilot. We now have access to the trial transcripts, which confirm that Tesla was extremely misleading in its attempt to place all the blame on the driver.

    The company went as far as to actively withhold critical evidence that explained Autopilot’s performance around the crash. Within about three minutes of the crash, the Model S uploaded a “collision snapshot” (video, CAN-bus streams, EDR data, etc.) to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, leaving Tesla as the only party with access.

    What ensued were years of battle to get Tesla to acknowledge that this collision snapshot exists and is relevant to the case.

    The police repeatedly attempted to obtain the data from the collision snapshot, but Tesla led the authorities and the plaintiffs on a lengthy journey of deception and misdirection that spanned years.

    Surprised Pikachu

  • That jumped out at me too. Giving the benefit of the doubt, it could be that this “snapshot” includes a very large amount of data that could be problematic if stored locally for longer. In reality, they probably do it this way for exactly this type of situation, so they can retain full control of the potentially-damning data.

    That's not "benefit of the doubt", that's "playing devil's advocate". They probably used something like this.

  • Remember, they also defanged the regulations they were subject to.

  • That jumped out at me too. Giving the benefit of the doubt, it could be that this “snapshot” includes a very large amount of data that could be problematic if stored locally for longer. In reality, they probably do it this way for exactly this type of situation, so they can retain full control of the potentially-damning data.

    If they can transmit it, it is not a lot. It is that simple.

  • Scumbags

  • Folks. Publicly traded companies will ALWAYS weigh the expected cost of breaking the law against the cost of compliance.

    Say it costs $100 million to follow the law. Breaking it comes with a $300 million fine, but only a 20% chance of getting caught.

    They compare a 100% chance of paying $100 million to a 20% chance of paying $300 million.

    Average cost of following the law: $100 million

    Average cost of breaking it: $60 million

    If we're gonna do capitalism (which I would rather we not, for the record!), we have to make that expected value calculation break in favor of following regulations. If it is cheaper to break the law than to follow it, you're not just losing money by complying: you're giving ground to your competition. Fines need to be massive. Infractions need to get caught and punished. Executives need to be held personally accountable. Corporations need to be dissolved. Fines cannot be just the cost of doing business.

    Do people need to (re)watch Fight Club?

    Narrator:

    A new car built by my company leaves somewhere traveling at 60 mph. The rear differential locks up. The car crashes and burns with everyone trapped inside.

    Now, should we initiate a recall?

    Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X.

    If X is less than the cost of a recall, we don't do one.

    It's been like 25 years.

    Did people, like... genuinely not know this, or forget about it?
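
A rough sketch of the expected-value comparison in the comment above, using the commenter's hypothetical figures and the Fight Club formula. None of these numbers describe any real company; they only show why a fine that is both small and rarely applied loses the comparison.

```python
# Expected-value comparison, using the hypothetical figures from the comment above.

def expected_cost_of_breaking(fine: float, p_caught: float) -> float:
    """Expected cost when the only downside is a fine levied with probability p_caught."""
    return fine * p_caught

compliance_cost = 100e6   # cost of following the law
fine = 300e6              # fine if caught breaking it
p_caught = 0.20           # chance of getting caught

print(f"Follow the law: ${compliance_cost / 1e6:.0f}M")
print(f"Break the law:  ${expected_cost_of_breaking(fine, p_caught) / 1e6:.0f}M expected")
# Follow the law: $100M
# Break the law:  $60M expected

# The Fight Club recall formula is the same calculation with different labels.
# A, B, C and the recall cost below are made-up placeholders.
A = 1_000_000        # vehicles in the field
B = 0.0001           # probable rate of failure
C = 2_000_000        # average out-of-court settlement, in dollars
X = A * B * C        # expected settlement cost: $200M
recall_cost = 500e6  # $500M
print("Initiate recall?", X >= recall_cost)  # False -> "we don't do one"
```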

  • If they can transmit it, it is not a lot. It is that simple.

    Nearly all of the vehicles have 4G/5G connectivity via AT&T. This isn't a dial-up connection. They can transmit whatever the fuck they want.
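
For a sense of scale on the "they can transmit it" point: the article says the upload completed within about three minutes of the crash. A back-of-envelope with an assumed sustained cellular uplink rate (a guess for illustration, not a measured figure) shows that window fits a couple hundred megabytes even at a modest speed.

```python
# Back-of-envelope: how much data fits in a ~3 minute upload window.
# The uplink rate below is an assumption for illustration only.

assumed_uplink_mbps = 10      # assumed sustained 4G uplink, in megabits per second
upload_window_s = 3 * 60      # ~3 minutes, per the article

megabytes = assumed_uplink_mbps * upload_window_s / 8   # megabits -> megabytes
print(f"~{megabytes:.0f} MB fits in the window")        # ~225 MB
```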

  • I'd raise you on that. Companies are people now. If companies break laws, they should be held accountable personally. Even Club Fed would be misery for 14 years on a murder rap.

    I'd argue the entire C-suite should be legally responsible for anything the company does.

  • @DrunkEngineer A normal company fires its CEO and cleans house after something like that. Instead Tesla just offered him a big new compensation package to encourage him to stay and keep destroying their reputation and any shred of morality they may claim to have.

    A particular insurance company replaced a mysteriously dead executive with another who doubled down on premeditated murder.

  • Is this the one where the car crashes after Autopilot turns off? Where Tesla tries to claim that the driver floored it after it turned off?

  • Within about three minutes of the crash, the Model S uploaded a “collision snapshot” (video, CAN-bus streams, EDR data, etc.) to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, leaving Tesla as the only party with access.

    Holy fucking shit. What is the purpose of deleting the data on the vehicle other than to sabotage the owner?

    Smells like intentional destruction of evidence.
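
For readers unfamiliar with the pattern being described: the flow in the article is upload the snapshot, wait for the server's acknowledgement, then delete the local copy. A minimal sketch of that pattern is below; every name in it is invented for illustration and says nothing about how Tesla's software actually works.

```python
# Minimal sketch of an upload-then-delete-on-acknowledgement flow.
# All names here are hypothetical; this is not Tesla's implementation.

import os

def upload_collision_snapshot(path: str, server) -> None:
    """Upload a local snapshot file and delete it once the server acknowledges receipt.

    `server` stands in for whatever transport the vehicle uses: a hypothetical
    object with a send() method, not a real API.
    """
    with open(path, "rb") as f:
        payload = f.read()                            # video, CAN-bus streams, EDR data, ...
    ack = server.send("collision_snapshot", payload)  # hypothetical call; truthy on ack
    if ack:
        os.remove(path)                               # after this, only the server-side copy exists
```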