
Tesla withheld data, lied, and misdirected police and plaintiffs to avoid blame in Autopilot crash

Technology
  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week.

    The automaker was undeniably covering up for Autopilot.

    Last week, a jury found Tesla partially liable for a wrongful death involving a crash on Autopilot. We now have access to the trial transcripts, which confirm that Tesla was extremely misleading in its attempt to place all the blame on the driver.

    The company went as far as to actively withhold critical evidence that explained Autopilot’s performance around the crash. Within about three minutes of the crash, the Model S uploaded a “collision snapshot”—video, CAN‑bus streams, EDR data, etc.—to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, resulting in Tesla being the only entity having access.

    What ensued was a years-long battle to get Tesla to acknowledge that this collision snapshot existed and was relevant to the case.

    The police repeatedly attempted to obtain the data from the collision snapshot, but Tesla led the authorities and the plaintiffs on a lengthy journey of deception and misdirection that spanned years.

  • A capitalist would say 'The market will correct this.'

    Well, the market corrected UHC's CEO.

  • Didn't the article say they retrieved the filename and hash, thus proving the existence of the crash-diagnostic snapshot, after which Tesla handed over their copy?

    Or did the forensics team retrieve the actual data?

    Edit: Given the importance of this type of data, not saving it to non-volatile memory is negligent at best. Even if it required a huge amount of space, they could delete unimportant files like the Spotify cache or apps or whatever.

    The article kind of fumbles the wording and creates confusion. There are, however, some passages that indicate to me that the actual data was recovered. All of the following are talking about the NAND flash memory.

    The engineers quickly found that all the data was there despite Tesla’s previous claims.

    ...

    Now, the plaintiffs had access to everything.

    ...

    Moore was astonished by all the data found through cloning the Autopilot ECU:

    “For an engineer like me, the data out of those computers was a treasure‑trove of how this crash happened.”

    ...

    On top of all the data being so much more helpful, Moore found unallocated space and metadata for 'snapshot_collision_airbag-deployment.tar', including its SHA‑1 checksum and the exact server path.

    It seems that maybe the .tar file itself was not recovered, but all the data about the crash was still there.
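That reading fits how file carving works: even when a deleted file's contents are gone, its name, checksum, and other metadata often survive in unallocated flash. A toy sketch of that kind of scan follows — the function name, the 256-byte search window, and the SHA-1 heuristic are all assumptions for illustration, not the actual forensic tooling used:

```python
import re

def carve_snapshot_metadata(image: bytes, filename: bytes):
    """Scan a raw flash image for leftover traces of a deleted file:
    each occurrence of the filename, plus any SHA-1-looking hex digest
    (40 hex characters) in the bytes that follow it."""
    hits = []
    for m in re.finditer(re.escape(filename), image):
        window = image[m.end():m.end() + 256]   # bytes right after the name
        digest = re.search(rb"\b[0-9a-f]{40}\b", window)
        hits.append({
            "offset": m.start(),
            "sha1": digest.group().decode() if digest else None,
        })
    return hits
```

Real carving tools look for filesystem structures rather than raw strings, but the principle is the same: metadata about a file can outlive the file itself.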

  • Because it may not be possible to transmit, depending on location. Also, non-volatile storage is cheap and fast, and RAM is normally limited.

    On embedded controllers you are usually heavily limited with nonvolatile memory.

  • Perhaps most importantly: we know the data wasn't lost, because we read the article (or at least the summary). And even if it had been, that would have been a deliberate design decision.

    Your explanation doesn't wash in reality, but it doesn't wash even in theory.

    You're also assuming that the volatile memory lost power and thus must have been cleared at some point. I don't think there is a right or a wrong based on the knowledge I have; I'm just throwing out a random guess.

  • Bullshit. It was saved locally. It can stay saved locally but be marked for deletion if storage gets tight. This is a solved computer science problem.

    There is zero reason to delete it immediately except to cover their asses.

    If I were on the jury, I'd be pushing for the maximum monetary penalty.

    Treble damages
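For what it's worth, the "keep it locally but mark it for deletion if storage gets tight" scheme this comment alludes to is indeed straightforward. A minimal sketch, with an in-memory dict standing in for flash storage (all names here are made up for illustration):

```python
import collections

class SnapshotStore:
    """Keep snapshots after upload and merely mark them deletable;
    actually evict the oldest deletable entries only when the store
    exceeds its capacity."""
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.files = collections.OrderedDict()   # name -> (data, deletable)

    def add(self, name, data):
        self.files[name] = (data, False)         # new snapshots are protected
        self._evict()

    def mark_deletable(self, name):
        data, _ = self.files[name]
        self.files[name] = (data, True)          # e.g. after a confirmed upload
        self._evict()

    def _evict(self):
        def used():
            return sum(len(d) for d, _ in self.files.values())
        # Oldest deletable entries go first, and only under space pressure.
        for name in [n for n, (_, ok) in self.files.items() if ok]:
            if used() <= self.capacity:
                break
            del self.files[name]
```

Under this scheme a crash snapshot would survive locally until storage pressure actually forced it out, rather than being erased the moment the upload was acknowledged.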

  • A company doing unethical, immoral things, committing perjury, and lying to officials? That's been done a billion times already. Elon is no different from any other scumbag who runs the world.

  • Beta test of the Tesla Autolawyer huh?

  • You're also assuming that the volatile memory lost power and thus must have been cleared at some point. I don't think there is a right or a wrong based on the knowledge I have; I'm just throwing out a random guess.

    The article says Tesla deletes it and was forced to produce it. Seems pretty obvious that your theory is wrong.

  • Another reason why Leon Hitler and Krasnov shut down the NTSB office that was investigating their shitty Autopilot system.

  • And the consequence will be ... ?

  • Same walking route?

  • On embedded controllers you are usually heavily limited with nonvolatile memory.

    Not sure why you think this; it's generally trivial to add non-volatile storage to microcontrollers, and much more complicated to add external RAM.
