Tesla withheld data, lied, and misdirected police and plaintiffs to avoid blame in Autopilot crash

  • While an attractive pitch, remember that what's going to fuck over the working class are the laws intended to attack the assholes at the top. Which group do you think will have the means to defend themselves in court, has the power to make legislative moves, and would benefit from having another tool to fuck their slaves with?

    Class action criminal suits then.

    Prosecute everyone who owns even a single share in the company, and if the working-class guy goes to jail, so does the wealthy guy, no exceptions. But weight it by share of ownership, so whoever owns the most gets the most jail time.

    Offshore shareholders can just get all their assets in the country seized.

  • It is possible that the data is just never saved in non-volatile memory, meaning that once power is lost, the values are also lost. In which case it's not really deleting the information; rather, the information is just never intentionally saved.

    P.S. I am not a Tesla fanboy, just wanted to give this tiny insight.

    This explanation is completely fabricated, based on nothing, and nonsense.

    It is obviously critical data that nobody halfway competent would write to RAM. Also, video data is very large and makes no sense to store in RAM.

    Furthermore, the article says it was deleted and they later recovered it, which would not have been possible with RAM.

    Basically why are you pushing this drivel.

  • This explanation is completely fabricated, based on nothing, and nonsense. ...

    If the data is temporarily stored until it is transmitted, and afterwards is not considered to be needed anymore, I see no reason why it would need to be stored locally forever.

  • Within about three minutes of the crash, the Model S uploaded a “collision snapshot”—video, CAN‑bus streams, EDR data, etc.—to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, resulting in Tesla being the only entity having access.

    Holy fucking shit. What is the purpose of deleting the data on the vehicle other than to sabotage the owner of the vehicle?

    The only thing giving this nazi company its market value and all the hype is the promise of self-driving. The Autopilot technology is the main value. If there is proof that it is wrong, Tesla is going to lose the investors. Simple as that: fucking nazi Musk cannot allow proof that his shitty car killed people because of the Autopilot. I recommend searching for the podcast and reporting by The Guardian on this theme. I'm really looking forward to reading the book Tesla Files. It's by the journalist who was contacted by a Tesla whistleblower. There are thousands of cases where the Autopilot started to behave just a "little crazy".
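    The upload-then-delete flow described in the quote can be sketched in a few lines. This is a hypothetical illustration, not Tesla's actual code: the function name, the transport callback, and the placeholder URL are all made up. The only point is the design choice it demonstrates, namely that the local copy is erased the moment the server acknowledges receipt.

```python
import hashlib
import os

SERVER_URL = "https://example.invalid/collision-snapshot"  # placeholder, not a real endpoint


def upload_and_delete(snapshot_path, transport):
    """Upload a collision snapshot; delete the local copy once the server acks.

    `transport` stands in for whatever telematics link sends the payload and
    returns the server's acknowledgement (here modeled as an echo of the
    checksum).
    """
    with open(snapshot_path, "rb") as f:
        payload = f.read()
    checksum = hashlib.sha1(payload).hexdigest()
    ack = transport(SERVER_URL, payload, checksum)
    if ack == checksum:
        # Server confirms it holds an identical copy -- the local file is
        # removed, leaving the server operator as the only party with the data.
        os.remove(snapshot_path)
        return True
    return False
```

    Note that nothing about the pattern forces the delete: keeping the local file until storage pressure demands eviction would work just as well, which is exactly what other commenters in this thread argue.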

  • If the data is temporarily stored until it is transmitted, and afterwards is not considered to be needed anymore, I see no reason why it would need to be stored locally forever.

    Because it may not be possible to transmit, depending on location. Also, non-volatile storage is cheap and fast, and RAM is normally limited.

  • If the data is temporarily stored until it is transmitted, and afterwards is not considered to be needed anymore, I see no reason why it would need to be stored locally forever.

    Perhaps most importantly: although we know the data was not lost, because we read the article or at least the summary, if it had been lost, that would have been a deliberate design decision.

    Your explanation doesn't wash in reality, but it also doesn't wash even in theory.

  • Within about three minutes of the crash, the Model S uploaded a “collision snapshot”—video, CAN‑bus streams, EDR data, etc.—to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, resulting in Tesla being the only entity having access.

    Holy fucking shit. What is the purpose of deleting the data on the vehicle other than to sabotage the owner of the vehicle?

    Information to be used against you and never for you.

  • The conditioning of people to think it must be monetary fines is strong, I guess. Imo it shouldn't be a fine for intentionally breaking laws, especially when putting lives in danger. It should be jail time for the executives. Make the calculation disappear altogether.

    A capitalist would say 'The market will correct this.'

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week.

    The automaker was undeniably covering up for Autopilot.

    Last week, a jury found Tesla partially liable for a wrongful death involving a crash on Autopilot. We now have access to the trial transcripts, which confirm that Tesla was extremely misleading in its attempt to place all the blame on the driver.

    The company went as far as to actively withhold critical evidence that explained Autopilot’s performance around the crash. Within about three minutes of the crash, the Model S uploaded a “collision snapshot”—video, CAN‑bus streams, EDR data, etc.—to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, resulting in Tesla being the only entity having access.

    What ensued were years of battle to get Tesla to acknowledge that this collision snapshot exists and is relevant to the case.

    The police repeatedly attempted to obtain the data from the collision snapshot, but Tesla led the authorities and the plaintiffs on a lengthy journey of deception and misdirection that spanned years.

    I really wish Roosevelt was around to smash these companies into pieces for this shit.

  • A capitalist would say 'The market will correct this.'

    Well, the market corrected UHC's CEO.

  • Didn't the article say they retrieved the filename and hash, thus proving the existence of the crash diagnostic snapshot, after which Tesla handed over their copy?

    Or did the forensics retrieve the actual data?

    Edit: Given the importance of this type of data, not saving it to non-volatile memory is negligent at best. Even if it required a huge amount of space, they could delete unimportant files like the Spotify cache or apps or whatever.

    The article kind of fumbles the wording and creates confusion. There are, however, some passages that indicate to me that the actual data was recovered. All of the following are talking about the NAND flash memory.

    The engineers quickly found that all the data was there despite Tesla’s previous claims.

    ...

    Now, the plaintiffs had access to everything.

    ...

    Moore was astonished by all the data found through cloning the Autopilot ECU:

    “For an engineer like me, the data out of those computers was a treasure‑trove of how this crash happened.”

    ...

    On top of all the data being so much more helpful, Moore found unallocated space and metadata for 'snapshot_collision_airbag-deployment.tar', including its SHA-1 checksum and the exact server path.

    It seems that maybe the .tar file itself was not recovered, but all the data about the crash was still there.
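    A note on why the recovered metadata matters: even if only the filename, server path, and SHA-1 checksum survive in unallocated space, the checksum alone is enough to verify that any copy later produced from the "Mothership" is bit-for-bit the deleted snapshot. A minimal sketch of that check (the function names are mine, not from the article):

```python
import hashlib


def sha1_of_file(path, chunk_size=1 << 20):
    """SHA-1 of a file, streamed in chunks so large snapshots fit in memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def matches_recovered_checksum(path, recovered_sha1):
    """Does a produced file match the checksum recovered from metadata?"""
    return sha1_of_file(path) == recovered_sha1.lower()
```

    So recovering just 40 hex characters of metadata is enough to pin down the exact file the server side had to be holding.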

  • Because it may not be possible to transmit, depending on location. Also, non-volatile storage is cheap and fast, and RAM is normally limited.

    On embedded controllers you are usually heavily limited with nonvolatile memory.

  • Perhaps most importantly: although we know the data was not lost, because we read the article or at least the summary, if it had been lost, that would have been a deliberate design decision.

    Your explanation doesn't wash in reality, but it also doesn't wash even in theory.

    You're also making an assumption there: that the volatile memory lost power and thus must have been cleared at some point. I don't think there is a right or a wrong based on the knowledge I have; I'm just throwing out a random guess.

  • Bullshit. It was saved locally. It can stay saved locally but be marked for deletion if storage gets tight. This is a solved computer science problem.

    There is zero reason to delete it immediately except to cover their asses.

    If I was on the jury I'd be pushing for maximum monetary penalty.

    Treble damages
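    For what it's worth, the "mark for deletion, evict when storage gets tight" scheme this comment calls a solved problem really is routine. A toy sketch with made-up names and thresholds: snapshots stay on disk, already-uploaded ones are merely marked deletable, and eviction happens oldest-first and only under storage pressure, never immediately after upload.

```python
from collections import OrderedDict


class SnapshotStore:
    """Keep snapshots locally; evict uploaded ones only when space runs low."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.files = OrderedDict()  # name -> (size, uploaded_flag), oldest first

    def add(self, name, size):
        self.files[name] = (size, False)
        self.used += size
        self._evict_if_tight()

    def mark_uploaded(self, name):
        # Marking a snapshot as uploaded makes it *eligible* for eviction;
        # it is not deleted here.
        size, _ = self.files[name]
        self.files[name] = (size, True)

    def _evict_if_tight(self):
        # Delete only snapshots that were already uploaded, oldest first,
        # and only while over capacity.
        for name in list(self.files):
            if self.used <= self.capacity:
                break
            size, uploaded = self.files[name]
            if uploaded:
                del self.files[name]
                self.used -= size
```

    Under this policy the owner keeps the local copy until the disk genuinely needs the space, which is the behavior commenters above contrast with an immediate post-upload delete.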

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week. ...

    A company doing unethical, immoral things, perjury, and lying to officials? That's been done a billion times already. Elon is no different than any other scumbag who runs the world.

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week. ...

    Beta test of the Tesla Autolawyer, huh?

  • You're also making an assumption there: that the volatile memory lost power and thus must have been cleared at some point. I don't think there is a right or a wrong based on the knowledge I have; I'm just throwing out a random guess.

    The article says Tesla deletes it and was forced to produce it. Seems pretty obvious that your theory is wrong.

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week. ...

    Another reason why Leon Hitler and Krasnov shut down the NTSB office that was investigating their shitty Autopilot system.

  • Tesla was caught withholding data, lying about it, and misdirecting authorities in the wrongful death case involving Autopilot that it lost this week. ...

    And the consequence will be ... ?
