Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks
-
This post did not contain any content.
-
This post did not contain any content.
Teslas do still have steering wheels, after all
You don't say!
-
This post did not contain any content.
Working as expected then.
-
This post did not contain any content.
Can't wait to hop in a Robotaxi! /s
What's that? They'll have human drivers in them? Still maybe no.
-
This post did not contain any content.
Tesla's self-driving is pretty shite, but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all the obstacles for the self-driving system to fail to detect, several thousand tons of moving steel is probably among the worst.
-
Can't wait to hop in a Robotaxi! /s
What's that? They'll have human drivers in them? Still maybe no.
Pretty sure this one also had a driver in it.
A family in Pennsylvania
-
This post did not contain any content.
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
-
This post did not contain any content.
If only there was a way to avoid the place where trains drive.
I checked first. They didn't just make a turn at a crossing; it turned onto the tracks themselves. Jalopnik says there's no official statement that it was actually driving under FSD(elusion), but if it was strictly under human control (or FSD turned itself off after driving onto the tracks), I guarantee Tesla will invade the driver's privacy and slander them by the next day for the sake of the court of public opinion.
-
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, downright lying to evade a speeding ticket, etc. were the cause in many of them.
Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don't like a company? I don't think so. But if Tesla has proof FSD was off, we'll know in a minute when they invade the driver's privacy and release the driving events.
-
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, downright lying to evade a speeding ticket, etc. were the cause in many of them.
Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don't like a company? I don't think so. But if Tesla has proof FSD was off, we'll know in a minute when they invade the driver's privacy and release the driving events.
Tesla has constantly lied about their FSD for a decade. We don't trust them because they are untrustworthy, not because we don't like them.
-
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, downright lying to evade a speeding ticket, etc. were the cause in many of them.
Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don't like a company? I don't think so. But if Tesla has proof FSD was off, we'll know in a minute when they invade the driver's privacy and release the driving events.
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems in less than half the cases. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, downright lying to evade a speeding ticket, etc. were the cause in many of them.
I owned an FJ80 Land Cruiser when that happened. I printed up a couple stickers for myself, and for a buddy who owned a Tacoma, that said "I'm not speeding, my pedal's stuck!" (yes I'm aware the FJ80 was slow as dogshit, that didn't stop me from speeding).
-
Tesla has constantly lied about their FSD for a decade. We don't trust them because they are untrustworthy, not because we don't like them.
I have no sources for this, so take it with a grain of salt... But I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it's Tesla's faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla's fault if it was on at the time.
-
I have no sources for this, so take it with a grain of salt... But I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it's Tesla's faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla's fault if it was on at the time.
That would require their self driving algorithm to actually detect an accident. I doubt it's capable of doing so consistently.
-
I have no sources for this, so take it with a grain of salt... But I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it's Tesla's faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla's fault if it was on at the time.
On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, and only if turning onto the tracks didn't involve a drop of the same depth as the rails. Someone caught off guard isn't going to be able to turn a passenger car off the tracks, because the rails are tall and there's no angle that lets the wheels climb back over them.
So while in a perfect world the driver would have slammed on the brakes before the car ever got onto the tracks, if they weren't fast enough and even the front wheels made it on, it may have been impossible to recover from, and going forward might have been their best bet. It depends on how the crossing is built.
-
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
I mean... I have seen some REALLY REALLY stupid drivers, so I could totally see multiple people thinking they found a shortcut, or not realizing the road they're supposed to be on is 20 feet to the left and that there's a reason their phone is losing its shit, all while their suspension is getting destroyed.
But yeah. It is the standard Tesla corporate MO: they detect a dangerous situation and disable all the "self driving". Obviously because it is up to the driver to handle it, and not because they want the legal protection of saying it wasn't their fault.
-
I have no sources for this, so take it with a grain of salt... But I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it's Tesla's faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla's fault if it was on at the time.
Pretty sure they can tell the method used when disengaging FSD/AP, so they would know if it was manually turned off or if the system lost enough info and shut itself down. They should be able to reconstruct the order of events to within a few seconds of accuracy. I can't imagine a scenario where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough lead time to "blame it on the driver" that wouldn't be blatantly obvious in the logs. What might be possible is that the logs show FSD shut off a millisecond before the impact/event and then someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don't see that holding up.
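To make that concrete, here's a minimal sketch of the kind of check an independent log review could do. Everything in it is assumed for illustration: the `LogEvent` fields, the event labels, and the 30-second lookback window (loosely inspired by NHTSA's ADAS crash-reporting window) are hypothetical, not Tesla's actual log format.

```python
from dataclasses import dataclass

@dataclass
class LogEvent:
    t: float           # seconds since start of drive (hypothetical timebase)
    fsd_engaged: bool  # whether FSD/AP reported itself active at this moment
    source: str        # e.g. "driver_input" or "system_fault" (assumed labels)

def fsd_active_near_impact(events: list[LogEvent], impact_t: float,
                           window_s: float = 30.0) -> bool:
    """True if FSD/AP was engaged at any point within window_s seconds
    before the impact; a disengagement a millisecond before the crash
    still counts as 'active near impact' under this check."""
    return any(e.fsd_engaged and impact_t - window_s <= e.t <= impact_t
               for e in events)

# Hypothetical example: FSD drops out ~1 ms before an impact at t = 120.0 s
log = [
    LogEvent(t=100.000, fsd_engaged=True,  source="engaged"),
    LogEvent(t=119.999, fsd_engaged=False, source="system_fault"),
    LogEvent(t=120.000, fsd_engaged=False, source="impact"),
]
print(fsd_active_near_impact(log, impact_t=120.0))  # -> True
```

The point is the same as above: "FSD was not engaged at the moment of impact" only means something if the check also covers the seconds leading up to it.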
-
Pretty sure they can tell the method used when disengaging FSD/AP, so they would know if it was manually turned off or if the system lost enough info and shut itself down. They should be able to reconstruct the order of events to within a few seconds of accuracy. I can't imagine a scenario where the Tesla was able to determine an accident was imminent and shut off FSD/AP with enough lead time to "blame it on the driver" that wouldn't be blatantly obvious in the logs. What might be possible is that the logs show FSD shut off a millisecond before the impact/event and then someone merely reported that FSD was not engaged at the time of the accident. Technically true, and Tesla lawyers might fight like hell to maintain that theory, but if an independent source is able to review the logs, I don't see that holding up.
Of course they know, they're using it to hide the truth. Stop giving a corporation the benefit of the doubt where public safety is concerned, especially when they've been shown to abuse it in the past
-
Tesla's self-driving is pretty shite, but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all the obstacles for the self-driving system to fail to detect, several thousand tons of moving steel is probably among the worst.
Maybe if they used LIDAR like they should have, instead of just cameras, it wouldn't be such an issue. But they're determined to minimize costs and maximize profits at the expense of consumers, as are all publicly traded companies.
-
This post did not contain any content.
How symbolic
-
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
Furthermore, with the amount of telemetry those cars have, the company knows whether or not it was in self-drive mode when it went onto the tracks. So the fact that they didn't go public saying it wasn't means that it was in self-drive mode, and they want to save face on PR and limit liability.
-