Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks
-
You could not pay me to drive a Tesla.
-
What a cool and futuristic car. It’s all computer!
I’m still waiting for Elon’s car to drive onto train tracks.
-
And who is going to willingly get into a Tesla Taxi?!?!
You still don't have to get in for one to hit you. I ride a motorcycle and I'm always sketched out when there's a Tesla behind me
-
Tesla's self-driving is pretty shite, but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all the obstacles for the self-driving system to fail to detect, several thousand tons of moving steel probably has one of the worst outcomes.
That, and little kids... and motorcycles... and school buses
-
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
Since the story has 3 separate incidents where "the driver let their Tesla turn left onto some railroad tracks" I'm going to posit:
Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.
Prove me wrong, Tesla
-
Deregulation, ain't it great.
-
Since the story has 3 separate incidents where "the driver let their Tesla turn left onto some railroad tracks" I'm going to posit:
Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.
Prove me wrong, Tesla
I mean… Tesla self-driving allegedly did this three times in three years, but we don’t yet have public data to verify that’s what happened, nor do we have any comparison to what human drivers do.
Although, one of the many ways I think I’m an above-average driver (just like everyone else) is that people do a lot of stupid things at railroad crossings, and I never would
-
Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-drive mode or not when it went onto the tracks. So the fact that they didn't go public saying it wasn't means that it was, and they want to save face and dodge the liability.
I've heard they also like to disengage self-driving mode right before a collision.
-
Teslas are death traps.
-
Tesla has constantly lied about their FSD for a decade. We don't trust them because they are untrustworthy, not because we don't like them.
They promote it in ways that people sometimes trust it too much… but in particular, when it comes to releasing telemetry, I don’t remember that ever being an accusation
-
I've heard they also like to disengage self-driving mode right before a collision.
That sounds a lot more like a rumor to me. It would be extremely suspicious and would leave them open to GIGANTIC liability issues.
-
I have no sources for this, so take it with a grain of salt... but I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it's Tesla's faulty self-driving, plus human error for not correcting it. Either way, it would prove it was at least partly Tesla's fault if it was on at the time.
They supposedly also have a threshold, like ten seconds: if FSD cuts out less than that threshold before the accident, it’s still FSD’s fault.
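To make the claimed rule concrete, here's a minimal sketch of how such an attribution window could be checked. The ten-second threshold, the function name, and the timestamps are all assumptions for illustration, not Tesla's or any regulator's actual methodology:

```python
# Hypothetical attribution rule, as described above: a crash still counts
# against FSD if it disengaged within some window before impact.
# The threshold value and all names here are illustrative assumptions.

ATTRIBUTION_WINDOW_S = 10.0  # assumed threshold from the comment above

def crash_attributed_to_fsd(disengaged_at: float, impact_at: float) -> bool:
    """True if self-driving cut out within the window before the impact."""
    if disengaged_at > impact_at:
        raise ValueError("disengagement timestamp must precede impact")
    return impact_at - disengaged_at <= ATTRIBUTION_WINDOW_S

# Example: Autopilot aborts 0.8 s before impact -> still attributed to FSD.
assert crash_attributed_to_fsd(disengaged_at=99.2, impact_at=100.0)
```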
-
That sounds a lot more like a rumor to me. It would be extremely suspicious and would leave them open to GIGANTIC liability issues.
It's been well documented. It lets them say in their statistics that the owner was in control of the car during the crash.
-
That sounds a lot more like a rumor to me. It would be extremely suspicious and would leave them open to GIGANTIC liability issues.
How so? The human in the car is always ultimately responsible when using Level 2 driver assists. Tesla does not have Level 3 or higher self-driving and therefore doesn’t have to assume any liability.
-
That sounds a lot more like a rumor to me. It would be extremely suspicious and would leave them open to GIGANTIC liability issues.
In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had "aborted vehicle control less than one second prior to the first impact".
Tesla Accused of Shutting Off Autopilot Moments Before Impact
In 16 separate Tesla crashes, records showed that the self-driving feature had suspiciously "aborted vehicle control" just before impact.
Futurism (futurism.com)
-
I mean... I have seen some REALLY REALLY stupid drivers, so I could totally see multiple people thinking they found a shortcut, or not realizing the road they're supposed to be on is 20 feet to the left and that there's a reason their phone is losing its shit, all while their suspension is getting destroyed.
But yeah, it is the standard Tesla corp MO. They detect a dangerous situation and disable all the "self driving". Obviously because it is up to the driver to handle it, and not because they want the legal protection to say it wasn't their fault.
At my local commuter rail station, the entrance to the parking lot is immediately next to the track. It’s easily within the margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.
There are plenty of cues, so a driver shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them since it’s a bit of an outlier.
That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite on the GPS, and I missed any cues there may have been
-
They promote it in ways that people sometimes trust it too much… but in particular, when it comes to releasing telemetry, I don’t remember that ever being an accusation
It’s more about when they don’t release it, or only selectively say things that make them look good while staying silent when they look bad.
-
On a related note, getting unstuck from something like train tracks is a pretty significant hurdle. The only real way out is to back up, IF turning onto the tracks didn’t involve a drop of the same depth as the rails. Someone who is caught off guard isn’t going to be able to turn a passenger car off the tracks, because the rails are tall and there’s no angle the wheels can take to climb over them.
So while in a perfect world the driver would have slammed on the brakes before the car got onto the tracks, if they weren’t fast enough and even the front wheels made it on, that may have been impossible to recover from, and going forward might have been their best bet. It depends on how the track crossing is built.
If you’re about to be hit by a train, driving forward through the barrier is always the correct choice. It will move out of the way and you stay alive to fix the scratches in your paint.
-
I've heard they also like to disengage self-driving mode right before a collision.
That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive and ready to take control. If the system detects a situation where it's unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a "fail safe" option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a "human, take the wheel" followed by a "slam the brakes" if no input is detected within 2-3 seconds. While an emergency stop isn't always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
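As a rough sketch of that "hand over, then stop" logic: the vehicle and alert interfaces, their method names, and the 2.5-second grace period below are all invented for illustration; no real vendor API is implied.

```python
import time

HANDOVER_GRACE_S = 2.5  # assumed grace period from the 2-3 s suggested above

def failsafe_handover(vehicle, alerts, grace_s: float = HANDOVER_GRACE_S) -> None:
    """When the system can't make a good decision: disengage, demand a
    human takeover, and perform an emergency stop if nobody responds.
    `vehicle` and `alerts` are hypothetical interfaces, not a real API."""
    vehicle.disengage_autonomy()
    alerts.request_driver_takeover()          # loud, unmistakable warning
    deadline = time.monotonic() + grace_s
    while time.monotonic() < deadline:
        if vehicle.driver_input_detected():   # steering or pedal activity
            return                            # the human has control; done
        time.sleep(0.05)
    vehicle.emergency_brake()                 # no response: stop the car
```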
-
At my local commuter rail station, the entrance to the parking lot is immediately next to the track. It’s easily within the margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.
There are plenty of cues, so a driver shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them since it’s a bit of an outlier.
That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite on the GPS, and I missed any cues there may have been
You uh... don't need to tell people stuff like that.