Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks
-
Maybe if they used LIDAR like they should have, instead of just cameras, it wouldn't be such an issue. But they're determined to minimize costs and maximize profits at the expense of consumers, as are all publicly traded companies.
You don't understand. Musk likes how they look, we can't disturb that for "safety"!
-
And who is going to willingly get into a Tesla Taxi?!?!
I won't get in one because I'm not giving a single dollar of business to Musk. He can go jump up his own asshole.
-
You don't understand. Musk likes how they look, we can't disturb that for "safety"!
Or it no longer has anything to do with making a vehicle look cool.
The Lucid Air is equipped with up to 32 on-board sensors, including long-range LiDAR, radar, short-range radar, and surround-view monitoring cameras.
It's because Musk treats all his businesses like startups. No matter how successful they get, in the interest of "trimming the fat" he'd rather keep people buying inferior products at a higher profit margin than think about better investment and long-term growth, just like many companies.
-
This post did not contain any content.
Meanwhile my sister's fiancé drank the whole pitcher and is back working there, totally believing Elon is some super genius and that the cars are capable of full self-driving. (I really have to bite my tongue when listening to him.)
My camry can self-drive too, same outcome. (just off a cliff, rather than in front of a train)
-
LOL wasn't expecting that.
-
Full stream ahead with the train puns
Whoa, that typo nearly knocked this discussion off the rails...
-
This post did not contain any content.
New food chain just dropped
-
Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
Maybe I'm missing something, but isn't it trivial to take it out of their bullshit dangerous "FSD" mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?
-
I won't get in one because I'm not giving a single dollar of business to Musk. He can go jump up his own asshole.
Also a very valid point.
-
Maybe I'm missing something, but isn't it trivial to take it out of their bullshit dangerous "FSD" mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?
Yes.
You hit the brake.
-
This post did not contain any content.
Driver failed to control their car and avoid a collision.
FTFY.
I'm sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.
Which you can do by such super-technical means as "hitting the brake" or "steering the other way" or "flipping the right stalk up". Rocket science, I know.
Driver's fault. Bad technology, yes. Worse driver.
-
They didn't make a turn into a crossing. It turned onto the tracks.
Just to be clear for others, it did so at a crossing. That's still obviously not what it should have done and it's no defence of the self-driving feature, but I read your comment as suggesting it had found its way onto train tracks by some other route.
Thanks. I could have clarified better myself. I meant "didn't turn from a rail-parallel road onto a crossing to be met by a train it couldn't reasonably detect due to bad road design"
-
Maybe I'm missing something, but isn't it trivial to take it out of their bullshit dangerous "FSD" mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?
On some railroad crossings you might only need to go a little way past the crossing to get stuck between the rails, unable to back out. Getting back out is another 30-40 feet.
Being caught off guard when the car does something it isn't supposed to do is how you get stuck in the first place. Yeah, terrible driver trusting shit technology.
-
That … tracks
-
This post did not contain any content.
Honey, are those train tracks? .... Yes, looks like we'll turn left onto the tracks for half a mile. It's a detour.
-
How so? The human in the car is always ultimately responsible when using level 2 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn't have to assume any liability.
If you are monkeying with the car right before it crashes... wouldn't that raise suspicion?
-
It's been well documented. It lets them say in their statistics that the owner was in control of the car during the crash
That's my whole point
-
Driver failed to control their car and avoid a collision.
FTFY.
I'm sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.
Which you can do by such super-technical means as "hitting the brake" or "steering the other way" or "flipping the right stalk up". Rocket science, I know.
Driver's fault. Bad technology, yes. Worse driver.
I'd have bailed out and waited for the insurance check. Then got a different car.
-
Driver failed to control their car and avoid a collision.
FTFY.
I'm sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.
Which you can do by such super-technical means as "hitting the brake" or "steering the other way" or "flipping the right stalk up". Rocket science, I know.
Driver's fault. Bad technology, yes. Worse driver.
-
It simply saw a superior technology and decided to attack.