Can Tesla's Self-Driving Software Handle Bus-Only Lanes? Not Reliably, No.
-
The guy posting it all over Reddit and Twitter himself says it happened in February. People on here, including myself, have linked to his posts saying it happened in February.
He could simply blur his face…..
Well now you've made me go read all their Reddit comments. It happened February 27, and they said it took 2 months for their insurance to get processed. It doesn't seem unreasonable to me to wait until they've got everything sorted out before posting.
Based on pictures of the dash cam files they posted, I don't think they have interior video; I think they would have had to turn that on beforehand. Tesla mostly talks about the cabin camera in the context of Sentry Mode, so I don't actually know whether it would record by default.
But again, I don't think it's reasonable to expect them to edit a video with face blurring and everything. They don't owe the Internet anything, and not everyone even knows how. I'll be honest, I don't know for sure they're being truthful, but I'm certainly a lot less skeptical about it than you. I think enough of their story checks out that I'm not going to discount it based on lack of additional info alone. I haven't seen them contradict themselves anywhere.
-
I’m not aware of any cars that will use radar, cameras, or lidar all the time and automatically stop your car to a complete standstill while you’re manually operating it. Yours does this? It physically won’t let you run into something, ever?
Yes, it does. It performs speed sign recognition and lane departure warning continuously as well, but it will only perform steering correction above a minimum speed (I believe 50 km/h) and will only adjust the speed while adaptive cruise control is switched on.
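For what it's worth, that behaviour boils down to a few independent conditions. Here's a minimal Python sketch of that gating logic, assuming the 50 km/h lane-keeping threshold mentioned above and an always-armed AEB; the function and parameter names are made up for illustration, not taken from any manufacturer's documentation:

```python
# Hypothetical sketch of how the driver-assist features described above gate themselves.
# The 50 km/h threshold and the "AEB always armed" assumption come from the comment,
# not from any manufacturer spec.

LANE_KEEP_MIN_SPEED_KMH = 50  # steering correction only above this speed (assumed)

def assist_actions(speed_kmh: float, acc_enabled: bool,
                   lane_departure: bool, collision_imminent: bool,
                   detected_speed_limit: int | None) -> list[str]:
    """Return the assistance actions that would fire for the current frame."""
    actions = []

    # Forward collision handling is always active, regardless of driver mode.
    if collision_imminent:
        actions.append("automatic_emergency_braking")

    # Speed sign recognition and lane departure warning run continuously...
    if detected_speed_limit is not None:
        actions.append(f"display_speed_limit:{detected_speed_limit}")
    if lane_departure:
        actions.append("lane_departure_warning")
        # ...but active steering correction only kicks in above the threshold.
        if speed_kmh >= LANE_KEEP_MIN_SPEED_KMH:
            actions.append("steering_correction")

    # Speed is only adjusted automatically while adaptive cruise control is on.
    if acc_enabled and detected_speed_limit is not None:
        actions.append(f"adjust_set_speed:{detected_speed_limit}")

    return actions
```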
-
Well now you've made me go read all their Reddit comments. It happened February 27, and they said it took 2 months for their insurance to get processed. It doesn't seem unreasonable to me to wait until they've got everything sorted out before posting.
Based on pictures of the dash cam files they posted, I don't think they have interior video; I think they would have had to turn that on beforehand. Tesla mostly talks about the cabin camera in the context of Sentry Mode, so I don't actually know whether it would record by default.
But again, I don't think it's reasonable to expect them to edit a video with face blurring and everything. They don't owe the Internet anything, and not everyone even knows how. I'll be honest, I don't know for sure they're being truthful, but I'm certainly a lot less skeptical about it than you. I think enough of their story checks out that I'm not going to discount it based on lack of additional info alone. I haven't seen them contradict themselves anywhere.
What about the lies about the version of FSD?
The version they claim to have had at the time wasn’t actually available at that time.
They have provided literally zero evidence that FSD was on. None whatsoever.
-
What about the lies about the version of FSD?
The version they claim to have had at the time wasn’t actually available at that time.
They have provided literally zero evidence that FSD was on. None whatsoever.
There's plenty of evidence of other people running the same v13.2.8 version before their crash on Feb 27, for example:
-
No, it didn’t switch it off because of that.
FSD and Autopilot do different things using the same data. It’s a fact that they behave differently.
From NHTSA in 2022:
"The agency's analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the report reads.
—https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?
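To make the NHTSA finding concrete, the check amounts to measuring the gap between the logged disengagement time and the first impact for each crash, then averaging it. A rough Python sketch with entirely made-up field names and example numbers (the real EDR/telemetry schema isn't public):

```python
# Hypothetical sketch of the kind of analysis NHTSA describes: for each crash,
# how long before first impact did the driver-assist system abort control?
# Field names and timestamps below are invented for illustration only.

from dataclasses import dataclass

@dataclass
class CrashEvents:
    autopilot_disengaged_s: float  # logged timestamp of disengagement, seconds
    first_impact_s: float          # logged timestamp of first impact, seconds

def disengage_lead_time(ev: CrashEvents) -> float:
    """Seconds between the system aborting control and the first impact."""
    return ev.first_impact_s - ev.autopilot_disengaged_s

crashes = [
    CrashEvents(autopilot_disengaged_s=101.2, first_impact_s=102.0),
    CrashEvents(autopilot_disengaged_s=55.6, first_impact_s=56.1),
]

lead_times = [disengage_lead_time(c) for c in crashes]
print(f"average lead time: {sum(lead_times) / len(lead_times):.2f} s")
# A consistent sub-second average is the pattern the NHTSA excerpt above reports.
```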
-
From NHTSA in 2022:
"The agency's analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the report reads.
—https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?
So they tried to hide it from NHTSA by explicitly logging when it switched on and off in the data that they report to them? Huh?