Can Tesla's Self-Driving Software Handle Bus-Only Lanes? Not Reliably, No.
-
It's not hard to find videos of self-driving Teslas wilding in bus lanes. Check the videos out, then consider:
"There was an interesting side-note in Tesla’s last earnings call, where they explained the main challenge of releasing Full-Self Driving (supervised!) in China was a quirk of Chinese roads: the bus-only lanes.
Well, jeez, we have bus-only lanes here in Chicago, too. Like many other American metropolises… including Austin, TX, where Tesla plans to roll out unsupervised autonomous vehicles in a matter of weeks..."
It's one of those regional differences in driving that make a generalizable self-driving platform an exceedingly tough technical nut to crack... unless you're willing to just plain ignore the local rules.
-
It's actually extremely straightforward to mark lanes as "bus-only" when you map out roads, but that's assuming you're not being micromanaged by a moronic egomaniac.
-
It can't stop when driving straight at a wall, but you expect it to handle special lanes?
-
If Tesla's "AI" were used in airplane autopilots, there would be a crash every 12 hours, and the entire fleet of Tesla-AI aircraft would be defunct by the end of the year.
-
If Tesla's "AI" were used in airplane autopilots, there would be a crash every 12 hours, and the entire fleet of Tesla-AI aircraft would be defunct by the end of the year.
You may not have much experience with autopilots, so: no. There are different levels of autopilot in aviation, not just the full-control-with-autoland kind you may be thinking of. I used to fly a small prop plane with a single-axis autopilot, much less capable than Tesla Full Self-Driving. However, it was safe and useful because I understood its capabilities and limitations. I knew what to use it for and what not to, so even an extremely simple analog autopilot successfully reduced pilot workload, improving safety.
-
You may not have much experience with autopilots, so: no. There are different levels of autopilot in aviation, not just the full-control-with-autoland kind you may be thinking of. I used to fly a small prop plane with a single-axis autopilot, much less capable than Tesla Full Self-Driving. However, it was safe and useful because I understood its capabilities and limitations. I knew what to use it for and what not to, so even an extremely simple analog autopilot successfully reduced pilot workload, improving safety.
Ok. But they're talking about Tesla FSD AI.
It would have killed you.
-
It's actually extremely straightforward to mark lanes as "bus-only" when you map out roads, but that's assuming you're not being micromanaged by a moronic egomaniac.
It’s actually extremely straightforward to mark lanes as “bus-only” when you map out roads
Yeah... until they change. And municipalities are not known for thoroughly documenting their changes, nor are companies known for keeping their map data up to date even when those changes are provided.
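For what it's worth, "marking a lane as bus-only" in map data really is just a per-lane attribute plus a lookup at planning time. A minimal sketch of the idea, with tag names that are purely illustrative (loosely modeled on OpenStreetMap's per-lane tagging style, not any real vendor's schema):

```python
# Hypothetical sketch: a lane-level map encoding a bus-only restriction.
# Tag names ("lanes", "access:lanes") are illustrative, not a real schema.

def drivable_lanes(way_tags: dict, vehicle: str = "car") -> list[int]:
    """Return indices of lanes the given vehicle class may use."""
    n = int(way_tags.get("lanes", 1))
    # One access value per lane, pipe-separated, e.g. "bus|yes|yes";
    # default to every lane being open to everyone.
    per_lane = way_tags.get("access:lanes", "|".join(["yes"] * n)).split("|")
    allowed = []
    for i, rule in enumerate(per_lane):
        # "yes" means open to all; otherwise the rule names the allowed class.
        if rule == "yes" or rule == vehicle:
            allowed.append(i)
    return allowed

way = {"lanes": "3", "access:lanes": "bus|yes|yes"}
print(drivable_lanes(way, "car"))  # [1, 2] -- lane 0 is bus-only
print(drivable_lanes(way, "bus"))  # [0, 1, 2]
```

Which is exactly why the hard part isn't the lookup; it's keeping those tags current when a city restripes a lane and nobody pushes the update downstream.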
-
Ok. But they're talking about Tesla FSD AI.
It would have killed you.
Well yeah, because Tesla doesn't make planes, so their FSD wouldn't know what to do when in control of one.
-
It can't stop when driving straight at a wall, but you expect it to handle special lanes?
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even Autopilot.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even Autopilot.
Is the taste of shoe leather getting any better?
-
Is the taste of shoe leather getting any better?
-
Fuck me man, over 800 comments in less than a month?
Just put the phone down and go get some air
-
Fuck me man, over 800 comments in less than a month?
Just put the phone down and go get some air
I work on a computer 8-10 hours a day. I can multitask very well. 800 comments in a month, most of which take like 10 seconds to type and hit post, is not some insanely difficult challenge.
How about, instead of trying to shame people - while posting on the same site, no less, lol - you try to post something on-topic or productive? Or maybe use that time for something better?
-
Well yeah, because Tesla doesn’t make planes, so their FSD wouldn’t know what to do when in control of one.
It would probably also have killed you in one of their cars.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even Autopilot.
He didn't use FSD because he was on a track and FSD requires a destination. It was using Autopilot, according to his statement. Are you suggesting that Autopilot is inherently less safe than FSD? I'm confused about your position on this.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even Autopilot.
FSD wouldn't have done any better, it can't even figure out shadows on the road properly as seen in this crash 3 days ago:
Video: Tesla Full-Self Driving Veers Off Road, Hits Tree, and Flips Car for No Obvious Reason (No Serious Injuries, but Scary) - FuelArc News
A 2025 Tesla Model 3 drives off of a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media. The problem? The car was driving itself in Full Self-Driving. The subtitle from the […]
FuelArc News (fuelarc.com)
-
FSD wouldn't have done any better, it can't even figure out shadows on the road properly as seen in this crash 3 days ago:
That guy is lying through his teeth lol.
- The crash happened in February, within 2 weeks of him buying the car. He never thought to bring it up with Tesla until just now, apparently. Try and make that make sense.
- There's no proof that he was using FSD and that this wasn't driver error. He has posted footage from every camera except the interior camera, which would show whether he was driving or not. I wonder why?
- He claims that he had a version of FSD that wasn't released to the public at the time.
- Tesla has a program you can download to get all your telemetry data, showing exactly when FSD is enabled and disabled. He never tried to get that.
- Again - he never even contacted Tesla to ask them to investigate.
- 3 months later he's posting his story to 10+ Reddit subs, tagging Mark Rober and every anti-Tesla person on X, and promoting his X account on Reddit.
He's fame chasing. He saw an opportunity to turn his user-error crash into 15 minutes of fame and money.
-
He didn't use FSD because he was on a track and FSD requires a destination. It was using Autopilot, according to his statement. Are you suggesting that Autopilot is inherently less safe than FSD? I'm confused about your position on this.
He claimed in the video he was using FSD but then mainly used Autopilot, which was one of the biggest issues people had with his video in the first place. Autopilot is not as good as FSD. We also saw FSD engaged for a brief second before disengaging, from what looked like him either turning the wheel or accelerating - or possibly because he activated it two seconds before he was about to manually drive through a wall, and it realized that as soon as it turned on.
FSD doesn't require a destination btw.
Are you suggesting that Autopilot is inherently less safe than FSD?
It's not "less safe"; it's far less advanced and serves a different purpose.
-
That guy is lying through his teeth lol.
Did you watch the video? There's no way that it was just "user error"; nobody randomly swerves into a tree when nothing's there.
Maybe you're implying it was insurance fraud?
Tesla gives out beta access to users, so I wouldn't put too much weight on that claimed version they were using.
-
Did you watch the video? There’s no way that it was just “user error”, nobody randomly swerves into a tree when nothing’s there.
You're assuming that he was paying attention and driving normally. He could have dozed off and pulled the wheel.
All I'm asking for is evidence to support his claim other than "trust me bro".
Tesla gives out beta access to users, so I wouldn’t put too much weight on that claimed version they were using.
As others have pointed out, that version wasn't out to regular people on that day.
It all points to a guy who crashed his car and has now seen an opportunity, months later, to get his 15 minutes of fame. Why didn't he post about this in FEBRUARY WHEN IT HAPPENED? Why did he not reach out to Tesla in the last 3 months? It just doesn't add up. He has provided zero evidence that FSD caused this.