
Can Tesla's Self-Driving Software Handle Bus-Only Lanes? Not Reliably, No.

  • I work on a computer 8-10 hours a day. I can multitask very well. 800 comments in a month, most of which take like 10 seconds to type and hit post, is not some insanely difficult challenge.

    How about instead of trying to shame people - while posting on the same site nonetheless lol - you try and post something on topic or productive? Or maybe use that time for something better?

    Well thank god we have you to be productive and on topic. Clearly doing a service to the world, and totally not just shit posting out of boredom.

  • Did you watch the video? There’s no way that it was just “user error”, nobody randomly swerves into a tree when nothing’s there.

    You're assuming that he was paying attention and driving normally. He could have dozed off and pulled the wheel.

    All I'm asking for is evidence to support his claim other than "trust me bro".

    Tesla gives out beta access to users, so I wouldn’t put too much weight on that claimed version they were using.

    As others have pointed out, that version wasn't out to regular people on that day.

    It all points to a guy who crashed his car and has now seen an opportunity months later to get his 15 minutes of fame. Why didn’t he post about this in FEBRUARY WHEN IT HAPPENED? Why did he not reach out to Tesla in the last 3 months? It just doesn’t add up. He has provided zero evidence that FSD caused this.

    Where are you getting February from?
    As far as I know this happened last week. I don't blame the person not wanting to reveal their face.

  • He claimed in the video he was using FSD, but then mainly used autopilot - which was one of the biggest issues people had with his video in the first place. Autopilot is not as good as FSD. We also saw FSD engaged for a brief second before disengaging, either from him turning the wheel or accelerating, or possibly because he activated it 2 seconds before he was about to manually drive through a wall and it recognized that as soon as it came on.

    FSD doesn't require a destination btw.

    Are you suggesting that Autopilot is inherently less safe than FSD?

    It's not "less safe", it's far less advanced and serves a different purpose.

    I genuinely don't understand what FSD has to do with any of it. My car's front collision sensor works regardless of whether cruise control is enabled.

    If I'm understanding your argument correctly, the driver needs to enable a setting first for a Tesla not to plow directly into a wall? I would say that makes it less safe.

  • Where are you getting February from?
    As far as I know this happened last week. I don't blame the person not wanting to reveal their face.

    The guy posting it everywhere all over reddit and twitter himself says it happened in February. People on here, including myself, have linked to his posts saying it happened in February.

    He could simply blur his face…..

  • I genuinely don't understand what FSD has to do with any of it. My car's front collision sensor works regardless of whether cruise control is enabled.

    If I'm understanding your argument correctly, the driver needs to enable a setting first for a Tesla not to plow directly into a wall? I would say that makes it less safe.

    Mark Rober drove a Tesla manually into the fake wall that he made specifically so he could promote his friend’s LiDAR company. He lied and said he had FSD enabled, then changed his story to say only autopilot was on when called out on his lies.

    What car do you have? Are you saying that, just in normal everyday manual driving, your car would automatically stop itself from 60 mph and not hit a wall because of a collision sensor? Collision sensors are for slow-moving things that are like 1 m in front of/behind you.

    I’m saying Rober lied, intentionally and deceptively so. FSD has everything to do with it because he said FSD drove him into the wall, but it wasn’t even enabled - he manually drove straight through the wall.

    Mark Rober drove a Tesla manually into the fake wall that he made specifically so he could promote his friend’s LiDAR company. He lied and said he had FSD enabled, then changed his story to say only autopilot was on when called out on his lies.

    What car do you have? Are you saying that, just in normal everyday manual driving, your car would automatically stop itself from 60 mph and not hit a wall because of a collision sensor? Collision sensors are for slow-moving things that are like 1 m in front of/behind you.

    I’m saying Rober lied, intentionally and deceptively so. FSD has everything to do with it because he said FSD drove him into the wall, but it wasn’t even enabled - he manually drove straight through the wall.

    What car do you have?

    Volkswagen Group vehicle.

    Are you saying that, just in normal everyday manual driving, your car would automatically stop itself from 60 mph and not hit a wall because of a collision sensor?

    My car's AEBS will apply braking, shake the steering wheel, sound a loud alarm and flash the dashboard. I can't say for sure if it applies full braking, or if that only applies at lower speeds.

    Collision sensors are for slow-moving things that are like 1 m in front of/behind you.

    Perhaps I've not described the system accurately, because I'm not referring to parking sensors. My car's owner's manual states that AEBS works at speeds up to 220 km/h, and I've personally experienced it trigger while going over 120 km/h.

    My take on Rober's video is simply that Tesla's automated driver safety systems are sub-par compared to those of other manufacturers. Perhaps somebody could perform another test with FSD enabled, but I personally don't think it's safe to require a driver to first enable a specific mode in order to avoid an accident; at that point they might as well just press the brakes themselves.

  • Well thank god we have you to be productive and on topic. Clearly doing a service to the world, and totally not just shit posting out of boredom.

    Just block this maroon, he's jacking off to consternation instead of working his help desk job

  • Lol, this fascist bootlicker downvoted you instead of spraying more Gish gallop

  • Lol, this fascist bootlicker downvoted you instead of spraying more Gish gallop

    Yeah 😂

  • Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the car’s screen showing FSD wasn’t on, or even autopilot.

    It switched autopilot off when the parking sensors (ultrasonic/radar range finder with a very narrow range) detected the wall.

    If Tesla isn’t doing that to hide deficiencies from NHTSA investigations, I’ll eat my shoe.

    Also, FSD is an app running on the same hardware and camera systems as autopilot, so what would make it better at “seeing” through mist or a reasonable facsimile of the road up ahead?

  • It's not hard to find videos of self-driving Teslas wilding in bus lanes. Check the videos out, then consider:

    "There was an interesting side-note in Tesla’s last earnings call, where they explained the main challenge of releasing Full-Self Driving (supervised!) in China was a quirk of Chinese roads: the bus-only lanes.

    Well, jeez, we have bus-only lanes here in Chicago, too. Like many other American metropolises… including Austin, TX, where Tesla plans to roll out unsupervised autonomous vehicles in a matter of weeks..."

    It's one of those regional differences in driving that make a generalizable self-driving platform an exceedingly tough technical nut to crack... unless you're willing to just plain ignore the local rules.

    Rest of the world's solution: remove Tesla

    American solution: remove bus lanes

  • It’s actually extremely straightforward to mark lanes as “bus-only” when you map out roads

    Yeah... until they change. And municipalities are not known for thoroughly documenting their changes, nor are companies known for keeping their info up to date even when those changes are provided (the sketch below shows why that matters).

    Sooooo .... You people don't have signs? Because my car can read those.
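
    To make that exchange concrete: marking a lane as bus-only in map data really is just a per-lane attribute, and the catch really is keeping it current. A minimal sketch in Python, with made-up field names (lane_use, last_verified) rather than anyone's actual schema:

        from dataclasses import dataclass
        from datetime import date, timedelta

        @dataclass
        class MappedLane:
            road_id: str
            index: int            # 0 = curbside lane
            lane_use: str         # "general", "bus_only", "bike", ...
            last_verified: date   # when this attribute was last confirmed on the ground

        def routable_by_car(lane: MappedLane, max_staleness_days: int = 180) -> bool:
            """Should a private car be routed through this lane?

            If the map attribute has not been confirmed recently, treat it as
            unreliable and fall back to on-board sign/marking detection.
            """
            stale = date.today() - lane.last_verified > timedelta(days=max_staleness_days)
            if stale:
                return False  # defer to live perception instead of the map
            return lane.lane_use == "general"

    Storing the tag is the easy half; the failure mode is a lane re-striped as bus-only last week whose record was last verified a month ago, which sails through the staleness check and still gets routed, and that is exactly where reading the actual signs has to take over.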

  • What car do you have?

    Volkswagen Group vehicle.

    Are you saying that, just in normal everyday manual driving, your car would automatically stop itself from 60 mph and not hit a wall because of a collision sensor?

    My car's AEBS will apply braking, shake the steering wheel, sound a loud alarm and flash the dashboard. I can't say for sure if it applies full braking, or if that only applies at lower speeds.

    Collision sensors are for slow-moving things that are like 1 m in front of/behind you.

    Perhaps I've not described the system accurately, because I'm not referring to parking sensors. My car's owner's manual states that AEBS works at speeds up to 220 km/h, and I've personally experienced it trigger while going over 120 km/h.

    My take on Rober's video is simply that Tesla's automated driver safety systems are sub-par compared to those of other manufacturers. Perhaps somebody could perform another test with FSD enabled, but I personally don't think it's safe to require a driver to first enable a specific mode in order to avoid an accident; at that point they might as well just press the brakes themselves.

    I’m not aware of any cars that will use radar, cameras, or lidar all the time and automatically stop your car to a complete standstill while you’re manually operating it. Yours does this? It physically won’t let you run into something, ever?

  • It switched autopilot off when the parking sensors (ultrasonic/radar range finder with a very narrow range) detected the wall.

    If Tesla isn’t doing that to hide deficiencies from NHTSA investigations, I’ll eat my shoe.

    Also, FSD is an app running on the same hardware and camera systems as autopilot, so what would make it better at “seeing” through mist or a reasonable facsimile of the road up ahead?

    No, it didn’t switch it off because of that.

    FSD and autopilot do different things using the same data. It’s a fact that they behave differently.

  • The guy posting it everywhere all over reddit and twitter himself says it happened in February. People on here, including myself, have linked to his posts saying it happened in February.

    He could simply blur his face…..

    Well now you've made me go read all their Reddit comments. It happened February 27, and they said it took 2 months for their insurance to get processed. It doesn't seem unreasonable to me to wait until they've got everything sorted out before posting.

    Based on pictures of the dash cam files they posted, I don't think they have internal video; they would have had to turn it on beforehand, I think? Tesla mostly talks about the Sentry Mode internal camera, so I don't actually know if it would record by default.
    But again, I don't think it's reasonable to expect them to edit a video with face blurring and everything; they don't owe the Internet anything, and not everyone even knows how.

    I'll be honest, I don't know for sure they're being truthful, but I'm certainly a lot less skeptical about it than you. I think enough of their story checks out that I'm not going to discount it based on lack of additional info alone. I haven't seen them contradict themselves anywhere.

  • I’m not aware of any cars that will use radar, cameras, or lidar all the time and automatically stop your car to a complete standstill while you’re manually operating it. Yours does this? It physically won’t let you run into something, ever?

    Yes, it does. It performs speed sign recognition and lane departure warning continuously as well, but will only perform steering correction above a minimum speed (I believe 50 km/h) and adjust the speed while adaptive cruise control is switched on.
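
    For what it's worth, the always-on behaviour described here is typically a time-to-collision check that runs regardless of whether cruise control or lane keeping is engaged. A rough, generic sketch in Python (the thresholds are illustrative guesses, not any manufacturer's actual calibration):

        # Generic forward-collision logic: evaluated continuously,
        # independent of cruise-control or lane-keeping state.
        WARN_TTC_S = 2.5     # illustrative warning threshold, in seconds
        BRAKE_TTC_S = 1.2    # illustrative automatic-braking threshold

        def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
            """Seconds until impact if neither party changes speed."""
            if closing_speed_mps <= 0:   # not closing on the obstacle
                return float("inf")
            return gap_m / closing_speed_mps

        def fcw_aeb_step(gap_m: float, ego_mps: float, obstacle_mps: float) -> str:
            ttc = time_to_collision(gap_m, ego_mps - obstacle_mps)
            if ttc < BRAKE_TTC_S:
                return "full_brake"   # AEB: brake without driver input
            if ttc < WARN_TTC_S:
                return "warn"         # FCW: alarm, flashing dash, wheel shake
            return "none"

    At 120 km/h (about 33 m/s) toward a stationary obstacle 50 m away, TTC is roughly 1.5 s, already inside the warning band and close to the braking threshold, so a highway-speed trigger like the one described above is plausible as long as the sensors can actually see the obstacle.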

  • Well now you've made me go read all their Reddit comments. It happened February 27, and they said it took 2 months for their insurance to get processed. It doesn't seem unreasonable to me to wait until they've got everything sorted out before posting.

    Based on pictures of the dash cam files they posted, I don't think they have internal video; they would have had to turn it on beforehand, I think? Tesla mostly talks about the Sentry Mode internal camera, so I don't actually know if it would record by default.
    But again, I don't think it's reasonable to expect them to edit a video with face blurring and everything; they don't owe the Internet anything, and not everyone even knows how.

    I'll be honest, I don't know for sure they're being truthful, but I'm certainly a lot less skeptical about it than you. I think enough of their story checks out that I'm not going to discount it based on lack of additional info alone. I haven't seen them contradict themselves anywhere.

    What about the lies about the version of FSD?

    The version they claim to have had at the time wasn’t actually available at that time.

    They have provided literally zero evidence that FSD was on. None whatsoever.

  • What about the lies about the version of FSD?

    The version they claim to have had at the time wasn’t actually available at that time.

    They have provided literally zero evidence that FSD was on. None whatsoever.

    There's plenty of evidence of other people with the same v13.2.8 version before their crash on Feb 27, for example:

    No, it didn’t switch it off because of that.

    FSD and autopilot do different things using the same data. It’s a fact that they behave differently.

    From NHTSA IN 2022:

    "The agency's analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the report reads.

    https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF

    How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?

  • From NHTSA IN 2022:

    "The agency's analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the report reads.

    https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF

    How else would they suddenly know to hit the brakes and switch off ~1 second before impact? Coincidence, or negligent coding?

    So they tried to hide it from them by explicitly logging when it switched on and off in the data that they report to them? Huh?
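
    On the reporting point: as I understand NHTSA's Standing General Order, a crash is attributed to the driver-assistance system if it was engaged at any point within roughly 30 seconds before impact, so disengaging one second out does not move a crash out of the reported data. A toy illustration in Python (the ~30 s window is my reading of the rule; the function and field names are made up):

        # Toy attribution check; the ~30 s window reflects my reading of
        # NHTSA's Standing General Order, and these names are made up.
        ENGAGEMENT_WINDOW_S = 30.0

        def counts_as_adas_crash(impact_t_s: float, last_engaged_t_s: float) -> bool:
            """True if the system was active within the reporting window
            before impact, even if it disengaged before the crash."""
            return impact_t_s - last_engaged_t_s <= ENGAGEMENT_WINDOW_S

        # A system that hands back control 1 s before the crash still counts:
        print(counts_as_adas_crash(impact_t_s=100.0, last_engaged_t_s=99.0))  # True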