Can Tesla's Self-Driving Software Handle Bus-Only Lanes? Not Reliably, No.
-
It's actually extremely straightforward to mark lanes as "bus-only" when you map out roads, but that's assuming you're not being micromanaged by a moronic egomaniac.
It’s actually extremely straightforward to mark lanes as “bus-only” when you map out roads
Yeah... until they change. And municipalities are not known for thoroughly documenting their changes, nor are companies known for keeping their map data up to date even when those changes are provided.
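For context on what "marking a lane bus-only" actually looks like in map data: OpenStreetMap-style maps tag lane access per lane with a `|`-separated list (e.g. `bus:lanes=no|no|designated`). The tag name follows OSM conventions; the parsing code below is just my own illustration, not anyone's production routing code:

```python
def bus_only_lanes(tags: dict) -> list[int]:
    """Return indices of lanes reserved for buses, from OSM-style way tags.

    OSM encodes per-lane access as a '|'-separated list, lane 0 leftmost;
    'designated' means the lane is signed as bus-only.
    """
    per_lane = tags.get("bus:lanes", "")
    if not per_lane:
        return []
    return [i for i, v in enumerate(per_lane.split("|")) if v == "designated"]

# A three-lane road where only the rightmost lane is bus-only:
way_tags = {"lanes": "3", "bus:lanes": "no|no|designated"}
print(bus_only_lanes(way_tags))  # -> [2]
```

Which is the point above: the tag itself is trivial to write, but it's only as good as whoever updates it when the city repaints the lane.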
-
Ok. But they're talking about Tesla fsd ai.
It would have killed you.
Well yeah, because Tesla don’t make planes so their FSD wouldn’t know what to do when in control of a plane.
-
It can't stop when running directly at a wall but you expect it to handle special lanes?
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the cars screen showing FSD wasn’t on, or even autopilot.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the cars screen showing FSD wasn’t on, or even autopilot.
Is the taste of shoe leather getting any better?
-
Fuck me man over 800 comments in less than a month?
Just put the phone down and go get some air
-
Fuck me man over 800 comments in less than a month?
Just put the phone down and go get some air
I work on a computer 8-10 hours a day. I can multitask very well. 800 comments in a month, most of which take like 10 seconds to type and hit post, is not some insanely difficult challenge.
How about instead of trying to shame people - while posting on the same site yourself, no less, lol - you try posting something on-topic or productive? Or maybe use that time for something better?
-
Well yeah, because Tesla don’t make planes so their FSD wouldn’t know what to do when in control of a plane.
It would also have killed you in one of their cars, probably.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the cars screen showing FSD wasn’t on, or even autopilot.
He didn't use FSD because he was on a track and FSD requires a destination. It was using Autopilot, according to his statement. Are you suggesting that Autopilot is inherently less safe than FSD? I'm confused about your position on this.
-
Rober got absolutely destroyed, and rightly so, for that BS video and “test”. He didn’t even use FSD - he just drove straight into the fake wall and then claimed FSD did it. We could literally see the cars screen showing FSD wasn’t on, or even autopilot.
FSD wouldn't have done any better, it can't even figure out shadows on the road properly as seen in this crash 3 days ago:
Video: Tesla Full-Self Driving Veers Off Road, Hits Tree, and Flips Car for No Obvious Reason (No Serious Injuries, but Scary) - FuelArc News
A 2025 Tesla Model 3 drives off of a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media. The problem? The car was driving itself in Full Self-Driving. The subtitle from the […]
FuelArc News (fuelarc.com)
-
FSD wouldn't have done any better, it can't even figure out shadows on the road properly as seen in this crash 3 days ago:
Video: Tesla Full-Self Driving Veers Off Road, Hits Tree, and Flips Car for No Obvious Reason (No Serious Injuries, but Scary) - FuelArc News
A 2025 Tesla Model 3 drives off of a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media. The problem? The car was driving itself in Full Self-Driving. The subtitle from the […]
FuelArc News (fuelarc.com)
That guy is lying through his teeth lol.
- The crash happened in February, within 2 weeks of him buying the car. He never thought to bring it up with Tesla until just now, apparently. Try and make that make sense.
- There's no proof that he was using FSD, and that this wasn't driver error. He has posted footage of every camera except for the interior camera which would show if he was driving or not. I wonder why?
- He claims that he had a version of FSD that wasn't released to the public at the time.
- Tesla have a program you can download to get all your telemetry data, showing exactly when FSD is enabled and disabled. He never tried to get that.
- Again - he never even contacted Tesla to ask them to investigate.
- 3 months later he's posting his story to 10+ Reddit subs, tagging Mark Rober and every anti-Tesla person on X, and promoting his X account on Reddit.
He's fame chasing. He saw an opportunity to turn his user error crash into 15 mins of fame and money.
-
He didn't use FSD because he was on a track and FSD requires a destination. It was using Autopilot, according to his statement. Are you suggesting that Autopilot is inherently less safe than FSD? I'm confused about your position on this.
He claimed in the video he was using FSD, but then mainly used Autopilot - which was one of the biggest issues people had with his video in the first place. Autopilot is not as good as FSD. We also saw FSD engaged for a brief second before disengaging (from what looked like him either turning the wheel or accelerating - or possibly because he activated it two seconds before manually driving through the wall, and it disengaged the moment it was turned on).
FSD doesn't require a destination btw.
Are you suggesting that Autopilot is inherently less safe than FSD?
It's not "less safe", it's far less advanced and serves a different purpose.
-
That guy is lying through his teeth lol.
- The crash happened in February, within 2 weeks of him buying the car. He never thought to bring it up with Tesla until just now, apparently. Try and make that make sense.
- There's no proof that he was using FSD, and that this wasn't driver error. He has posted footage of every camera except for the interior camera which would show if he was driving or not. I wonder why?
- He claims that he had a version of FSD that wasn't released to the public at the time.
- Tesla have a program you can download to get all your telemetry data, showing exactly when FSD is enabled and disabled. He never tried to get that.
- Again - he never even contacted Tesla to ask them to investigate.
- 3 months later he's posting his story to 10+ Reddit subs, tagging Mark Rober and every anti-Tesla person on X, and promoting his X account on Reddit.
He's fame chasing. He saw an opportunity to turn his user error crash into 15 mins of fame and money.
Did you watch the video? There's no way that it was just "user error", nobody randomly swerves into a tree when nothing's there.
Maybe you're implying it was insurance fraud?
Tesla gives out beta access to users, so I wouldn't put too much weight on that claimed version they were using.
-
Did you watch the video? There's no way that it was just "user error", nobody randomly swerves into a tree when nothing's there.
Maybe you're implying it was insurance fraud?
Tesla gives out beta access to users, so I wouldn't put too much weight on that claimed version they were using.
Did you watch the video? There’s no way that it was just “user error”, nobody randomly swerves into a tree when nothing’s there.
You're assuming that he was paying attention and driving normally. He could have dozed off and pulled the wheel.
All I'm asking for is evidence to support his claim other than "trust me bro".
Tesla gives out beta access to users, so I wouldn’t put too much weight on that claimed version they were using.
As others have pointed out, that version wasn't out to regular people on that day.
It all points to a guy who crashed his car and has now seen an opportunity months later to get his 15 minutes of fame. Why didn't he post about this in FEBRUARY WHEN IT HAPPENED? Why did he not reach out to Tesla in the last 3 months? It just doesn't add up. He has provided zero evidence that FSD caused this.
-
It would also have killed you in one of their cars probably
How many people have died from Tesla's FSD making mistakes?
-
How many people have died from Tesla's FSD making mistakes?
At least 54 https://www.tesladeaths.com/
-
At least 54 https://www.tesladeaths.com/
Funny, because one random one of those that I picked:
was marked as a "Verified Tesla Autopilot Death"... when all it actually is is a lawsuit that was filed claiming Autopilot resulted in the death. Funnily enough, almost every one that I look at is not actually "verified". This one:
Never even mentions autopilot or FSD.
The fine print down the bottom of the page basically says "yeah nah, we don't actually know if Autopilot or FSD played any part at all in these accidents or deaths, but we don't care, because the NHTSA says the cars have Autopilot" lol. The "SGO" they put next to ones that have "confirmed autopilot death" as proof is rubbish, because the NHTSA, who collect the SGO data, say this about the ADS/ADAS that the spreadsheet claims caused the crash:
It is important to note that these crashes are categorized based on what driving automation system was reported as being equipped on the vehicle, not on what system was reported to be engaged at the time of the incident.
Source: https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting
So basically the people making this site have gone "OMG, the SGO categorized this as an Automated Driving System crash!!!! That means Autopilot/FSD caused the crash!!!" before reading what that category actually means lol. Either that, or they do know (most likely) and have a bit of an agenda.
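To put the equipped-vs-engaged distinction in concrete terms (the field names below are made up for illustration, they are not the real SGO schema): counting every crash where the system was merely *installed* will always overcount crashes where it was actually *driving*:

```python
# Hypothetical crash records - field names are illustrative, not the real SGO schema.
crashes = [
    {"id": 1, "adas_equipped": True, "adas_engaged": True},
    {"id": 2, "adas_equipped": True, "adas_engaged": False},  # driver was in control
    {"id": 3, "adas_equipped": True, "adas_engaged": False},  # driver was in control
]

equipped_count = sum(c["adas_equipped"] for c in crashes)
engaged_count = sum(c["adas_engaged"] for c in crashes)

print(equipped_count)  # 3 - what the SGO category reflects
print(engaged_count)   # 1 - what you'd need before blaming the software
```

Every Tesla ships with Autopilot, so filtering on "equipped" just counts Tesla crashes, full stop.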
BTW here is another lawsuit over a death where Autopilot was claimed to have caused it:
What was the truth? The driver was drunk and didn't even have Autopilot activated. Tesla won the court case. It's almost like people lie when faced with the consequences of their own, or their loved ones', actions and want someone else to blame.
-
Funny, because one random one of those that I picked:
was marked as a "Verified Tesla Autopilot Death"... when all it actually is is a lawsuit that was filed claiming Autopilot resulted in the death. Funnily enough, almost every one that I look at is not actually "verified". This one:
Never even mentions autopilot or FSD.
The fine print down the bottom of the page basically says "yeah nah, we don't actually know if Autopilot or FSD played any part at all in these accidents or deaths, but we don't care, because the NHTSA says the cars have Autopilot" lol. The "SGO" they put next to ones that have "confirmed autopilot death" as proof is rubbish, because the NHTSA, who collect the SGO data, say this about the ADS/ADAS that the spreadsheet claims caused the crash:
It is important to note that these crashes are categorized based on what driving automation system was reported as being equipped on the vehicle, not on what system was reported to be engaged at the time of the incident.
Source: https://www.nhtsa.gov/laws-regulations/standing-general-order-crash-reporting
So basically the people making this site have gone "OMG, the SGO categorized this as an Automated Driving System crash!!!! That means Autopilot/FSD caused the crash!!!" before reading what that category actually means lol. Either that, or they do know (most likely) and have a bit of an agenda.
BTW here is another lawsuit over a death where Autopilot was claimed to have caused it:
What was the truth? The driver was drunk and didn't even have Autopilot activated. Tesla won the court case. It's almost like people lie when faced with the consequences of their own, or their loved ones', actions and want someone else to blame.
You really believe FSD hasn't killed anybody yet?
-
You really believe FSD hasn't killed anybody yet?
All I'm saying is that none have been proven in a court of law to have actually killed anyone, and that your claim of 54 FSD/Autopilot-caused deaths is bogus for all the reasons I listed.
Even if FSD has killed people - which it likely has - is it at a higher rate than the rate at which people in non-FSD cars crash and kill people? Given that Teslas have been among the highest-selling cars in the world for the last, what, 10 years, and how many people use Autopilot or FSD multiple times a day, even if 54 people had been killed by it, that would probably still be no more than the number of deaths caused by drivers of any other car model with the same sales over the same timeframe.
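That rate argument can be made concrete with a back-of-the-envelope calculation. Every number below is a placeholder purely for illustration - NOT real data - the point is only that a raw death count means nothing without a mileage denominator:

```python
def deaths_per_billion_miles(deaths: int, miles: float) -> float:
    """Fatality rate normalized by exposure: deaths per 1e9 miles driven."""
    return deaths / (miles / 1e9)

# Placeholder inputs for illustration only - not real statistics.
fsd_rate = deaths_per_billion_miles(deaths=54, miles=5.4e9)
human_rate = deaths_per_billion_miles(deaths=40_000, miles=3.2e12)

print(fsd_rate)    # 10.0
print(human_rate)  # 12.5
```

With these made-up inputs the two rates come out comparable; the real argument turns entirely on the actual mileage figures, which nobody in this thread has produced.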
-
All I'm saying is that none have been proven in a court of law to have actually killed anyone, and that your claim of 54 FSD/Autopilot-caused deaths is bogus for all the reasons I listed.
Even if FSD has killed people - which it likely has - is it at a higher rate than the rate at which people in non-FSD cars crash and kill people? Given that Teslas have been among the highest-selling cars in the world for the last, what, 10 years, and how many people use Autopilot or FSD multiple times a day, even if 54 people had been killed by it, that would probably still be no more than the number of deaths caused by drivers of any other car model with the same sales over the same timeframe.
Just How Safe Is Tesla’s Full Self-Driving Mode?
A recent crash video calls into question the safety of Tesla's Full Self-Driving System after software malfunctions.
Forbes (www.forbes.com)
https://www.cnn.com/2024/10/18/business/tesla-fsd-federal-investigation
It's glorified cruise control. Don't defend their false advertising.