Tesla Reports Drop in Self-Driving Safety After Introducing “End-to-End Neural Networks”
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
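For anyone wondering how a "percent worse" figure falls out of Tesla's Vehicle Safety Report metric (miles driven per crash): the crash *rate* is the reciprocal, so a small drop in miles-per-crash shows up as a small rise in the rate. A minimal sketch; the miles-per-crash numbers below are made up for illustration, not Tesla's actual data:

```python
# Hypothetical illustration of deriving a "percent worse per mile" figure
# from a miles-driven-per-crash metric. Numbers are invented, NOT Tesla's data.

def percent_worse(miles_per_crash_before: float, miles_per_crash_after: float) -> float:
    """Crash rate is 1 / miles-per-crash; return its relative increase in percent."""
    rate_before = 1.0 / miles_per_crash_before
    rate_after = 1.0 / miles_per_crash_after
    return (rate_after - rate_before) / rate_before * 100.0

# Example: if miles per crash fell from 7.50M to 7.32M year over year,
# the crash rate rose by roughly 2.5%.
print(round(percent_worse(7.50e6, 7.32e6), 1))  # -> 2.5
```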
This is actually in line with AI I've used... for some reason it just turns to shit after a while, and I'm not sure why
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
Ha, funny. I highly suspected this could happen when I heard they have quotas for how many changes the people training the AI make to its behavior. That's a recipe for flooding the system with bad data.
No AI can be better than the info it is given, and if X is any indicator, it's just about a certainty that Tesla's AI will rot in misinformation.
-
This is taking “testing in production” to a whole new level. How did this get past the regulations?
On second thoughts, does any country have concrete regulations for self driving vehicles? I am curious what they would be, and how they would quantify the thresholds since no self driving solution would be 100% accident-free.
Well, we know how he got past regulations
-
This is taking “testing in production” to a whole new level. How did this get past the regulations?
On second thoughts, does any country have concrete regulations for self driving vehicles? I am curious what they would be, and how they would quantify the thresholds since no self driving solution would be 100% accident-free.
Switzerland:
Switzerland’s Regulatory Framework for Automated and Driverless Vehicles: Implementation on 1 March 2025
The Swiss government (Bundesrat) has adopted the Ordinance on Automated Driving (Verordnung über das automatisierte Fahren, VAF), establishing a robust regulatory framework to implement amendments to the Road Traffic Act (Strassenverkehrsgesetz, SVG) approved by Parliament in 2023. This forward-looking legislation enables transformative transportation technologies, including hands-free driving with autopilot systems on highways, driverless vehicles on pre-defined routes, and fully automated parking solutions. These advancements promise to enhance mobility by improving road safety, reducing traffic congestion, and optimising the use of parking space and land.
SAAM (Swiss Association for Autonomous Mobility) (www.saam.swiss)
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
“It will be fine,” they said. “It will get better,” they said.
Somehow it gets even worse.
-
This is taking “testing in production” to a whole new level. How did this get past the regulations?
On second thoughts, does any country have concrete regulations for self driving vehicles? I am curious what they would be, and how they would quantify the thresholds since no self driving solution would be 100% accident-free.
Tesla's "FSD" is not legal in the EU, AFAIK.
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
Replacing code with networks has the potential to be much faster with similar quality. The idea is good.
-
This is taking “testing in production” to a whole new level. How did this get past the regulations?
On second thoughts, does any country have concrete regulations for self driving vehicles? I am curious what they would be, and how they would quantify the thresholds since no self driving solution would be 100% accident-free.
How did this get past the regulations?
It's almost like DOGE specifically dismantled the parts of the government that were investigating and attempting to regulate Musk's companies.
-
Replacing code with networks has the potential to be much faster with similar quality. The idea is good.
At least when it's Tesla's code getting replaced
-
Move fast and break things - literal edition
The New Self-Driving Package presented by Fred Durst
-
At least when it's Tesla's code getting replaced
No, in general. You want to do as little pre- and post-processing as possible for neural networks.
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
Could this be attributed to the driver mix changing?
It's quite possible Tesla drivers are worse in 2025 than they were in 2024
-
The New Self-Driving Package presented by Fred Durst
Apt since Durst is such a shitbag
-
Tesla's "FSD" is not legal in the EU, AFAIK.
The Cybertruck isn't road legal in the UK. And even if it were, it'd need a special license due to its weight.
-
Could this be attributed to the driver mix changing?
It's quite possible Tesla drivers are worse in 2025 than they were in 2024
The politically motivated sell-off probably has something to do with it
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
End-to-end ML can be much better than hybrid (or fully rules-based) systems. But there's no guarantee and you have to actually measure the difference to be sure.
For safety-critical systems, I would also not want to commit fully to an e2e system, because the worse explainability makes it much harder to be confident that there is no strange failure mode that you haven't spotted but that may be, or may become, unacceptably common. In that case, you would want to be able to revert to a rules-based fallback that may once have looked worse-performing but which has turned out to be better. That means you can't just delete and stop maintaining that rules-based code if you have any kind of long-term thinking. Hmm.
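The fallback idea above can be sketched as a simple wrapper: run the end-to-end model, sanity-check its output against hard limits, and revert to the maintained rules-based controller when the check fails. This is purely illustrative; none of the names or thresholds come from any real driving stack:

```python
# Hypothetical sketch of a rules-based fallback around an end-to-end controller.
# Names and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class Command:
    steering: float   # radians, positive = left
    accel: float      # m/s^2, negative = braking

MAX_STEER = 0.5       # hard envelope on what the e2e model may request
MAX_ACCEL = 3.0
MAX_BRAKE = -8.0

def plausible(cmd: Command) -> bool:
    """Reject physically implausible commands before they reach the actuators."""
    return abs(cmd.steering) <= MAX_STEER and MAX_BRAKE <= cmd.accel <= MAX_ACCEL

def drive(e2e_model, rules_controller, observation) -> Command:
    """Prefer the end-to-end model, but fall back when its output fails the check."""
    cmd = e2e_model(observation)
    if plausible(cmd):
        return cmd
    return rules_controller(observation)  # the legacy path you kept maintaining
```

The point of the sketch: the rules-based path only helps if it is still tested and maintained, which is exactly the long-term cost the comment is describing.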
-
The politically motivated sell-off probably has something to do with it
Yeah exactly!
-
End-to-end ML can be much better than hybrid (or fully rules-based) systems. But there's no guarantee and you have to actually measure the difference to be sure.
For safety-critical systems, I would also not want to commit fully to an e2e system, because the worse explainability makes it much harder to be confident that there is no strange failure mode that you haven't spotted but that may be, or may become, unacceptably common. In that case, you would want to be able to revert to a rules-based fallback that may once have looked worse-performing but which has turned out to be better. That means you can't just delete and stop maintaining that rules-based code if you have any kind of long-term thinking. Hmm.
yeah, I wanna see what the fuck metrics made them think this was a good idea. What's their mean average precision? Did they measure recall@1 for humans on the road?
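For anyone not fluent in detection metrics, recall@k is roughly "what fraction of the ground truth shows up in the model's top-k guesses." A toy version, just to pin down the metric being joked about (the frames and labels below are invented):

```python
# Toy recall@k: fraction of examples whose ground-truth label appears among
# the model's top-k ranked predictions. Purely illustrative.

def recall_at_k(ranked_predictions, ground_truth, k=1):
    """ranked_predictions: list of best-first label lists; ground_truth: list of labels."""
    hits = sum(1 for preds, truth in zip(ranked_predictions, ground_truth)
               if truth in preds[:k])
    return hits / len(ground_truth)

# Three frames; the model's top guess is right in two of them.
preds = [["pedestrian", "pole"], ["shadow", "pedestrian"], ["pedestrian"]]
truth = ["pedestrian", "pedestrian", "pedestrian"]
print(recall_at_k(preds, truth, k=1))  # -> 2/3
```

Mean average precision (mAP) is the heavier-duty cousin, averaging precision over ranks and classes; the joke lands either way.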
-
TL;DR: Tesla self-driving tech is becoming less safe per mile, according to Tesla’s own data.
Q1 2025 was 2.5% worse than Q1 2024. Q2 2025 was 2.8% worse than Q2 2024.
Teslas are already the least safe cars sold in America!
-
Teslas are already the least safe cars sold in America!
If you ask Tesla drivers, they often purchased the car because someone told them Teslas are safe, despite them being statistically the most lethal car you could drive in America.
We live in a dystopian information environment.