Tesla Robotaxi Freaks Out and Drives into Oncoming Traffic on First Day
-
I am entirely opposed to driving algorithms. Autopilot on planes works very well because it is used in open sky and does not have to make major decisions about moving in close proximity to other planes and obstacles. It's almost entirely mathematical, and even then, in specific circumstances it is designed to disengage and put control back in the hands of a human.
Cars do not have this luxury and operate entirely in close proximity to other vehicles and obstacles. Very little of the act of driving a car is math. It's almost entirely decision making: fast, instinctive responses to subtle changes in the environment, and pattern recognition that human brains are still better at than algorithms.
To me this technology perfectly encapsulates the difficulty of making algorithms that mimic human behavior. The last 10% of optimization needed to reach parity with humans requires exponentially more energy and research than the first 90% did. And 90% of the performance of a human is entirely insufficient where life and death are concerned.
That investment should be going to public transport systems instead. They are more cost efficient, more accessible, more fuel- and resource-efficient, and far, far safer than cars could ever be, even with all human drivers. This is a colossal waste of energy, time, and money on a product that will not be on par with human performance for a long time. Those resources could be making our world more accessible for everyone; instead they're making it more accessible for no one and making the roads significantly more dangerous. Capitalism will be the end of us all if we let it. Sorry that train and bus infrastructure isn't "flashy" enough for you. You clearly haven't seen the public transport systems in Beijing. The technology we have here is decades behind and so underfunded it's infuriating.
I always have the same thought when I see self driving taxi news.
"Americans will go bankrupt trying to prop up the auto/gas industries rather than simply building a train".
And it's true. So much money is being burned on a subpar and dangerous product. Yet we've just cut and cancelled extremely beneficial high-speed rail projects that were overwhelmingly voted for by the people.
-
Public transport systems are just part of a mobility solution; it isn't viable to have them everywhere. Heck, even here in The Netherlands, a country the size of a postage stamp, public transport doesn't work outside of the major cities. So basically, outside of the cities, we also rely on cars.
Therefore, I do believe there will be a place for autonomous driving in the future of mobility, and that it has the potential to reduce the number of accidents, traffic jams, and parking problems while increasing average travel speeds.
The only thing that has me a bit worried is Tesla's approach to autonomous driving, which relies fully on the camera system. Somehow, Musk believes a camera system is superior to human vision, while it's not. I drive a Tesla (yeah, I know), and if the conditions aren't perfect, the car disables "safety" features like lane assist, for instance when it's raining heavily or when the sun is shining directly into the camera lenses. This must be a key reason for choosing Austin for the demo/rollout.
Meanwhile, we see what other manufacturers use and how they are progressing. For instance, BMW and Mercedes are doing well with their systems, which are a blend of cameras and sensors. To me, that does seem like the way to go to introduce autonomous driving safely.
I believe Austin was chosen because they're fairly lax about the regulations and safety requirements.
Waymo already got the deal in Cali, and Cali seems much more strict. Austin is offering them faster time to market at the cost of civilian safety.
-
A man who can’t launch a rocket to save his life is also incompetent at making self driving cars? His mediocrity knows no bounds.
To be fair, Musk only has money and doesn't do shit at either company.
-
Which brings up an interesting question: when is a driverless car 'parked' vs. 'stopped'?
When the engine is off?
Of course, how do you tell that with an electric car?
-
It already jumped up about 10% on Monday simply because the service launched. Even if the service crashes and burns, they'll jump to the next hype topic like robots or AI or whatever, and the stock price will stay up.
And it's already fallen back down about 30% of those gains. Hype causes spikes... that's nothing new.
-
You seem to have missed the point. Whether or not you think that would be an easy job, the whole reason you'd be there is to be the one that takes all the blame when the autopilot kills someone. It will be your name, your face, every single record of your past mistakes getting blasted on the news and in court because Elon's shitty vanity project finally killed a real child instead of a test dummy. You'll be the one having to explain to a grieving family just how hard it is to actually pay complete attention every moment of every day, when all you've had to do before is just sit there.
How about you pay attention and PREVENT the autopilot from killing someone? Like it's your job to do?
-
So, Tesla Robotaxis drive like a slightly drunk and confused tourist with asshole driving etiquette.
Those right turns on red were like, "oh you get to go? That's permission for me to go too!"
I know many people who believe that "right on red" means they have the right of way to make the turn and don't have to stop first or yield to traffic.
-
To be fair, Musk only has money and doesn't do shit at either company.
It's hilarious to me that Musk claims to work 100 hours a week when he's the CEO of five companies. Even if the claim were true (and of course it isn't), it means being the CEO of one of his companies is a 20-hour-a-week job at best.
-
I know many people who believe that "right on red" means they have the right of way to make the turn and don't have to stop first or yield to traffic.
I know many people... fucking morons
-
To be fair Musk only has money and doesnt Do shit at either Company
He meddles. That much is apparent. The Cybertruck is obviously a top-down design, as evidenced by the numerous atrocious design compromises the engineers had to make just to make it real: from the glued-on "exoskeleton" to the hollow ALUMINUM frame to the complete lack of physical controls to the default failure state turning it into a coffin to the lack of waterproofing, etc.
-
Still work to be done: it uses the blinkers.
At least they were used incorrectly, so it's just as unpredictable.
-
That wasn’t FSD. In the crash report it shows FSD wasn’t enabled. The driver applied a torque to the steering wheel and disengaged it. They were probably reaching into the back seat while “supervising”.
I covered that crash.
FSD is never enabled at the moment of impact, because FSD shuts off less than a second before impact, so that Tesla's lawyers and most loyal fans can make exactly the arguments you are making. Torque would be applied to the steering wheel when any vehicle departs the roadway, as the driver is thrown around like a ragdoll as they clutch the wheel. Depending on the car, torque can also be applied externally to the tires by rough terrain to shift the steering wheel.
No evidence to suggest the driver was distracted. Prove me wrong if you have that evidence.
Also welcome to the platform, new user!
-
I covered that crash.
FSD is never enabled at the moment of impact, because FSD shuts off less than a second before impact, so that Tesla's lawyers and most loyal fans can make exactly the arguments you are making. Torque would be applied to the steering wheel when any vehicle departs the roadway, as the driver is thrown around like a ragdoll as they clutch the wheel. Depending on the car, torque can also be applied externally to the tires by rough terrain to shift the steering wheel.
No evidence to suggest the driver was distracted. Prove me wrong if you have that evidence.
Also welcome to the platform, new user!
Tesla counts any crash within 5 seconds of FSD disengagement as an FSD crash. Where is the cabin camera footage of the driver not being distracted?
Here is a video that goes over it: https://youtube.com/watch?v=JoXAUfF029I
Thanks for the welcome, but I'm not new, just a lemm.ee user.
-
That wasn’t FSD. In the crash report it shows FSD wasn’t enabled. The driver applied a torque to the steering wheel and disengaged it. They were probably reaching into the back seat while “supervising”.
You weren't the user who posted that video, but you seem to be quite knowledgeable in this specific case...
Can you link that crash report? Or can you cite some confirmed details about the incident?
-
You weren't the user who posted that video, but you seem to be quite knowledgeable in this specific case...
Can you link that crash report? Or can you cite some confirmed details about the incident?
See this video: https://youtube.com/watch?v=JoXAUfF029I
-
Ah, OK.
I agree with accountability, but not with the point system. That's almost like a "three strikes" rule for drunk drivers.
That's not really accountability, that's handing out free passes.
That's almost like a "three strikes" rule for drunk drivers.
Oh man, that would be amazing. If after 3 strikes, all drunk driving could be eliminated... If only we could be so lucky.
He's not talking about a per-vehicle points system; he's talking about a global points system for Tesla Inc. If, after a few incidents, Tesla FSD essentially had its license revoked across the whole fleet, that's pretty strict accountability, I'd say. That's definitely not handing out free passes; it's more like you get a few warnings and a chance to fix issues before the entire program is ended nationwide.
-
How about you pay attention and PREVENT the autopilot from killing someone? Like it's your job to do?
This is sarcasm, right?
-
When the engine is off?
Of course, how do you tell that with an electric car?
When the motor drivers are energized?
-
Important feedback for the passenger to ensure the car is actually following the rules. I would freak out at a corner if I couldn't tell the car was signaling.
The rider shouldn't have to care.
Naturally, simply being in a "self-driving" Tesla is reason enough to worry.
-
A good example is ADHD. I have severe ADHD, so I take meds to manage it. If I am driving an automatic car on cruise control, I find it very difficult to maintain long-term, high-intensity concentration. The solution for me is to drive a manual. The constant involvement of maintaining speed, revs, gear ratio, and so on means I can pay attention much more easily. Add to that thinking about hypermiling and defensive driving, and I have become a very safe driver, putting about 25-30 thousand kms on my car each year for over a decade without so much as a fender bender. In an automatic I was always tense, forcing focus on the road, and honestly it hurt my neck and shoulders because of the tension. In my zippy little manual I have no trouble driving at all.
Are you me? I love weaving through traffic as fast as I can... in a video game (like Motor Town behind the wheel). In real life I drive very safely, and it's boring af for my ADHD, so I do things like try to hit the apex of turns just perfectly, as if I were driving at the limit, while in reality driving at a normal speed.
Part of living with severe ADHD is that you don't get breaks from having to play these games to survive everyday life; as you say, it's a stressful reality partly because of this. You brought up a great point too, one both of us know: when our focus is on something and activated, we can perform at a high level, but accidents don't wait for our focus, they just happen, and this is why we are always beating ourselves up.
We can look at self-driving car tech and intuit a lot about its current follies because we know what focus is better than anyone else, and certainly better than successful tech company execs.
I'm glad other people understand the struggles daily life requires in this respect.