Tesla Robotaxi Freaks Out and Drives into Oncoming Traffic on First Day
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
Sounds like the indian guy driving it with a joystick was a bit hungover. You'd think they'd screen that thing at the entrance of the cubicle farm where all these AI folk drive these from. AI is just "anonymous indians" for elmo's grifting kind.
-
While I agree focusing on public transport is a better idea, it's completely absurd to say machines can never possibly drive as well as humans. It's like saying a soul is required, or some other superstitious nonsense like that. Imagine the hypothetical case in which a supercomputer that perfectly emulates a human brain is what we are trying to teach to drive. Do you think that couldn't drive? If so, you're saying a soul is what allows a human to drive, and you may as well be saying that God hath uniquely imbued us with the ability to drive. If you do think that could drive, then surely a slightly less powerful computer could. And maybe one less powerful than that. So something between a Casio solar calculator and an emulated human brain must be able to learn to drive. Maybe that's beyond where we're at now (I don't necessarily think it is), but it's certainly not impossible on principle. Ultimately, you are a computer too.
I never did say it wouldn't ever be possible. Just that it will take a long time to reach parity with humans. Driving is culturally specific, even. The way rules are followed and practiced is often regionally different. There's more to it than just the mechanical act itself.
The ethics of putting automation in control of potentially life-threatening machines is also relevant. With humans we can attribute cause and attempt improvement; with automation it's different.
I just don't see a need for this at all. I think investing in public transportation more than reproduces all the benefits of automated cars without nearly as many of the dangers and risks.
-
I've been saying for years that focusing on self driving cars is solving the wrong problem. The problem is so many people need their own personal car at all.
Exactly. Bring back trams, build fewer suburbs, build better apartment housing. If we want a society reorganized around accessibility, then let's actually build that.
-
Wow that turn signal sound is annoying. Why does it even need to make a sound in a car that’s supposed to be driving itself?
Important feedback for the passenger to ensure the car is actually following the rules. I would freak out at a corner if I couldn't tell the car was signaling.
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
What's crazy is that the safety driver's hair has gone completely grey in just two days.
-
Not the article, the post from njordamir that you were directly replying to.
shouldn't accountability be in place now,
Again, literally what that user was suggesting.
Ah, OK.
I agree with accountability, but not with the point system. That's almost like a "three strikes" rule for drunk drivers.
That's not really accountability, that's handing out free passes.
-
Man, I cannot figure out why that vehicle was turning. What is it trying to avoid? Why does it think there could be road there? Why doesn't it try to correct its action midway?
I'm really concerned about that last question. I have to assume that at some point prior to impact, the system realized it made a mistake. Surely. So why didn't it try to recover from the situation? Does it have a system for recovering from errors, or does it just continue and say "well, I'll get it next time, now on with the fatal crash"?
That wasn't FSD. The crash report shows FSD wasn't enabled: the driver applied torque to the steering wheel and disengaged it. They were probably reaching into the back seat while "supervising".
-
I never did say it wouldn't ever be possible. Just that it will take a long time to reach parity with humans. Driving is culturally specific, even. The way rules are followed and practiced is often regionally different. There's more to it than just the mechanical act itself.
The ethics of putting automation in control of potentially life-threatening machines is also relevant. With humans we can attribute cause and attempt improvement; with automation it's different.
I just don't see a need for this at all. I think investing in public transportation more than reproduces all the benefits of automated cars without nearly as many of the dangers and risks.
Driving is culturally specific, even. The way rules are followed and practiced is often regionally different
This is one of the problems driving automation solves trivially when applied at scale. Machines will follow the same rules regardless of where they are, which is better for everyone.
The ethics of putting automation in control of potentially life-threatening machines is also relevant
You'd shit yourself if you knew how many life-threatening machines are already controlled by computers far simpler than anything in a self-driving car. Industrially, we have learned the lesson that computers, even ones running on extremely simple logic, just completely outclass humans on safety because they do the same thing every time. There are giant chemical manufacturing facilities run by a couple of guys in a control room watching a screen, because 99% of it is already automated. I'm talking thousands of gallons an hour of hazardous, poisonous, flammable materials running through a system run on 20-year-old computers. The chemical additions at your local water treatment plant could kill thousands of people if done wrong, and they're all controlled by machines because we know they're more reliable than humans.
With humans we can attribute cause and attempt improvement; with automation it's different.
A machine can't drink a handle of vodka and get behind the wheel, nor can it drive home sobbing after a rough breakup and be unable to process information properly. You can also update all of them at once instead of dealing with PSA campaigns telling people not to do something that got someone killed. Self-driving car makes a mistake? You don't have to guess what was going through its head; it has a log. Figure out how to fix it? Guess what, they're all fixed with the same software update. If a human makes that mistake, thousands of people will keep making that same mistake until cars or roads are redesigned and those changes have a way to filter through all of society.
I just don't see a need for this at all. I think investing in public transportation more than reproduces all the benefits of automated cars without nearly as many of the dangers and risks.
This is a valid point, but it doesn't have to be either/or. Cars have great utility even in a system with public transit. People and freight have to get from the rail station or port to wherever they need to go somehow, even in a utopia with a perfect public transit system. We can do both; we're just choosing not to in America, and it's not like self-driving cars are intrinsically opposed to public transit just by existing.
-
The Tesla is just following the regional driving style. Humans make the same mistakes at 15:06.
/s
This but unironically. If this is the worst thing that happened on launch day, then that seems pretty successful to me. This is the worst version of the robotaxi we will ever see.
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
Navigation issue / hesitation
The video really understates the scale of the fuck-up the car made there...
And the guy sitting there, just casually being OK with the car ignoring the forced left, going straight into the oncoming lanes, and flipping the steering wheel all over the place because it has no idea what the hell just happened... I would not be just chilling there...
Of course, I wouldn't have gotten in this car in the first place, and I know they cherry-picked some hardcore Tesla fans to be allowed to ride at all...
-
Yea, this one isn't an issue. If you are dropping off passengers, you are allowed to stop in a fire lane because that is not parking.
Which brings up an interesting question, when is a driverless car 'parked' vs. 'stopped'?
-
What's crazy is that the safety driver's hair has gone completely grey in just two days.
That safety driver did not give a single fuck about driving on the wrong side of the road..
-
Tbh it's not as bad as I was expecting. Those clips could definitely have resulted in an accident, but the system seems to actually work most of the time. I wonder if it couldn't be augmented with lidar at this point to make it more reliable? A live stress test is ridiculously irresponsible and will definitely kill people, but at least it's only Texans at risk (for now).
I was skeptical of the idea of robotaxis, but this kind of sold me on it. If they're cheaper than human drivers, I might even be able to get rid of my car some day. It doesn't change the fact that I'll never get into one because the CEO is a nazi though.
Keep in mind this is a system with millions of miles under its belt, and it still doesn't understand what to do with a forced left turn lane on a very short trip in a fairly controlled environment with supremely good visual, road, and traffic conditions. LIDAR wouldn't have helped the car here; there was no "whoops, confusing visibility", it just completely screwed up and ignored the road markings.
It's been in this state for years now: surprisingly capable, yet with horrible screw-ups noted frequently. They seem to be about 95% of the way there and stuck, with no real progress, just some willful denial convincing them to move forward anyway.
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
So, Tesla Robotaxis drive like a slightly drunk and confused tourist with asshole driving etiquette.
Those right turns on red were like, "oh you get to go? That's permission for me to go too!"
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
So it emulates a standard BMW driver. Well done.
-
AI = Always Indian.
AI = Actually Indians
-
Watch that stock price fall... wheeeee
It already jumped up about 10% on Monday simply because the service launched. Even if the service crashes and burns, they'll jump to the next hype topic like robots or AI or whatever, and the stock price will stay up.
-
So it emulates a standard BMW driver. Well done.
Still work to be done: it uses the blinkers.
-
That safety driver did not give a single fuck about driving on the wrong side of the road..
He must have seen so much worse to not even be flinching at that.
-
I saw the Tesla Robotaxi:
- Drive into oncoming traffic, getting honked at in the process.
- Signal a turn and then go straight at a stop sign with turn signal on.
- Park in a fire lane to drop off the passenger.
And that was in a single 22 minute ride. Not great performance at all.
Whoa! Damn! The robotaxis are a dangerous fuck-up!? That's the most surprising thing that happened all year! There's literally no way I could've seen that coming.