Tesla In 'Self-Drive Mode' Hit By Train After Turning Onto Train Tracks
-
That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive to take control. If the system detects a situation where it's unable to make a good decision, dumping that decision on the human in control seems like the closest they have to a "fail safe" option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a "human take the wheel" followed by a "slam the brakes" if no input is detected in 2-3 seconds. While an emergency stop isn't always the right choice, it probably beats leaving a several ton metal object hurtling along uncontrolled in nearly every circumstance.
wrote on 23 June 2025, 18:24, last edited by zaphod@sopuli.xyz
That actually sounds like a reasonable response.
If you give the driver enough time to act, which Tesla doesn't. They turn it off a second before impact and then claim it wasn't in self-driving mode.
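The "human take the wheel, then slam the brakes if nobody responds" fallback described above can be sketched as a tiny state machine. Everything here is hypothetical (the state names, the 2.5-second window, the API) — it illustrates the commenter's proposal, not any manufacturer's actual logic:

```python
import time
from enum import Enum, auto

TAKEOVER_TIMEOUT_S = 2.5  # hypothetical window for the driver to respond


class Mode(Enum):
    ASSISTED = auto()
    TAKEOVER_REQUESTED = auto()
    EMERGENCY_BRAKE = auto()


class FallbackController:
    """Toy state machine: if the system can't decide, ask the driver;
    if the driver doesn't respond in time, brake instead of coasting."""

    def __init__(self, now=time.monotonic):
        self.now = now                 # injectable clock, for testing
        self.mode = Mode.ASSISTED
        self.request_time = None

    def on_uncertain_situation(self):
        # System detects it can't make a good decision: hand off to the human.
        if self.mode is Mode.ASSISTED:
            self.mode = Mode.TAKEOVER_REQUESTED
            self.request_time = self.now()

    def on_driver_input(self):
        # Any steering/pedal input counts as the human taking over.
        if self.mode is Mode.TAKEOVER_REQUESTED:
            self.mode = Mode.ASSISTED
            self.request_time = None

    def tick(self):
        # Called periodically; escalates to braking after the timeout.
        if (self.mode is Mode.TAKEOVER_REQUESTED
                and self.now() - self.request_time > TAKEOVER_TIMEOUT_S):
            self.mode = Mode.EMERGENCY_BRAKE
        return self.mode
```

The key design point is that the uncontrolled state is never the terminal one: ignoring the takeover request escalates to braking rather than to "driver's problem now".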
-
This post did not contain any content.
wrote on 23 June 2025, 18:31, last edited by
Car drove itself onto the tracks, gets hit by a train. This is some Maximum Overdrive shit.
-
This post did not contain any content.
wrote on 23 June 2025, 18:35, last edited by
How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks?
Were they asleep?
-
Deregulation, ain't it great.
wrote on 23 June 2025, 18:36, last edited by
I'll just fork it; we can do better than this.
-
Self-driving not being reliable yet is one of the biggest disappointments of the last decade.
wrote on 23 June 2025, 18:37, last edited by
What did we even do all those ReCAPTCHAs for?
-
This post did not contain any content.
wrote on 23 June 2025, 18:39, last edited by
That … tracks
-
I have a nephew that worked at Tesla as a software engineer for a couple years (he left about a year ago). I gave him the VIN to my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information then clearly they are logging a LOT of data.
wrote on 23 June 2025, 18:54, last edited by
Modern cars (in the US) are required to have an OBD-II port for on-board diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.
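For context on what that OBD data actually looks like: OBD-II is just mode/PID request-response pairs over the diagnostic port. A request of `01 0D` (mode 01, PID 0D = vehicle speed) gets the reply `41 0D <speed>`, where `0x41 = 0x01 + 0x40` echoes the mode and the last byte is the speed in km/h. A minimal decoder for that one PID (real reads go through a serial/CAN adapter, which is omitted here):

```python
def parse_vehicle_speed(response_hex: str) -> int:
    """Decode an OBD-II mode 01 / PID 0D (vehicle speed) response.

    Expects a reply like "41 0D 4B": "41" echoes the mode (0x01 + 0x40),
    "0D" echoes the PID, and the final byte is the speed in km/h.
    """
    parts = response_hex.split()
    if len(parts) != 3 or parts[0] != "41" or parts[1] != "0D":
        raise ValueError(f"not a PID 0D response: {response_hex!r}")
    return int(parts[2], 16)


print(parse_vehicle_speed("41 0D 4B"))  # 0x4B -> 75 km/h
```

Fault codes like the brake-light errors mentioned above live in a different mode (mode 03, stored DTCs), but the framing idea is the same — which is why streaming "some or all" of it upstream is technically trivial.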
-
That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive to take control. If the system detects a situation where it's unable to make a good decision, dumping that decision on the human in control seems like the closest they have to a "fail safe" option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a "human take the wheel" followed by a "slam the brakes" if no input is detected in 2-3 seconds. While an emergency stop isn't always the right choice, it probably beats leaving a several ton metal object hurtling along uncontrolled in nearly every circumstance.
wrote on 23 June 2025, 19:04, last edited by
So, maybe a "human take the wheel" followed by a "slam the brakes" if no input is detected in 2-3 seconds.
I have seen reports where Tesla logic appears as "Human take the wheel, since the airbag is about to deploy in the next two microseconds after solely relying on camera object detection, and this is totally YOUR fault, kthxbai!" If there were an option for the bot to physically bail out of the car as it rolls you onto the tracks while you're still sitting in the passenger seat, that's how I'd envision this Autopilot safety function working.
-
How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.
wrote on 23 June 2025, 19:12, last edited by pika@sh.itjust.works
This right here is another fault in regulation that will eventually catch up with us, because especially with level 3, where it's primarily the vehicle driving and the driver just gives periodic input, it's not the driver that's in control most of the time. It's the vehicle, so it should not be the driver at fault.
Honestly, I think everything up to level 2 should be the driver's fault, because those levels require constant driver input. However, level 3 conditional driving and higher should be considered the company's liability, unless the company can prove that the autonomous system handed control back to the driver in a human-capable manner (i.e. not within the last second, as Tesla currently does).
-
If only there was a way to avoid the place where trains drive.
I checked first. They didn't make a turn into a crossing. It turned onto the tracks. Jalopnik says there's no official statement that it was actually driving under FSD(elusion), but if it was strictly under human driving (or FSD turned itself off after driving off), I guarantee Tesla will invade the driver's privacy and slander them by the next day for the sake of the court of public opinion.
wrote on 23 June 2025, 19:17, last edited by
They didn't make a turn into a crossing. It turned onto the tracks.
Just to be clear for others, it did so at a crossing. That's still obviously not what it should have done and it's no defence of the self-driving feature, but I read your comment as suggesting it had found its way onto train tracks by some other route.
-
Since the story has 3 separate incidents where "the driver let their Tesla turn left onto some railroad tracks" I'm going to posit:
Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.
Prove me wrong, Tesla
wrote on 23 June 2025, 19:22, last edited by
Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this but common for people who play geo games, is when forests and water areas end up tagged as the wrong specific types.
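The lossy-conversion failure mode described above is the kind of thing a mechanical check can catch: compare each converted feature's coarse class against its source tags and flag anything where a railway collapsed into a drivable road. A toy checker over OSM-style tags (the `railway` key follows OpenStreetMap conventions; the conversion table and record shape are hypothetical):

```python
# Converted classes that a routing engine would treat as drivable.
DRIVABLE_CLASSES = {"road", "street", "highway"}


def find_lossy_conversions(features):
    """Yield features whose source tags say 'railway' but whose
    converted class claims they are drivable."""
    for feat in features:
        src = feat.get("source_tags", {})
        if "railway" in src and feat.get("converted_class") in DRIVABLE_CLASSES:
            yield feat


features = [
    {"id": 1, "source_tags": {"railway": "rail"}, "converted_class": "road"},
    {"id": 2, "source_tags": {"highway": "residential"}, "converted_class": "road"},
]

# Only feature 1 lost its railway labeling in conversion.
bad = list(find_lossy_conversions(features))
```

If a pipeline skips even this cheap a sanity pass, rails labeled as roads will survive all the way into the planner's map.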
-
How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks?
Were they asleep?
wrote on 23 June 2025, 19:29, last edited by
I'm not sure I'd be able to sleep through driving on the railroad tracks. I'm going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn two-ton death machine themselves.
-
Modern cars (in the US) are required to have an OBD-II port for on-board diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.
wrote on 23 June 2025, 19:30, last edited by
Dude, in today's world we're lucky if it stops at the manufacturer. I know of a few insurers that have contracts through major dealers and just automatically get the data registered by the car's systems. That way they can make better decisions regarding people's car insurance.
Nowadays it's a red flag if you join a car insurer and they don't offer you a discount for putting on something like drive pass, which logs your driving, because it probably means your car is already getting that data to them.
-
I'm not sure I'd be able to sleep through driving on the railroad tracks. I'm going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn two-ton death machine themselves.
wrote on 23 June 2025, 19:36, last edited by
LOL right? Like deciding to see what the car will do ON RAILWAY TRACKS is absolutely fucking bonkers.
-
Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this but common for people who play geo games, is when forests and water areas end up tagged as the wrong specific types.
wrote on 23 June 2025, 19:49, last edited by
Aha. But that sounds correctable… So not having any people assigned to checking on railroads and making sure the system recognizes them as railroads would be due to miserliness on Tesla's part… And might also say something about why some Teslas have been known to drive into bodies of water (or children, but that's probably a different instance of miserliness).
-
That, and little kids... and motorcycles... and school buses
wrote on 23 June 2025, 20:19, last edited by wizardbeard@lemmy.dbzer0.com
Don't forget "styrofoam walls painted to look like tunnels". Fucking Looney Tunes.
-
How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks?
Were they asleep?
wrote on 23 June 2025, 20:36, last edited by
I was gonna say it's not so much the fact that the car was hit by a train, but that it turned onto the tracks… but 40 or 50 feet?
-
This post did not contain any content.
wrote on 23 June 2025, 20:39, last edited by
It simply saw a superior technology and decided to attack.
-
That actually sounds like a reasonable response.
If you give the driver enough time to act, which Tesla doesn't. They turn it off a second before impact and then claim it wasn't in self-driving mode.
wrote on 23 June 2025, 20:41, last edited by
Not even a second; it's sometimes less than 250-300 ms. If I hadn't already been anticipating it to fail and disengage as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.
-
That … tracks
wrote on 23 June 2025, 20:46, last edited by
Full steam ahead with the train puns