My Self-Driving Car Crash – The Tesla was driving perfectly–until it wasn't

ForHackernews 33 points 15 comments March 17, 2026
www.theatlantic.com

Discussion Highlights (7 comments)

ForHackernews

Paywall: https://www.theatlantic.com/magazine/2026/04/self-driving-ca...

jqpabc123

Constant supervision of "self driving" requires nearly as much attention and effort as driving. Otherwise, you are risking your safety. So why pay extra to take risks?

rogerrogerr

You'd think that such a personal traumatic experience would make this author want to use their own words, not outsource writing about it to LLMs.

t0mas88

It's part of the EASA pilot training material on human factors: humans are notoriously bad at monitoring something that goes fine 99% of the time. Our brains are not made for that. For aircraft it's very rare to receive control back from the autopilot in an upset state. An exception is a trim runaway, which is a very serious emergency that gets trained for. In cars you don't have that luxury; it's much more likely that you have to act immediately, on very short notice. That does not work for human drivers.

edgineer

> The car was making a turn. Something felt off—the steering wheel jerked one way, then the other, and the car decelerated in a way I didn’t expect. I turned the wheel to take over. I don’t know exactly what the system was doing, or why. I only know that somewhere in those seconds, we ended up colliding with a wall.

> I don’t know enough about what actually happened during my accident to say that Tesla’s technology crashed the car.

The cause may have been the combination of FSD and human takeover. When the car fucks up, the driver can take over poorly. Sort of how 'overcorrection' on a highway could spin you out of control. There is a gray area where a car's guidance drives stupidly, yet would not actually result in an accident. The hot take is that a driver with his face buried in his phone the whole time may have had a better outcome.

x187463

It would be great to have some useful information about the failure condition for FSD in this situation, but the author provides essentially nothing.

> The car was making a turn. Something felt off—the steering wheel jerked one way, then the other, and the car decelerated in a way I didn’t expect.

I use the latest FSD in an M3 and I have noticed it behave indecisively when changing lanes, not so much when turning, but I believe the author's account.

> I turned the wheel to take over. I don’t know exactly what the system was doing, or why. I only know that somewhere in those seconds, we ended up colliding with a wall.

The author disengaged FSD (reasonable when concerned) and ran into a wall. I almost never let go of the steering wheel when FSD is driving. I want to be able to take over with the minimum delay. I don't know that I'll ever trust it to drive unsupervised. It's an unbelievable driver-assistance system, but you need to treat it as such. Tesla may market it, and name it, otherwise, but anybody using FSD should quickly realize it has limits.

1970-01-01

> My memory is hazy, and some of it comes from one of my sons, who watched the whole thing unfold from the back seat. The car was making a turn. Something felt off—the steering wheel jerked one way, then the other, and the car decelerated in a way I didn’t expect. I turned the wheel to take over. I don’t know exactly what the system was doing, or why. I only know that somewhere in those seconds, we ended up colliding with a wall.

FSD cannot take the blame if the driver took over controls and simply cannot remember anything that happened after doing so.
