Around two months ago, a Tesla owner was driving his Model 3 in FSD (Full Self-Driving) mode in Alabama when the car suddenly, for no obvious reason, veered off the road, crashed, and flipped upside down. The driver suffered a chin injury but was otherwise okay. The owner shared a video of the car veering left, footage that has left Tesla examiners baffled. (Go to the 30-second mark.)
Since Tesla’s FSD was first launched in 2020, we have heard of hundreds of non-fatal crashes involving Tesla’s Autopilot and FSD software, along with 51 reported fatalities, 44 of which were later verified by National Highway Traffic Safety Administration (NHTSA) investigations.
Tesla’s FSD naming is actually a misnomer
In an October 2024 Reuters story, the publication reported that the NHTSA had opened an investigation into 2.4 million Tesla vehicles fitted with the carmaker’s FSD software after four collisions, including one fatal crash in Arizona in 2023. In the NHTSA’s report findings, “Tesla concluded in its 23V838 filing that, in certain circumstances, Autopilot’s system controls may be insufficient for a driver assistance system that requires constant supervision by a human driver.”
Tesla calls its well-known autonomous driving system ‘Full Self-Driving’, which is actually a misnomer. Despite the name, FSD is classified as a Level 2 driver-assist system and is not fully self-driving. Drivers are expected to stay attentive at all times and be ready to take control, which is one reason Tesla recently added ‘Supervised’ to the name. According to Tesla, the driver is responsible for the car’s behavior, even when FSD is activated.
Interestingly, though, just last week Tesla released a post on X in which it commented that drivers just have to “lean back and watch the road” when using its FSD.
The driver was paying attention but had no time to react
That’s exactly what Wally, a Tesla Model 3 owner in Toney, Alabama, was doing when his car suddenly veered off the road and crashed in March this year. The driver was aware that even though he had engaged the car’s FSD, he was required to pay attention and be ready to take control immediately if something untoward happened.
Wally explained: “I used FSD every chance I could get. I actually watched YouTube videos to tailor my FSD settings and experience. I was happy it could drive me to Waffle House and I could just sit back and relax while it would drive me on my morning commute to work.”
Wally said that he didn’t have time to react even though he was paying attention, and a quick look at the video reveals just how quickly the FSD system can malfunction. He said, “I was driving to work and had FSD on. The oncoming car passed, and the wheel started turning rapidly, driving into the ditch, side-swiping a tree, and the car flipped over. I did not have any time to react.”
So what caused the malfunction? That’s hard to determine
From this writer’s point of view, it appears as though Wally’s Tesla veered left as soon as it sensed the passing oncoming pickup truck, since his car turned a split second after the truck passed. Had it turned even one second earlier, his car would have collided head-on with the truck. But there were also shadows on the road and a signpost that could have triggered the unexpected response.
Wally explained that he was using Tesla FSD v13.2.8 on Hardware 4, Tesla’s latest FSD software. He requested that Tesla send him the data from his car to better understand what happened.
As you can see from the video, the driver had less than a second to react, which undercuts Tesla’s claim that the driver can take control if the system suddenly malfunctions. The fault here appears to lie with the car, not the driver. From what we saw in this incident, it would seem that Tesla needs to go back to the drawing board and rethink its FSD system, especially as it plans to roll out this self-driving tech on Robotaxis in June.