Driver inattention, combined with flaws in Tesla's Autopilot and Full Self-Driving (FSD) technologies, has resulted in hundreds of injuries and dozens of deaths, according to a report by the National Highway Traffic Safety Administration (NHTSA).
The findings indicate that users of these systems "were not sufficiently engaged in vehicle operation," while the technology itself "did not ensure an adequate level of driver attention."
From January 2018 to August 2023, the regulator investigated 956 crashes. In these incidents, some of which involved collisions with other vehicles, 29 people died.
The agency also recorded 211 incidents where "the front of a Tesla vehicle struck another vehicle or obstacle in its path." These crashes, often the most severe, resulted in 14 deaths and 49 injuries.
NHTSA initiated the investigation after Tesla drivers crashed into stationary emergency vehicles parked on the roadside. Most of these incidents occurred after dark, with the software failing to respond to road safety measures such as warning lights, flares, cones, and illuminated arrow boards.
Drivers on Autopilot Fail to React in Time
The agency concluded in its report that Autopilot and, in some cases, FSD were not designed to keep the driver’s attention on the road. Tesla, in turn, warns customers of the need to remain vigilant when using these technologies, including keeping hands on the wheel and eyes on the road.
However, according to NHTSA, in many cases, drivers became overconfident and lost focus. When it came time to react to a dangerous situation, it was often too late.
In 59 of the crashes studied, the agency found that Tesla drivers had ample time to react: five or more seconds before colliding with another object. In 19 of those cases, the hazard was visible for 10 seconds or more.
After reviewing data logs and information provided by Tesla, NHTSA found that in most such situations, drivers failed to brake or steer to avoid a collision.
“Crashes where the driver made no or delayed evasive maneuvers were found in all versions of Tesla hardware and incident circumstances,” NHTSA stated.
Tesla’s Promises Are Overstated
NHTSA also compared Tesla's Level 2 (L2) automation features with those of other automakers. Unlike competing systems, Autopilot disengages when the driver manually adjusts the steering wheel, rather than staying active and sharing control. According to NHTSA, this design "discourages" drivers from staying alert at the wheel.
“Comparing Tesla’s design choices with similar L2 solutions revealed that the company is an industry outlier in its approach to technology, as the weak driver engagement system does not align with Autopilot’s permissive capabilities,” the agency noted.
Even the Autopilot brand name is misleading, NHTSA said, because it suggests that the driver does not need to remain in control. While other automakers use terms like "assist," "sense," or "command" for comparable features, Tesla's branding leads drivers to believe the system is more capable than it actually is.
The California Attorney General and the state’s Department of Motor Vehicles are investigating the company over misleading branding and marketing.
NHTSA acknowledges that its investigation may be incomplete because of "gaps" in Tesla's telemetry data, meaning there may be far more crashes involving Autopilot and FSD than the agency has identified.
Autopilot Update Fails to Address Safety Concerns
Late last year, Tesla issued a voluntary software recall in response to the NHTSA investigation, pushing an update that added extra warnings to Autopilot. NHTSA has since opened a new investigation into that fix, after safety experts called the measures "inadequate" and argued they still allow the driver assistance systems to be misused.
The investigation's findings contradict Tesla CEO Elon Musk's insistence that Tesla is an artificial intelligence company on the verge of releasing a fully autonomous vehicle for personal use. He plans to unveil a robotaxi later this year, which he expects to usher in a new era for the automaker.
Last week, during the company's first-quarter earnings call, Musk again asserted that Tesla vehicles are safer than human-driven cars.
"If a statistically significant amount of data convincingly shows that a self-driving car is, say, half as likely to be involved in an accident as a human-driven car, I think it's hard to ignore," Musk said.
He believes that halting the use of self-driving cars is equivalent to “killing people.”
Elon Musk Eyes Chinese Market
Coinciding with the release of the NHTSA report, Musk made an unannounced visit to China. According to sources, one of the topics discussed was the rollout of Autopilot and FSD in the country.
Earlier, Musk wrote on X that Tesla could make Full Self-Driving available to customers in China “very soon.”
Chinese state broadcaster CCTV, in its report on Musk's meeting with Chinese Premier Li Qiang, did not mention whether FSD or data issues were discussed.
Musk’s trip took place just over a week after he canceled a planned visit to India to meet with Prime Minister Narendra Modi.
The naming critique is not new: back in May 2023, U.S. Transportation Secretary Pete Buttigieg criticized Tesla's branding, saying a system should not be called Autopilot when the fine print requires drivers to keep their hands on the wheel.
