NHTSA Is Investigating More Crashes That Followed Tesla’s Autopilot Fix

Tesla Model 3 | picture alliance - Getty Images
  • NHTSA seeks additional data from Tesla after an over-the-air update was distributed to address a recall for functions that were "insufficient to prevent driver misuse."

  • The agency noted that 20 more crashes have occurred despite the remedies the automaker added to the software months ago.

  • Driver monitoring systems (DMS) and their effectiveness have received closer scrutiny following the rollout of several SAE Level 2 systems in the US, but regulatory pressure on automakers is viewed as lacking.


Following a December 2023 recall of the Autopilot driver-assist software Tesla offers in its vehicles, the National Highway Traffic Safety Administration is seeking information from the automaker on the effectiveness of the over-the-air updates meant to remedy the system.


The events that prompted the current recall query, as it is officially termed, are some 20 crashes that occurred after Tesla delivered the OTA update and in which Autopilot is believed to have been in use.

Tesla's initial response to the recall, in which it sought to address systems that were "insufficient to prevent driver misuse," included five separate software remedies aimed at increasing driver attentiveness, among them one-week suspensions of Autopilot in individual cars and various alerts the system could generate to compel the driver's attention.

The NHTSA recall query follows a longer investigation into Autopilot's safety provisions, prompted by a significant number of crashes in which Tesla vehicles either drove into the backs of emergency vehicles parked in traffic lanes or struck other objects.

Several of these crashes, which occurred at a variety of speeds, were severe, with over two dozen deaths reported between January 2018 and August 2023.

"Throughout the PE21020 and EA22002 investigations, ODI observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver," the agency noted in an April 2024 document. "Before August 2023, ODI reviewed 956 total crashes where Autopilot was initially alleged to have been in use at the time of, or leading up to, those crashes."

Tesla has offered Autopilot in its vehicles since 2015, in addition to Enhanced Autopilot and a system it calls Full Self-Driving (Beta). | Scott Olson - Getty Images

The NHTSA's Office of Defects Investigation (ODI) determined that drivers were not sufficiently engaged in the actual driving task, and that Autopilot warnings did not adequately ensure that drivers were paying attention to the road.

The agency also cited the problematic use of the term Autopilot in describing what is still an SAE Level 2 driver-assist system—an issue that has been raised by industry observers since Autopilot debuted in late 2015.

"Notably, the term 'Autopilot' does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control," the agency added in the same April 2024 document. "This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation."

Tesla has until July 1 of this year to answer the ODI's questions, which run several pages, before it faces monetary penalties.

The EV maker has faced years of criticism for weak driver monitoring, which spawned a whole genre of YouTube videos in which drivers circumvented Autopilot's safeguards well enough to move to the passenger seat or the back seat while the vehicle was in motion.

Various third-party devices also went on sale to defeat the system's detection of hands on the steering wheel, letting drivers go hands-free for prolonged periods.

All of these safety gaps went largely unaddressed until a string of high-profile crashes, in which vehicles drove into the backs of fire trucks and other first-responder vehicles with Autopilot use suspected or later confirmed, prompted an NHTSA probe.

Tesla currently plans to make the leap to an SAE Level 4 robotaxi, having promised to unveil such a model this August.

Have you tried any SAE Level 2 systems offered in vehicles today? Let us know in the comments.