Tesla recalls nearly all vehicles on US roads over lack of Autopilot safeguards

WASHINGTON, Dec 13 (Reuters) – Tesla (TSLA.O) is recalling just over 2 million vehicles in the United States fitted with its Autopilot advanced driver-assistance system to install new safeguards, after a federal safety regulator said the system posed safety concerns.

The National Highway Traffic Safety Administration (NHTSA) has been investigating the electric automaker led by billionaire Elon Musk for more than two years over whether Tesla vehicles adequately ensure that drivers pay attention when using Autopilot. The largest-ever Tesla recall appears to cover nearly all of its vehicles on U.S. roads.

Tesla said in a recall filing that Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.

Acting NHTSA Administrator Ann Carlson at a U.S. House hearing on Wednesday praised Tesla for agreeing to the Autopilot recall. “One of the things we determined is that drivers are not always paying attention when that system is on,” she said.

Carlson added that repeated reports of fatal crashes involving the use of Autopilot prompted the agency to open a safety probe in August 2021. “My immediate response was, ‘We have to do something about this,’” she said.

Shares of the world’s most valuable automaker were down 3.4% at $228.97 on Wednesday afternoon.

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while Enhanced Autopilot can assist in changing lanes on highways; neither feature makes the vehicles autonomous.

One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.

Tesla said it did not agree with NHTSA’s analysis but would deploy an over-the-air software update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”

The company did not respond to a question on whether the recall would be performed outside the United States or offer more precise details of the new safeguards. It is not immediately clear if China will demand a recall over the same issue.

A spokesperson for the Italian Transport Ministry said it had no knowledge of similar action being taken in Italy. Regulators in Germany said they were looking into the issue.

A Model 3 Tesla vehicle navigates morning rush hour using the car’s Autopilot feature in Los Angeles, California, U.S., March 20, 2019. REUTERS/Mike Blake/File Photo

‘FORESEEABLE MISUSE’

NHTSA opened its August 2021 probe of Autopilot after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles, and upgraded the investigation in June 2022. NHTSA said it found “Tesla’s unique design of its Autopilot system can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system.” NHTSA reviewed 956 crashes in which Autopilot was initially alleged to have been in use and focused on 322 Autopilot-involved crashes in its probe.

Bryant Walker Smith, a University of South Carolina law professor who studies transportation issues, said the software-only fix will be fairly limited. The recall “really seems to put so much responsibility on human drivers instead of a system that facilitates such misuse,” Smith said.

Separately, since 2016, NHTSA has opened more than three dozen Tesla special crash investigations in cases where driver-assistance systems such as Autopilot were suspected of being in use, with 23 crash deaths reported to date.

NHTSA said there may be an increased risk of a crash when the system is engaged but the driver does not maintain responsibility for vehicle operation, is unprepared to intervene, or fails to recognize when the system is canceled or not engaged.

NHTSA’s investigation into Autopilot will remain open as it monitors the efficacy of Tesla’s remedies.

The company will roll out the update to 2.03 million Model S, X, 3 and Y vehicles in the United States dating back to the 2012 model year, the agency said.

The update, which will vary depending on vehicle hardware, will increase the prominence of visual alerts on the user interface, simplify the engagement and disengagement of Autosteer, and add checks upon engaging Autosteer.

Tesla disclosed in October that the U.S. Justice Department had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot systems. Reuters reported in October 2022 that Tesla was under criminal investigation over claims the company’s electric vehicles could drive themselves.

Tesla in February recalled 362,000 U.S. vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes.

NHTSA closed an earlier investigation into Autopilot in 2017 without taking any action. The National Transportation Safety Board (NTSB) has criticized Tesla for a lack of system safeguards for Autopilot, and NHTSA for a failure to ensure the safety of Autopilot.

Democratic U.S. Representative Jan Schakowsky said “it’s past time to rein in Tesla’s hazardous advanced driving systems” and praised NHTSA for taking “action to protect all road users from misuse of these systems.”

Reporting by Mrinmay Dey and Aditya Soni in Bengaluru and David Shepardson in Washington; additional reporting by Angelo Amante in Rome and Christina Amann in Berlin; Editing by Tomasz Janowski and Matthew Lewis