The National Highway Traffic Safety Administration (NHTSA) announced on Friday that it has opened an investigation into whether Tesla’s December 2023 recall of over 2 million vehicles equipped with its Autopilot driver assistance system was sufficient to address safety risks. The probe comes amid ongoing concerns about the potential for drivers to misuse or become overly reliant on the technology.
In December 2023, Tesla initiated a voluntary recall of approximately 2,031,220 Model S, Model X, Model 3, and Model Y vehicles manufactured between 2012 and 2023 to update Autopilot software and introduce additional safeguards against driver inattention. The recall aimed to mitigate risks associated with Autopilot being activated on roads unsuitable for the system, such as city streets with intersections.
However, the NHTSA’s investigation, disclosed in documents posted on its website, suggests that the agency is not fully satisfied with Tesla’s remedial actions. The NHTSA stated that it has identified crashes that occurred after the recall was implemented, and that Tesla has made subsequent modifications to Autopilot that were not part of the original recall plan.
“This investigation will scrutinize why these updates were not integrated into the recall or otherwise remedied a defect that potentially poses an unreasonable safety risk,” the NHTSA said in its filing.
The agency’s probe follows a nearly three-year examination of Autopilot, during which it reviewed 956 crashes allegedly involving the system and zeroed in on 322 software-related incidents. The investigation also builds upon the NHTSA’s prior findings that Tesla’s driver engagement monitoring may be inadequate, leading to a “significant safety gap.”
Concerns Over Driver Misuse and Overreliance
Safety experts and regulators have long expressed concerns about the potential for drivers to misuse or become overly dependent on Autopilot, which can control steering, braking, and acceleration under certain conditions but requires active driver supervision.
Critics argue that Tesla’s marketing and messaging around Autopilot and its more advanced “Full Self-Driving” system may lead some drivers to believe the vehicles are more capable of autonomous operation than they actually are.
“The elephant in the room is that the software in Tesla is still beta software and they are still using retail customers with no training as test drivers,” said Philip Koopman, an automotive safety researcher and associate professor at Carnegie Mellon University.
Tesla’s Response and Regulatory Challenges
In a statement posted on the social media platform formerly known as Twitter, Tesla’s policy chief Rohan Patel defended the company’s cooperation with regulators and argued that the current regulatory system is working “about as well as it can given the lack of clear regulations in this field.”
However, some lawmakers and safety advocates contend that the NHTSA’s investigation is overdue and that more stringent regulations are needed to ensure the safe deployment of driver assistance technologies.
“We urge NHTSA to continue its investigations to spur necessary safety improvements, and Tesla to stop misleading drivers and putting the public in great danger,” said U.S. Senators Richard Blumenthal and Edward J. Markey in a joint statement.
As the NHTSA’s probe unfolds, it underscores the ongoing challenges regulators face in keeping pace with rapidly evolving vehicle automation technologies and ensuring their safe integration into the transportation system. The outcome of the investigation could have significant implications for Tesla and the broader industry as it seeks to balance innovation with public safety concerns.