Tesla recalls US models with ‘full self-driving’ system
Tesla is set to update its Full Self-Driving (FSD) ‘beta’ software in the United States after the US National Highway Traffic Safety Administration (NHTSA) warned that the system did not comply with traffic safety laws and could lead to crashes. NHTSA said that Tesla’s software could cause vehicles to travel through intersections unlawfully or unpredictably, or to exceed speed limits, increasing the risk of a crash.
According to a report by the Reuters news agency, US senators Ed Markey and Richard Blumenthal said the recall was “long overdue,” adding that “Tesla must finally stop overstating the real capabilities of its vehicles.”
NHTSA’s recall is the latest in a series of safety interventions in the United States concerning Tesla’s driver assistance systems. Last year, the company recalled almost 54,000 US vehicles running FSD beta software because the system allowed vehicles to perform rolling stops, posing a safety risk. The agency also has an ongoing investigation into 830,000 Tesla vehicles equipped with Autopilot, prompted by a string of crashes with parked emergency vehicles, which is examining whether Tesla’s systems adequately ensure that drivers are paying attention.
Since 2016, NHTSA has opened more than three dozen investigations into Tesla crashes in which advanced driver assistance systems were suspected of being in use, with 19 deaths reported. Former NHTSA senior safety advisor Prof Missy Cummings recently reiterated her concerns about the safety risks posed by such assisted driving systems, including General Motors’ Super Cruise, notably because drivers place too much trust in the technology.
The EU has no equivalent agency overseeing vehicle safety, and no data are available on the number of crashes in Europe involving driver assistance systems. ETSC says such an agency is essential to the safe further rollout of assisted and automated driving systems. In September, Kristian Schmidt, the European Commission’s coordinator for road safety, said that the idea of establishing such an agency was ‘gaining ground’.