Detroit — The government’s auto safety agency is investigating whether last year’s recall of Tesla’s Autopilot driving system did enough to make sure drivers pay attention to the road.
The National Highway Traffic Safety Administration says in documents posted on its website Friday that it has concerns about the December recall of more than 2 million vehicles, nearly all the vehicles that Tesla had sold at the time.
The agency pushed the company to conduct the recall after a two-year investigation into Autopilot’s driver monitoring system, which measures torque on the steering wheel from a driver’s hands.
The fix involves an online software update to increase warnings to drivers. But the agency said in documents that it’s found evidence of crashes after the fix and that Tesla added updates that weren’t part of the recall.
“This investigation will consider why these updates were not part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk,” the agency wrote.
A message was left early Friday seeking comment from Tesla.
The new recall probe covers Model Y, X, S, 3 and Cybertruck vehicles in the U.S. equipped with Autopilot from the 2012 through 2024 model years, NHTSA said.
NHTSA also said Friday it is concerned that the “Autopilot” name “may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”
The agency said Tesla reported 20 crashes that apparently happened after the recall remedy was sent out. The agency has required Tesla and other automakers to report crashes involving partially and fully automated driving systems.
The NHTSA said it will evaluate the recall, including the “prominence and scope” of Autopilot’s controls to address misuse, confusion and use in environments that the system isn’t designed to handle.
It also said Tesla has stated that owners can choose whether to opt into parts of the recall remedy, and that drivers can reverse parts of it.
Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and a set distance from objects in front of it, wasn’t designed to operate on roads other than limited-access highways.
The investigation comes just a week after a Tesla that may have been operating on Autopilot hit and killed a motorcyclist near Seattle.
After the April 19 crash in a suburban area about 15 miles northeast of the city, the driver of a 2022 Tesla Model S told a Washington State Patrol trooper that he was using Autopilot and looked at his cellphone while the Tesla was moving.
“The next thing he knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him,” the trooper wrote in a probable-cause document.
The 56-year-old driver was arrested for investigation of vehicular homicide “based on the admitted inattention to driving, while on Autopilot mode, and the distraction of the cell phone while moving forward, putting trust in the machine to drive for him,” the affidavit said.
The Tesla driver told the trooper he was driving home from having lunch when the crash occurred at about 3:45 p.m.
The motorcyclist, Jeffrey Nissen, 28, of Stanwood, Washington, was trapped under the car and pronounced dead at the scene, authorities reported.
Authorities said they haven’t yet independently verified whether Autopilot was in use at the time of the crash.
The Associated Press reported shortly after the recall that experts said the fix relied on technology that may not work.
Tesla, the leading manufacturer of EVs, reluctantly agreed to the recall last year after NHTSA found that the driver monitoring system was defective and required a fix.
The system sends alerts to drivers if it fails to detect torque from hands on the steering wheel, a method that experts describe as ineffective.
Government documents filed by Tesla say the online software change will increase warnings and alerts to drivers to keep their hands on the steering wheel. It also may limit the areas where the most commonly used versions of Autopilot can be used, though that wasn’t entirely clear in Tesla’s documents.
The NHTSA began its investigation in 2021, after receiving 11 reports that Teslas that were using the partially automated system crashed into parked emergency vehicles. Since 2016, the agency has sent investigators to at least 35 crashes in which Teslas that were suspected of operating on a partially automated driving system hit parked emergency vehicles, motorcyclists or tractor trailers that crossed in the vehicles’ paths, causing a total of 17 deaths.
Research conducted by NHTSA, the National Transportation Safety Board and other investigators shows that merely measuring torque on the steering wheel doesn’t ensure that drivers are paying sufficient attention. Experts say night-vision cameras are needed to watch drivers’ eyes to ensure they’re looking at the road.