On a typical weekday, Walter Huang usually woke around 7 a.m., made breakfast for his family, then dropped his son and daughter off at their schools before driving to work.
On that commute, he sometimes had trouble with the Autopilot driver-assist system in his Tesla Model X, always at the same Silicon Valley location.
Huang complained to family and friends that his car repeatedly veered toward a concrete barrier at a specific spot along southbound U.S. 101. Data analyzed by crash investigators confirmed that at least one of these incidents occurred four days before his Model X swerved into that barrier and claimed his life.
Details about the March 23, 2018, crash in Mountain View, Calif., were contained in a trove of documents released last week by the National Transportation Safety Board, which has examined the case and others like it. Huang’s case is the fourth investigation the NTSB has undertaken in recent years involving Autopilot. It’s unusual for the agency to devote such effort to probing a single technology, especially given the likelihood that its findings will echo those from previous Autopilot crash investigations and result in the same recommendations.
Why might the NTSB conduct more investigations merely to reiterate what it has already said? Maybe because that’s all it can do.
The agency is charged with investigating crashes and making recommendations on how to improve safety. It has no authority to implement changes. That power belongs to the U.S. Department of Transportation and the National Highway Traffic Safety Administration, which have not yet seen fit to mandate any of the changes the NTSB has suggested, not just for Autopilot but for all driver-assist systems.
“This is NTSB crying out for NHTSA to do something, and it seems to be availing itself of the only real mechanism it has to do that, which is to publish detailed reports that lead technical experts to understand what is rather obvious: Which is that there’s a problem here,” said Jason Levine, executive director of the Center for Auto Safety.
NHTSA may be well aware of that criticism. The regulatory agency launched its own probe into the first fatal crash involving Autopilot, which occurred in May 2016, when Josh Brown drove his Tesla Model S beneath a tractor-trailer that crossed his path along a Florida highway.
Ultimately, the agency determined the system had operated within its intended design domain. That is, because it was not designed to sense a truck crossing its path, no defect existed. Tesla CEO Elon Musk took that conclusion as a full-blown exoneration of the system. In retrospect, that seems premature.
Driver inattention and overreliance on vehicle automation are probable causes of these crashes, the NTSB has found. But the agency also says Autopilot is part of the problem: By design, it permits drivers to disengage from the driving task.
Investigations are mounting as the death toll rises. The NTSB is investigating a deadly 2019 crash in Delray Beach, Fla., with circumstances nearly identical to those of the 2016 crash. Meanwhile, NHTSA’s Special Crash Investigations division has opened 14 separate probes involving Autopilot, including two ongoing examinations of separate fatal crashes that occurred Dec. 29, 2019.
Tesla did not respond to a request for comment last week.
Pressure is mounting on federal regulators to do something, up to and including using their recall authority, to compel changes in everything from how Autopilot ensures a human driver’s attention stays on the road to how Tesla markets the feature.
Perhaps regulators are inching toward those changes. Last month, U.S. Transportation Secretary Elaine Chao said her department is supporting a “Clearing the Confusion” initiative that seeks to clarify the jumble of terminology automakers have created to market driver-assist technology.
Also last month, the DOT released the fourth iteration of its federal automated vehicles policy, in which Chao underscores that dozens of separate federal departments and agencies have finally aligned on a coherent national policy toward vehicle automation.
In theory, that alignment may mean an agency such as the Federal Trade Commission is better equipped to examine Tesla’s marketing of Autopilot. But Levine, whose safety advocacy organization has called upon the FTC to launch such an investigation, isn’t buying the idea that a more organized government is one with more teeth.
“When the agency with the lead responsibility is clearly doing nothing,” he said, “why should the others step out of their general jurisdictional lanes to support efforts that aren’t actually moving?”
Inaction speaks louder than words. And more families such as Huang’s may be left to wonder whether their loved ones would still be alive if someone with the power to act had done so.