Autonomous vehicle manufacturers need a better yardstick to show that their products are safe, said Derek Kan, under secretary for policy at the U.S. Department of Transportation.
The metrics that are most widely used by self-driving car developers — miles driven and the frequency of human intervention — alone are insufficient to demonstrate the safety of an autonomous automobile, Kan said at a conference in Washington on Tuesday.
Kan’s comments strike at a key aspect of the driverless car industry’s competition for public perception, in which companies such as General Motors Co. and Alphabet Inc.’s Waymo unit tout millions of test miles traveled under computer control, and cite the number of times a test engineer had to intervene, as measures of their vehicles’ self-driving progress.
Both of those metrics have flaws, Kan said. For one, all miles aren’t created equal. Navigating the complex streets of Manhattan is far more challenging than, say, cruising on an empty highway. Disengagements — or instances when an engineer takes over from a self-driving system — can be influenced by driving and engineering choices, and “don’t provide a very rich data set,” Kan said.
“We are now at that point where we’re saying, ‘Help us think through the right metrics’” to prove that a car driven by artificial intelligence is at least as safe as a human driver, Kan said. He spoke at a conference organized by Nvidia Corp., which makes high-performance processors used in artificial intelligence applications, including self-driving cars.
Kan said the department wants input from companies on how safety should be measured as part of an effort to reform federal policies for autonomous vehicles.
As part of the department’s broader self-driving push, the National Highway Traffic Safety Administration is seeking comments to guide a new series of autonomous vehicle test projects in which the agency could temporarily lift rules that pose barriers to those tests. The agency also plans to propose and seek comment on changes to existing vehicle safety standards to allow for new automated vehicle designs, such as removing a steering wheel or foot pedals.
Transportation Secretary Elaine Chao has repeatedly stressed the need for companies to create public trust in the still-developing industry. Polling shows many Americans are leery of autonomous vehicles, and a series of accidents and fatalities involving semi- and fully self-driving technologies has further heightened scrutiny.
A pedestrian was struck and killed by a self-driving SUV tested by Uber in Arizona in March, and a California driver died the same month when his Tesla SUV slammed into a highway barrier while using the car’s semi-autonomous driving system.