Gauging the safety of self-driving vehicles has become increasingly vexing.

Manufacturers often tout automated vehicles as technological marvels that will dramatically reduce the human toll of crashes. But that’s a broad promise, and a new Rand Corp. study says a safety framework is needed as automakers prepare for commercial deployments.

“The meaning of safety in regard to AVs is surprisingly unclear — no standard definition exists,” says the nonprofit think tank’s study issued last week. The study, “Measuring Automated Vehicle Safety,” was sponsored by Uber.

Rand’s findings suggest common ground is needed to build trust among government officials and members of the public alike as an automated-driving age approaches. Further, the study is critical of the secretive nature of vehicle development.

Even as companies such as Waymo tout the millions of miles driven in autonomous mode, Rand researchers say the number of vehicle miles traveled during industry testing is insufficient to provide meaningful data on the safety of self-driving operations.

Rather than wait to deploy self-driving vehicles until statistics can prove they’re safer than human-driven vehicles, and risk missing out on their benefits in the interim, Rand says industry leaders, academics and policymakers should develop leading indicators: measures that correlate with positive safety outcomes, rather than counts of crashes after they happen.
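To illustrate the distinction between a leading indicator and a crash count, here is a minimal sketch in Python of how such an indicator might be expressed as a rate per miles driven; the event type and figures are hypothetical, not drawn from the Rand study.

```python
def leading_indicator_rate(event_count, miles_driven, per_miles=1000.0):
    """Rate of safety-relevant events (e.g., hard braking, near misses,
    would-be traffic violations) per `per_miles` miles driven."""
    if miles_driven <= 0:
        raise ValueError("miles_driven must be positive")
    return event_count * per_miles / miles_driven

# Hypothetical example: 12 hard-braking events logged over 8,400 test miles.
print(leading_indicator_rate(event_count=12, miles_driven=8400))  # ~1.43 per 1,000 miles
```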

“We know there can be benefits with self-driving cars, and we don’t want the perfect to be the enemy of the good,” said Marjory Blumenthal, project leader for the study. “The challenge is measuring.”

New barometers

One of the few yardsticks has been the annual disengagement reports manufacturers are required to submit to the California Department of Motor Vehicles. Manufacturers must list and describe the circumstances in which their self-driving systems disengage from autonomous mode or drift toward unsafe situations that require a human driver to intervene.

But the disengagement reports are, at best, snapshots rather than a comprehensive tool. They offer no context on the nature of the tests: whether they take place in complex urban environments or on rural highways, or in what weather conditions. Operators themselves say that what constitutes a disengagement, and thus triggers a requirement to report, is open to interpretation.

Rand introduced the idea of “roadmanship” as a potential measure for the competence of self-driving systems. Instead of counting crashes or disengagements, a system could measure events correlated with safety outcomes, such as whether a vehicle’s actions would have violated a traffic law.

Beyond legality, roadmanship could benchmark how a self-driving system interacts with the traffic around it. It could set safety envelopes and lateral distances, and distinguish vehicles that cause unsafe conditions from those responding to them. Rand notes that the Responsibility-Sensitive Safety model proposed by computer-vision supplier Mobileye, which translates human notions of safe driving into mathematical rules for vehicles to follow, is an example of roadmanship.
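For readers curious what such a mathematical rule looks like in practice, below is a minimal sketch in Python of the published Responsibility-Sensitive Safety minimum following-distance formula; the parameter values in the example are illustrative assumptions, not figures from the Rand study or from Mobileye.

```python
def rss_min_following_gap(v_rear, v_front, response_time,
                          a_max_accel, a_min_brake_rear, a_max_brake_front):
    """Minimum safe longitudinal gap (meters) between a following vehicle and the
    vehicle ahead, per the RSS formula: assume the rear car may accelerate for the
    full response time, then must be able to brake gently while the front car
    brakes hard. Speeds are in m/s, accelerations in m/s^2, response_time in seconds.
    """
    v_after_response = v_rear + response_time * a_max_accel
    gap = (v_rear * response_time
           + 0.5 * a_max_accel * response_time ** 2
           + v_after_response ** 2 / (2 * a_min_brake_rear)
           - v_front ** 2 / (2 * a_max_brake_front))
    return max(gap, 0.0)  # a non-positive result means any gap is already safe

# Illustrative values: following a car doing 25 m/s while travelling at 30 m/s.
print(rss_min_following_gap(v_rear=30.0, v_front=25.0, response_time=0.5,
                            a_max_accel=2.0, a_min_brake_rear=4.0,
                            a_max_brake_front=8.0))
```

A vehicle that keeps its measured gap above this threshold would, under the model, never be the party at fault in a rear-end collision.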

“We need indicators before we get to adverse events,” Blumenthal said.

Less secrecy

To achieve safety benefits, companies will need to share data on the circumstances in which their systems encounter on-road difficulties. Rand suggests data stemming from crashes and unexpected events be shared with crash investigators, government officials and academics. That’s not happening now.

While transportation officials have talked about data sharing as a way to improve automated vehicle safety, the Rand study says, “The industry appears to be ambivalent about whether, when, where and how to coordinate on key aspects correlated to development. … In brief, there seems to be more talking about sharing than actual sharing.”

Even organizations such as the National Transportation Safety Board, the federal agency that investigates crashes, must rely on the help of manufacturers to access and understand the data streams that detail their self-driving operations.

Sharing information and agreeing on safety parameters will help manufacturers gain support for self-driving vehicles from a skeptical public, the study says. Nearly three-quarters of drivers are afraid to ride in fully self-driving vehicles, according to a study by AAA this year. But beyond helping to win over future customers, Rand argues manufacturers have a responsibility to share information with the public.

“On public roads, the roadway becomes a living laboratory,” the study says. “Other road users become involved in a study that they did not consent to take part in and cannot opt out of.”

Like it or not, they’re already part of the self-driving experiment, one undertaken to make dramatic safety gains but still lacking details on how to achieve that progress.