STOP ME IF you’ve heard this one before. On June 11, a self-driving Cruise Chevrolet Bolt had just made a left onto San Francisco’s Bryant Street, right near the General Motors-owned company’s garage. Then, whoops: Another self-driving Cruise, this one under the control of a human Cruise employee, thumped into its rear bumper. Yes, very minor Cruise-on-Cruise violence.
According to a Department of Motor Vehicles report, the kind any autonomous vehicle tester must submit to the state of California after an incident, both vehicles escaped with only scuffs. “There were no injuries and the police were not called,” Cruise reported.
A single incident does not a metaphor about self-driving technology make, but Cruise has had flurries of bumping and rear-ending incidents in San Francisco, where it has tested its technology since 2016. Many of these are unserious and relatively unremarkable, the sort of thing that might happen to a human driver and that an insurance company would never hear about.
Some are scarier, meriting check-ins to the hospital or legal wranglings. A California motorcyclist filed a lawsuit against GM, alleging a lane-changing Cruise AV knocked him off his bike and injured his back and shoulder. (GM settled the suit in June.) Some have been weird. One Cruise car got slapped by a cabbie. Another took a golf ball to the windshield while driving near a city course. (No, yelling “fore!” does nothing for a robot.)
Why the bumps and bruises? Well, because humans. To its credit, Cruise has chosen to test its cars in a super-challenging environment, the dense and oft-surprising streets of San Francisco. (In January, at least one pedestrian leapt into a Mission neighborhood crosswalk, “shouting, and struck the left side of the Cruise AV’s rear bumper and hatch with his entire body,” according to a DMV report.) Here, there are many opportunities to capture data on edge cases, the sorts of road activity (Traffic! Weird lane changes! Foul fog! Construction zones!) that self-driving cars need to understand before they can perform perfectly every time.
The company also says it purposefully programs its cars to be almost too cautious, to brake when, for example, a cyclist even hints that she might be darting across the road. Last year, CEO Kyle Vogt told reporters that Cruise wants to nail safety before it focuses on smoothing out the herky-jerky behavior that might leave riders a bit queasy, and fellow road users a bit confused. (The company plans to launch a limited driverless taxi service in 2019.)
That said, the rear-endings demonstrate that the technology is far from perfect. Cruise cars follow road laws to a T, coming to full stops at stop signs and braking for yellow lights. But human drivers don’t—and Cruise cars will be self-driving among humans for decades to come. “There has to be a way for these cars and people to share the road in a more efficient and understanding manner,” a Cruise spokesperson said.
And that’s annoying, because humans are deeply imperfect. The fact that a driver Cruise trained to work with these vehicles still managed to rear-end one emphasizes exactly how flawed humans are. To create a robot that operates with perfect safety among people, the vehicles just might have to learn to emulate some of their worst qualities. Just as long as they don’t start slapping people.