ALONG WITH ROBOT butlers, billboard-sized TVs, and inadequately sanitized wearables being tried on by untold hordes, self-driving demonstrations have become a staple of CES. As the show takes over Las Vegas, the Strip, hotel parking lots, and side streets play host to robo-vehicles with spinning sensors on the roof, pods with splashy logos, and even autonomous Lyfts. Usually, these demos go the same way: You sit in the back and try to glean whatever you can from a carefully staged ride.

So it was odd to find myself this week in the driver’s seat of a Lincoln MKZ that looked like a full self-driver, sensors and bold logos included. And I was being told not just that I’d have to drive, but that I would be monitored—and graded—on my concentration, trust, and emotional state.

This metallic blue sedan is the Learning Intelligent Vehicle, and the computer and I are going to drive it together, which, rather than full autonomy, is what many serious players in the self-driving space now think is a realistic near-term goal.

This step-by-step approach to autonomy, where the machine gradually takes over the work of driving, started to go out of fashion in 2012, when Google’s self-driving project (now Waymo) decided it was safer to go full-robo than to find a way to make the human and computer work together effectively. Much of the auto industry reached the same conclusion over the next few years, even vowing to take the steering wheel and pedals out of their cars.

The mastery to land that moonshot, though, has proven elusive, even in carefully prescribed areas like the Phoenix suburbs where Waymo operates. And so the gradual approach, built on capabilities achievable in the near term, is making something of a comeback, with a focus on the human-machine interaction.

“The expectations that the industry had are now significantly pushed out,” says Nishant Batra, chief technology officer of Veoneer, the outfit that built the Learning Intelligent Vehicle, or LIV 3.0. Veoneer spun off from safety-focused industry supplier Autoliv in 2018 to focus on self-driving and driver-assistance features. “The big—I don’t want to say hype—but expectation around full autonomy has not come through,” Batra says. But, he thinks, smaller steps toward autonomy can improve safety and convenience for drivers in the here and now.

Indeed, many autonomy-focused companies in Vegas this week say that drivers should first expect to see assistance features that require human interaction or oversight. These use the same sensors and the same sort of software as more capable cars, but can be certified and put on the market right about now. “The industry has realized this elephant has to be eaten in smaller bites,” Batra says.

One of the first mouthfuls is monitoring the driver for distraction. So on my ride, Veoneer engineer Constantin Coestr, riding shotgun, asks me to name each landmark I drive past. Seems easy enough. “The Eiffel Tower. Walmart. Sydney Opera House,” I call out. “Cher. Elvis. This is hard.” Although you probably could see facsimiles of those things (maybe with a real Walmart) on a typical drive through Vegas, we’re on a closed circuit behind the convention center, passing painted signs and inflatable buildings. “You’re driving too slowly,” says Coestr. As I try to focus on the world outside and keep pace while staying on the road, a robotic female voice booms from the speakers.

“You seem distracted,” it says. “I’m going to take over. Autonomous drive active.” Bright green LEDs light up around the rim of the steering wheel, and the car starts moving automatically. Demoted but relieved, I sit back and enjoy the view. Oh, there’s the White House!

This is a core function of LIV. Using an infrared camera above the center stack, the car’s computer monitors the driver, looking for distraction or confusion, or even anger or happiness, by interpreting face and head position, and pupil size. If it determines you’re distracted, it suggests you let it take over. If you’re really having trouble—like I was—it just takes over. Veoneer calls this “collaborative driving,” and it’s the sort of system most car buyers will experience long before they get to ride in a fully autonomous car. If, that is, Veoneer can convince automakers to take it on.
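The handover logic described above amounts to a small state machine: the car watches a distraction signal, suggests taking over past one threshold, and takes over outright past a higher one. Here is a minimal illustrative sketch in Python. The mode names, thresholds, and the idea of a single normalized distraction score are my own simplifications, not Veoneer's actual design.

```python
from enum import Enum

class DriveMode(Enum):
    MANUAL = "manual"                 # human drives
    SUGGEST_HANDOVER = "suggest"      # car offers to take over
    AUTONOMOUS = "autonomous"         # car drives

def next_mode(distraction: float, mode: DriveMode,
              suggest_at: float = 0.5, takeover_at: float = 0.8) -> DriveMode:
    """Pick the next drive mode from a normalized distraction score (0 to 1).

    Thresholds are hypothetical, purely for illustration.
    """
    if distraction >= takeover_at:
        # Driver is really struggling: the car just takes over.
        return DriveMode.AUTONOMOUS
    if distraction >= suggest_at and mode is DriveMode.MANUAL:
        # Mild distraction: the car suggests the driver let it take over.
        return DriveMode.SUGGEST_HANDOVER
    if distraction < suggest_at and mode is DriveMode.SUGGEST_HANDOVER:
        # Attention recovered before handover: stay in manual control.
        return DriveMode.MANUAL
    return mode
```

In my drive, the jump to `AUTONOMOUS` happened without waiting for my consent, which matches the higher-threshold branch here; a production system would also need a path for handing control back to an attentive driver.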

Regardless of what Veoneer pulls off, drivers already have access to systems like Tesla Autopilot and Cadillac Supercruise, which on the highway can handle the driving, but require constant human supervision. Progress will eventually demote the human from overseer to backup, for when the computer encounters conditions it can’t handle. And as the human role diminishes, the mechanisms for handing control back and forth will have to get more sophisticated.

“The industry now needs to be really responsible and mature,” says Dennis Nobelius, CEO of Zenuity, a joint venture Autoliv and Volvo created to develop autonomous and driver assistance features. “You need to be really clear that, now the driver is responsible, or now the car is responsible,” he says. “We don’t know exactly how that looks today.” Collaborative systems like the LIV, offering help when and where they can, are one possibility.

My drive in Veoneer’s car wasn’t perfect. The automated system, for example, called out every stop sign. I prefer the old-fashioned method of seeing those with my eyes. Eventually, the engineers say, the computer would learn that the reminders irritate me and quit them. (In my case, the best driver monitoring would pick up on eye rolls and gritted teeth.)

The demo did prove, though, that there are ways to improve my interaction with a car and the environment. And it showed that however long it takes for the car to do all the driving, we won’t have to wait so long to offload some of the work onto the machine.