“Wait, did you switch on the turn signal,” I asked the person behind the wheel of an autonomous Audi Q5 developed by Delphi, “or did the vehicle do that on its own?” The test driver didn’t answer. Instead he closely watched the driver on the opposite side of the intersection near Delphi’s lab in Mountain View, California, as she crawled along without signaling her intentions.
“The automated system takes over and drives the route once it’s programmed,” Glen DeVos, Delphi’s VP of services, said from the backseat. As we watched the pokey human driver slowly turn right – sans signal – the Audi accelerated through the intersection just as the light turned yellow.
It’s the kind of scenario that human drivers confront every day and that autonomous technology will need to negotiate before it’s ready for the road. But judging from this and other situations that the autonomous Audi confronted on a drive to demonstrate the Centralized Sensing Localization and Planning (CSLP) system the company is developing with Mobileye, the technology has the potential to accelerate the arrival of self-driving cars.
It does this by using an array of sensors and sophisticated processing that will allow the technology to go into production by 2019. DeVos explained that the Audi uses “one lidar sensor facing rear, another facing the front, and on all four corners.”
But unlike a Google self-driving Lexus RX, one of which I spotted on the way to the Delphi Lab (and rode in last year), the Audi Q5 has lidar sensors incorporated into the vehicle so that there isn’t a gangly gumball-machine-looking contraption on top. “We wanted it to look like a production vehicle and integrate the sensors where they would be in production,” DeVos added.
In addition to the lidar sensors, the Audi also has a forward-looking radar sensor and one at each corner of the car. “With our Mobileye partnership,” DeVos added, “we’ll put full surround-view cameras as well as the latest tri-view camera looking forward, so it will be the most advanced vision system available.”
Throughout the half-hour drive, the CSLP system showed how self-driving technology is quickly evolving by deftly handling situations similar to the left turn at the traffic light that started the demo. “One of the things we want to do is understand all these interesting use cases,” said DeVos.
“On the one hand, you can’t be so timid that you don’t go and get out into traffic, and on the other you can’t be too aggressive,” he added. “That’s where we think the machine learning will really be beneficial to govern how a vehicle behaves.”
Delphi and Mobileye call this behavioral modification of machines “semantic understanding,” which allows the vehicle to not only know its surroundings but also the intentions of others within it. “Semantic understanding is a set of subtle visual cues,” added Erez Dagan, senior VP of advanced development and strategy for Mobileye and DeVos’s backseat companion during our drive.
“It’s something that you use to define the conduct of the vehicle,” Dagan added. “There are more straightforward semantics, such as arrows on the road, traffic lights and such. But you want to not only know whether an object is a sidewalk or an exit, but also assess the posture of a pedestrian. That gives you some indication of the probability that they may be about to cross into the street.”
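Dagan’s idea of reading a pedestrian’s posture as a probability of crossing can be illustrated with a toy scoring function. This is a sketch for illustration only, with invented cue names and weights; it is not Mobileye’s model, which relies on learned vision systems rather than hand-set rules:

```python
def crossing_probability(facing_road: bool, near_curb: bool,
                         moving_toward_curb: bool) -> float:
    """Toy illustration of 'semantic understanding': combine subtle
    visual cues into a rough probability that a pedestrian is about
    to step into the street. Weights are invented for illustration."""
    score = 0.1  # small base rate for any pedestrian near the road
    if facing_road:
        score += 0.3
    if near_curb:
        score += 0.3
    if moving_toward_curb:
        score += 0.3
    return min(score, 1.0)

# A pedestrian facing the road, at the curb, and walking toward it
# scores far higher than one standing still facing away.
print(crossing_probability(True, True, True))
print(crossing_probability(False, False, False))
```

The point of the sketch is the shape of the problem, not the numbers: the planner consumes a likelihood of intent, not just a bounding box labeled “pedestrian.”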
As if on cue, the Audi stopped at a light, ready to make a right turn. But it didn’t breach the crosswalk the way a human driver might, and instead waited while a pedestrian crossed in front of us. “Even if that person was way over on the edge of the crosswalk,” said DeVos, “we set up a rule that we’re not going to encroach into that crosswalk while anyone is in it.”
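The crosswalk rule DeVos describes is simple enough to express as a guard condition. A minimal sketch in Python, with hypothetical names and structure invented for illustration (not Delphi’s actual code):

```python
from dataclasses import dataclass

@dataclass
class Pedestrian:
    # True if the pedestrian is detected anywhere inside the
    # crosswalk region, even at its far edge.
    in_crosswalk: bool

def may_enter_crosswalk(pedestrians: list[Pedestrian]) -> bool:
    """Hypothetical guard: never encroach into the crosswalk while
    anyone at all is in it, regardless of how far away they are."""
    return not any(p.in_crosswalk for p in pedestrians)

# Someone at the edge of the crosswalk still blocks the turn.
print(may_enter_crosswalk([Pedestrian(in_crosswalk=True)]))
print(may_enter_crosswalk([]))
```

A hard rule like this errs on the side of caution, which is exactly the conservative behavior DeVos says the car is programmed for.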
“Just like a human driver, it would need to accelerate, find a gap and pull in,” DeVos added. “This car is programmed to be very conservative. The challenge is how do you teach the car to do that on its own.”
Just after making the turn, the Audi was confronted with its next decision: a trash truck stopped in the right lane. The Audi paused for just a second or two and then slowly pulled into the left lane to move around the truck, just as a human driver would. When it came across two separate road construction scenarios in which traffic had to merge into a single lane – and one involved workers in the roadway – the Audi handled each by barely slowing down.
This kind of artificial intelligence should only improve once the CSLP system integrates Mobileye’s Road Experience Management (REM) mapping technology, which was introduced at CES 2016 and will capture road data using cameras in millions of cars. Delphi and Mobileye also believe that REM will let them sidestep the more “map-heavy” approach favored by Google and many automakers, in which a self-driving vehicle relies on detailed prebuilt maps to know its location and, more importantly, its surroundings.
But such meticulous yet static maps can quickly become outdated and fail to reflect the sort of road construction we encountered during our autonomous drive. “This gives rise to the issue of how fresh the maps would be in an autonomous vehicle,” said DeVos. “If those construction areas were set up three hours ago, you need to be able to rely on your map to show them. And that’s the key proposition of REM.”
Autonomous vehicles that rely on detailed maps also require a robust connection to constantly feed accurate mapping data to the car, whereas the REM system uploads data in small bursts that can be easily handled by the 4G LTE connectivity already found in many vehicles. The real-time, constantly updated data provided by REM can also be shared among millions of cars equipped with the system.
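The contrast between streaming full maps and uploading “small bursts” can be made concrete with a rough sketch. The field names and segment identifier below are invented for illustration; the point is only that a compressed delta of what one car observed is tiny compared with a full high-definition map:

```python
import json
import zlib

# Hypothetical observation a REM-style car might upload: not the map
# itself, just what changed on one road segment it drove through.
observation = {
    "segment_id": "ca-101-0421",  # invented identifier
    "landmarks": [{"type": "cone", "offset_m": 12.4}],
    "lane_closed": True,
}

payload = zlib.compress(json.dumps(observation).encode())
print(len(payload))  # on the order of tens of bytes
```

A burst that small travels comfortably over the 4G LTE connections already in many cars, and the aggregated bursts from millions of vehicles are what keep the shared map fresh.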
While Tesla’s Autopilot system uses a similar crowd-sourcing approach to improve performance, DeVos pointed out that it’s from a relatively limited pool of vehicles. “Any vehicle that has a Mobileye camera system that’s REM-enabled becomes an information-gathering vehicle – whether it’s automated or not,” he noted.
The REM technology, as well as the CSLP system, can also be “seamlessly integrated with existing vehicle platforms,” DeVos said, which means the new car you buy in 2020 could handle most of the driving – and will always remember to use a turn signal even when other drivers don’t.
Originally published by Forbes.com