In a more autonomous future, how do you keep imperfect humans behind the wheel engaged?

Until fully autonomous systems become mature, drivers will have to adapt to partial systems and stay alert

FILE - This Jan. 22, 2018, file still frame from video provided by KCBS-TV shows a Tesla Model S electric car that has crashed into a fire engine on Interstate 405 in Culver City, Calif. A government report says the driver of the Tesla that slammed into a firetruck was using the car’s Autopilot system when a vehicle in front of him suddenly changed lanes and he didn’t have time to react. The National Transportation Safety Board said Tuesday, Sept. 3, 2019, that the driver never saw the parked firetruck and didn’t brake. (KCBS-TV via AP, File)

To the human driver, it would have been an obvious obstacle: a police car and fire truck, emergency lights blazing, blocking the lane ahead.

But to the Tesla Model S travelling down a Southern California freeway last year on Autopilot, it was a far more vexing technical challenge, one that is inherently difficult for the growing number of vehicles that automakers are equipping with driver-assist systems.

The car slammed into the rear of the fire truck, resulting in no injuries but drawing the attention of federal investigators concerned about the emerging technology.

“It’s not unique to Tesla,” said David Zuby, chief research officer at the Insurance Institute for Highway Safety, which has studied how automated driver-assist systems perform. “We’ve seen evidence in our test driving of other systems with this kind of problem.”

The radars and cameras used to sense obstructions ahead each have their limitations, and the computer software that evaluates the data is still a work in progress, according to experts and advocates. In many cases, the systems are better at tracking moving vehicles ahead than at recognising parked ones.

To be sure, automated driving systems have clear potential to improve traffic safety by supplementing the driver. Automatic emergency braking alone has been found by IIHS to reduce rates of rear-end crashes by half, and the insurer-funded group estimates that the system could reduce police-reported crashes of all types by 20 per cent.

So far, Tesla is the only carmaker cited by the US National Transportation Safety Board in an accident investigation for how it designed its partially autonomous system, but the case highlights the broader limitations of similar technology. It also puts a spotlight on a related concern: how to keep the imperfect humans behind the wheel engaged.

The sensors on a Tesla and other cars are relatively good at following a vehicle in the same lane and adjusting speed to maintain a safe distance. But when a vehicle changes lanes - known as the “cut-out scenario” - it can leave the trailing vehicle’s sensors struggling to assess what’s ahead.

“The cut-out is one of the hardest scenarios,” said Phil Koopman, an engineering professor at Carnegie Mellon University and co-founder of Edge Case Research, a Pittsburgh-based autonomous vehicle technology company. “There’s no question about that.”
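To see why, consider a toy version of the target-selection logic in an adaptive cruise controller. This is a minimal sketch, not any manufacturer's actual code; the class, fields and numbers are invented for illustration. Many radar-based systems filter out stationary returns to avoid braking for signs and bridges, so when a moving lead car cuts out and leaves only a parked vehicle ahead, the controller can be left with no target at all.

```python
# Toy target selection for adaptive cruise control, illustrating the
# cut-out problem. All names, fields and numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float     # range to the detected object
    is_stationary: bool   # True if the object shows no ground speed

def choose_target(returns):
    """Pick the nearest *moving* object to follow. Stationary returns are
    filtered out to avoid phantom braking for signs and bridges - which is
    exactly what makes a parked truck revealed by a cut-out so dangerous."""
    moving = [r for r in returns if not r.is_stationary]
    return min(moving, key=lambda r: r.distance_m) if moving else None

# Before the cut-out: a moving lead car at 30 m, a parked truck at 80 m.
before = [RadarReturn(30.0, False), RadarReturn(80.0, True)]
# After the lead car changes lanes, only the stationary truck remains.
after = [RadarReturn(60.0, True)]

print(choose_target(before))  # follows the lead car at 30 m
print(choose_target(after))   # None: "open road", so the car speeds up
```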

The radar and camera system on the Tesla involved in the January 22, 2018, crash in Culver City, California, didn’t “see” the fire truck in time to brake, according to the NTSB. The car’s automatic braking system didn’t activate, though it gave the driver a collision warning 0.49 seconds before impact, the investigation found.

Several seconds before impact, the vehicle the Tesla had been following changed lanes; the Tesla then sped up and hit the fire truck at 50 kilometres per hour.
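Some quick arithmetic shows how little that warning could have helped. The 0.49-second figure is from the NTSB's findings; the 1.5-second driver reaction time below is a commonly cited rule of thumb, not part of the report.

```python
# Distance covered during the 0.49 s between warning and impact, versus
# the distance a typical driver needs just to begin reacting. The 1.5 s
# perception-reaction time is an assumed rule of thumb, not NTSB data.
speed_ms = 50 / 3.6        # 50 km/h is roughly 13.9 m/s
warning_s = 0.49           # warning-to-impact interval from the NTSB findings
reaction_s = 1.5           # assumed driver perception-reaction time

print(f"travelled during the warning: {speed_ms * warning_s:.1f} m")   # ~6.8 m
print(f"needed just to start braking: {speed_ms * reaction_s:.1f} m")  # ~20.8 m
```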

The Model S involved in the crash was a 2014 model. Since 2016, Tesla's vehicles have gained additional cameras and improved automatic braking and, according to the company, can better handle the cut-out hazard.


Tests replicating cut-outs proved among the most challenging for the automated driving systems examined in 2018 by Euro NCAP, which tests and assigns safety ratings for vehicles in Europe.

Several of the vehicles Euro NCAP tested last year struggled to automatically handle stationary objects and the cut-out scenario. The Mercedes-Benz C-Class, BMW 5 Series and Nissan Leaf, for example, offered “limited” or “very limited” automated support and relied primarily on the driver to handle the situation.

But it’s not just a sudden lane change that can flummox such systems, Mr Koopman said.

The radars typically used on vehicles that have automated braking can’t distinguish very well between a road sign and a stopped vehicle, he said. If a car slammed on the brakes for every object it sensed ahead, it would cause endless false alarms.
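The bind Mr Koopman describes can be reduced to a simple either-or. In the sketch below, with values invented purely for illustration, radar returns from an overhead sign and a stopped car look essentially identical, so a radar-only design must either brake for both or ignore both.

```python
# Radar alone reports range and motion, not object type, so an overhead
# sign and a stopped car can return near-identical data (values invented).
stationary_returns = [
    {"label": "overhead road sign", "distance_m": 55.0},
    {"label": "stopped vehicle",    "distance_m": 55.0},
]

BRAKE_FOR_STATIONARY = False  # the usual compromise in radar-only designs

for obj in stationary_returns:
    if BRAKE_FOR_STATIONARY:
        # Brakes for the stopped vehicle, but also phantom-brakes for the
        # sign: the "endless false alarms" problem.
        print("brake for", obj["label"])
    else:
        # No phantom braking, but the stopped vehicle is missed too.
        print("ignore", obj["label"])
```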

That can pose risks, too. More than 80 drivers of late-model Nissan Rogues have complained to US auto regulators that the SUV’s automatic emergency braking system activated unintentionally; 10 of them said the misfire occurred when the road ahead was clear. The National Highway Traffic Safety Administration may open a defect inquiry into the issue after the Center for Auto Safety, a consumer advocacy group, reported the complaints to the agency.

A Tesla that struck the side of a semi-truck in Florida in 2016, killing the car’s driver, didn’t activate its automatic brakes and wasn’t designed to stop for such stationary objects, according to an NTSB report.

Many driver-assist programmes, including Tesla’s, combine data from cameras with radar. That’s not fail-safe, though, Mr Koopman and Mr Zuby said.
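One common fusion design, sketched below with hypothetical names rather than any carmaker's actual logic, brakes only when both sensors agree. That suppresses false alarms, but it means a single confused sensor can veto a genuine hazard.

```python
# Hypothetical "require agreement" fusion rule for emergency braking.
def should_emergency_brake(radar_sees_obstacle: bool,
                           camera_sees_obstacle: bool) -> bool:
    """Brake only when camera and radar agree. This suppresses false
    alarms, but one confused sensor can veto a genuine hazard."""
    return radar_sees_obstacle and camera_sees_obstacle

# A parked fire truck the radar detects but the camera fails to classify:
print(should_emergency_brake(radar_sees_obstacle=True,
                             camera_sees_obstacle=False))  # False - no braking
```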

Mr Zuby said he has observed other scenarios in which the sensors performed poorly, such as detecting vehicles stopped ahead on highways.

“A lot of times these systems will not see or respond to the queuing traffic to bring the vehicle to a safe stop,” he said.

Until fully autonomous systems - which will rely on more robust and reliable sensors - become mature, drivers will have to adapt to partial systems like those on the Tesla.

And keep their eyes on the road.

FILE PHOTO: The Tesla Model S version 7.0 software update containing Autopilot features is demonstrated during a Tesla event in Palo Alto, California, U.S., October 14, 2015. REUTERS/Beck Diefenbach/File Photo

Some companies, such as General Motors, have installed cameras in vehicles that monitor eye movements to ensure drivers are watching the road. Tesla monitors driver engagement by detecting steering wheel feedback caused by a driver’s hands resting on the wheel. The company also instructs drivers using Autopilot to remain attentive at all times, and dashboard prompts require them to acknowledge that responsibility.
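A rough sketch of how such escalating driver monitoring can work is below; the time thresholds and actions are invented for illustration and are not any manufacturer's actual values.

```python
# Hypothetical escalation ladder for a torque-based hands-on-wheel monitor.
def monitor_step(seconds_hands_off: float) -> str:
    if seconds_hands_off < 15:
        return "ok"
    if seconds_hands_off < 30:
        return "visual warning"    # dashboard prompt to hold the wheel
    if seconds_hands_off < 45:
        return "audible warning"   # escalating chime
    return "disengage assist"      # slow the car, lock out the feature

for t in (5, 20, 40, 60):
    print(t, "seconds hands-off ->", monitor_step(t))
```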

Jason Levine, executive director of the Center for Auto Safety, said manufacturers should make sure drivers are ready to take over in an emergency.

“Because the technology isn’t great at everything, it’s important that the driver stay engaged at all times,” he said.

The driver of the Tesla involved in the California crash told NTSB investigators he was looking ahead, but was unable to see the fire truck in time. A witness in another car told the NTSB and police that the driver was looking down at what appeared to be an electronic device.

The NTSB concluded on Wednesday that the design of Tesla’s automation, which it had previously cited in the 2016 Florida crash, partly caused the accident. It also said the driver’s inattention played a role.

Tesla defended its Autopilot suite of automation systems and said it has adjusted the time intervals between warnings the car gives drivers whose hands aren’t detected on the steering wheel.

“Tesla owners have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance,” the company said in an emailed statement, adding that its driver-monitoring system “repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored”.

Advocates for cutting the death toll on US roadways are eagerly awaiting the maturity of the technology because of its promise to reduce accidents, particularly rear-end collisions, Mr Levine said.

“It is a potential giant leap forward in terms of protecting drivers, passengers and pedestrians,” he added. “It just has to be done right.”