Status Report, Vol. 53, No. 4 | SPECIAL ISSUE: AUTONOMOUS VEHICLES | August 7, 2018

Fatal Uber crash shows risks of testing on public roads

Photo courtesy of the National Transportation Safety Board

Self-driving cars are supposed to be better at averting crashes than human drivers, but tests of prototype vehicles on public roads so far indicate that they aren't always up to the task. Absent regulatory oversight, the race to deploy autonomous vehicles risks jeopardizing public trust, public safety and the lifesaving promise of the technology.

The National Transportation Safety Board (NTSB) on May 24 issued a four-page preliminary report on the first fatal crash involving a pedestrian and a self-driving vehicle operating under the control of a computer, not a person.

Until the March 18 tragedy, Uber Technologies Inc. had done extensive testing of its fleet of prototype autonomous vehicles in Arizona. That testing has since been shelved in the state.

The details of the March 18 crash in Tempe, Arizona, are by now well-known. What wasn't known publicly until the NTSB report's release, however, is that while an Uber experimental vehicle operating in self-driving mode is capable of detecting impending conflicts, it isn't programmed to brake or warn the test operator to take action.

"According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior," the NTSB report states. "The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."

Elaine Herzberg, 49, was walking her bicycle across a four-lane arterial road around 10 p.m. when she was struck by a 2017 Volvo XC90 modified with Uber's sensors and software to operate in autonomous mode. Herzberg had crossed more than three lanes before she was struck by the SUV at about 39 mph. Dash cam video released by Tempe police shows Herzberg didn't look in the SUV's direction until just before it hit her.

In the Uber, the lone test operator at the wheel wasn't watching the road just before impact, according to the NTSB report and the dash cam video. The operator told investigators that she had been monitoring the system interface in the center console. Tempe police in June released a report indicating that the operator's smartphone was streaming a TV show in the 42 minutes preceding the crash.

The NTSB report indicates that the Uber self-driving system first detected Herzberg about 6 seconds before impact, initially classifying her as an unknown object, then as a vehicle and then as a bicycle. At 1.3 seconds before impact, the report states, the system "determined that an emergency braking maneuver was needed to mitigate a collision." The test operator grabbed the steering wheel less than a second before impact and began braking just after hitting Herzberg.
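For a sense of the distances involved, that timeline can be translated into approximate positions. The short sketch below is a rough back-of-the-envelope calculation, not part of the NTSB report; it assumes, as a simplification, that the SUV was traveling near its roughly 39 mph impact speed over those final seconds.

```python
# Rough distances implied by the NTSB timeline, assuming the SUV held
# close to its ~39 mph impact speed over the final 6 seconds
# (a simplification not stated in the report).
MPH_TO_FPS = 5280 / 3600  # feet per second for each mph

speed_fps = 39 * MPH_TO_FPS  # about 57 ft/s

for label, seconds in [("first detection", 6.0),
                       ("emergency braking deemed necessary", 1.3)]:
    distance_ft = speed_fps * seconds
    print(f"{label}: {seconds} s before impact, roughly {distance_ft:.0f} ft away")
```

Under that assumption, the system first detected Herzberg when she was roughly 340 feet ahead and judged emergency braking necessary at roughly 75 feet.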

The Uber self-driving system was operating normally at the time of the crash, with no faults or diagnostic messages.

"What's chilling is that the engineers behind Uber's software program disabled the system's ability to avoid a life-or-death scenario while testing on public roads," says David Zuby, the Institute's chief research officer. "Uber decided to forgo a safety net in its quest to teach an unproven computer-control system how to drive."

Autobrake with pedestrian detection

Institute staff have logged more than 80,000 combined highway miles behind the wheels of cars and SUVs equipped with advanced driver assistance technologies to gauge the performance and quirks of each system and how drivers interact with and view them (see "Drivers prefer automated systems that operate smoothly," March 29, 2018, and "IIHS-HLDI test drives uncover driver assistance system quirks," Nov. 10, 2016).

For the past several years, IIHS and HLDI researchers have studied the crash avoidance technologies that are the precursors of autonomous driving systems, analyzing data in insurance claims and police reports and conducting test track, on-road and lab evaluations (see "Park assist helps drivers avoid backing crashes," Feb. 22, 2018, and "Stay within the lines: Lane departure warning, blind spot detection help drivers avoid trouble," Aug. 23, 2017).

The XC90 is among the vehicles IIHS researchers have tested.

The model involved in the Uber crash was equipped with Volvo's automatic emergency braking and pedestrian detection system, which is designed to prevent or mitigate pedestrian crashes. On Volvos sold to consumers, the technology is switched on by default.

Uber diagram

At 1.3 seconds before impact, Uber's system determined that emergency braking was needed but took no action, the NTSB found. (Photo courtesy of the National Transportation Safety Board)


"The crash avoidance system on the XC90 would have prevented or mitigated this crash, but it was never given the opportunity to intervene or even alert the test driver," Zuby says.

The NTSB report states that "All these Volvo functions are disabled when the test vehicle is operated in computer control but are operational when the vehicle is operated in manual control."

In IIHS tests, the XC90 earns the highest rating of superior for front crash prevention. IIHS doesn't yet rate autobrake systems for pedestrian detection but has conducted extensive research tests. In 35 mph track tests of an XC90, the Volvo system proved highly capable of avoiding a pedestrian in its path.

Euro NCAP gave the same pedestrian detection system on a 2017 XC60 high marks for its ability to completely avoid collisions with pedestrians at speeds up to about 37 mph. And a test in the U.K. by Thatcham indicates that Volvo's system is capable of braking for a pedestrian walking a bicycle across the vehicle's path in the dark.

Braking when the Uber's sensors first detected something in the road would have given the system more time to identify Herzberg as a pedestrian and Herzberg more time to finish crossing the road.

"At 6 seconds out, the automated driving system had a range of choices it could make," Zuby says. "Just like humans might ignore something unexpected and unclear moving toward their path, Uber's system didn't react despite sophisticated sensors and artificial intelligence. That's unacceptable. To be better than human drivers, automated systems have to make safer choices."

Deadly inattention

In the absence of intervention from an autobrake system with pedestrian detection, an alert human driver might have been able to slow the vehicle enough to reduce the severity of the crash.

In the dash cam video, the pedestrian appears about 80 feet before impact. A best-case driver reaction time is about one second. At 40 mph, the XC90 would have traveled roughly 60 feet during that second, leaving about 20 feet of braking distance before reaching the pedestrian. Braking over that distance would have cut the SUV's speed by roughly 10 mph, to an impact speed of about 30 mph. The driver still would have struck the pedestrian, but the lower speed might have improved her chances of survival.
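The arithmetic behind that estimate can be sketched out. The snippet below uses the 80-foot sighting distance and 1-second reaction time from the paragraph above, plus a hard-braking deceleration of about 0.9 g, which is an illustrative assumption rather than a figure from the article.

```python
import math

MPH_TO_FPS = 5280 / 3600  # feet per second for each mph

speed_mph = 40            # approximate travel speed
sight_distance_ft = 80    # distance at which the pedestrian appears on video
reaction_time_s = 1.0     # best-case driver reaction time
decel_ftps2 = 0.9 * 32.2  # assumed hard-braking deceleration (~0.9 g)

v0 = speed_mph * MPH_TO_FPS                               # ~59 ft/s
reaction_distance = v0 * reaction_time_s                  # ~60 ft before braking starts
braking_distance = sight_distance_ft - reaction_distance  # ~20 ft left to brake

# v^2 = v0^2 - 2*a*d, clamped at zero if the vehicle could stop in time
v_impact_fps = math.sqrt(max(v0**2 - 2 * decel_ftps2 * braking_distance, 0))
print(f"Estimated impact speed: {v_impact_fps / MPH_TO_FPS:.0f} mph")  # roughly 30 mph
```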

The video shows the Uber operator looking down for approximately 9½ of the video's 12½ seconds. At 40 mph, the SUV covered about 750 feet during that span. The driver was distracted for the entire final 5½ seconds, or about 330 feet, before impact.
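Those distances follow directly from speed multiplied by time; a minimal check, assuming a constant 40 mph, lands just below the article's rounded figures.

```python
MPH_TO_FPS = 5280 / 3600     # feet per second for each mph
speed_fps = 40 * MPH_TO_FPS  # about 59 ft/s

print(f"12.5 s of video: about {speed_fps * 12.5:.0f} ft")  # ~730 ft, near the 750 ft cited
print(f"final 5.5 s: about {speed_fps * 5.5:.0f} ft")       # ~320 ft, near the 330 ft cited
```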

Experimental studies have shown that drivers can lose track of what automated systems are doing, fail to notice when something goes wrong and have trouble retaking control. Some companies require two operators in autonomous vehicles undergoing testing on public roads. Uber had used two operators in earlier road tests but had recently scaled back to one.

"Looking at the center stack takes the operator's eyes off the road," Zuby points out. "Two operators in the vehicle might have allowed one to monitor the system computer and the other to focus on the road ahead."

SIDEBAR
Why good headlights matter

Some of the sensors used by the Uber vehicle that struck and killed a pedestrian rely on ambient light, just like human eyes.
