Fatal Tesla crash highlights risk of partial automation

August 7, 2018

A Tesla Model X struck a barrier in Mountain View, Calif., on U.S. Highway 101 where lanes diverge. The driver had used "Autopilot" for nearly 19 minutes before his fatal crash.
Photo courtesy of S. Engleman

The deadly crash of a Tesla Model X on a Mountain View, Calif., highway in March demonstrates the operational limits of advanced driver assistance systems and the perils of trusting them to do all of the driving, even though they can't.

The driver, Walter Huang, had used the "Autopilot" feature continuously in the final 18 minutes and 55 seconds before his car crashed into a highway divider, the National Transportation Safety Board (NTSB) stated in its preliminary report.

The system gave Huang two visual alerts and one auditory alert to place his hands on the wheel during this period. In the final 6 seconds before impact, his hands weren't detected on the wheel, and the Tesla didn't make any emergency braking or steering maneuvers to avert the crash.

The Model X had been following a lead vehicle and traveling in the second lane from the left at about 65 mph 8 seconds before the crash, the NTSB report states. Traffic-Aware Cruise Control was set to 75 mph on the 65-mph highway. At 7 seconds out, the SUV began a left steering movement into the paved gore area dividing the main travel lane from an exit ramp. At 4 seconds out, the Tesla was no longer following the lead vehicle. At 3 seconds out, the SUV accelerated from 62 mph to 70.8 mph before slamming into the barrier at about 71 mph. The Model X rotated counterclockwise, collided with two other cars and caught fire. Huang died of his injuries.

The circumstances are similar to a September 2017 single-vehicle crash in Hayward, Calif., involving a Model S operating on Autopilot. The car struck a lane-separating divider on U.S. Highway 92 and sustained damage similar to what occurs in the IIHS passenger-side small overlap front crash test. The driver was uninjured.

IIHS test drives of the Model S on public roads suggest Autopilot may be confused by lane markings and road seams where the highway splits.

The first fatal crash in the U.S. of a Tesla in "Autopilot" mode occurred in Florida in May 2016. Neither the Model S (above) nor the driver braked for a tractor-trailer crossing the car's path. (Photo: Florida Highway Patrol investigators)
Several other Tesla crashes have made headlines, including one in Laguna Beach, Calif.

David Aylor, IIHS manager of active safety testing, has logged many miles in a Model S in Autopilot mode. He has observed instances in which the car lost track of lane markings and began to drift or even attempt to run off the road before he intervened. The car has crossed lines without warning the driver to take over.

Aylor points to one YouTube video by a Chicago area driver who filmed himself on a freeway in a Model S with Autopilot engaged. The driver abruptly drops his phone as his car is about to plow into a median barrier as the roadway splits, just as the freeway does in the Mountain View crash.

"For human drivers, road splits like these can be tricky to maneuver," Aylor says. "In this case, Autopilot was controlling the vehicle and it proved no better at avoiding the same mistakes human drivers might make."

IIHS engineers have observed similar issues with Level 2 systems from other manufacturers. These systems are intended for use on limited-access highways with no at-grade intersections.

Some systems can "read" speed limit signs and adjust speeds accordingly, but they aren't programmed to respond to traffic signals. While all Level 2 systems control speed in free-flowing traffic, they vary in their ability to slow or stop smoothly when encountering much-slower moving or stopped traffic.

Other manufacturers' Level 2 vehicles likely have been involved in crashes while drivers were using advanced driver assistance features, but none has grabbed headlines the way Tesla's crashes have.

Since the first fatal crash of a Tesla operating on Autopilot in Florida in May 2016, in which a Model S struck a tractor-trailer turning into the car's path, there have been several other high-profile Tesla crashes.

In a May crash in Utah, a Model S driver struck the back of a firetruck stopped at a red light without slowing down.

The driver, who sustained a broken ankle, told police that "she had been using the 'Autopilot' feature in the Tesla" and "admitted that she was looking at her phone prior to the collision," the South Jordan Police Department said in a statement.

Police Sergeant Samuel Winkler added this caution: "As a reminder for drivers of semi-autonomous vehicles, it is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times."

It is good advice for any driver, but especially one who may be lulled into a false sense of security by automated systems that appear to handle parts of the driving task with ease but can quit at any moment.

On May 29, a Tesla operating in Autopilot mode struck a parked police department SUV on Laguna Canyon Road in Laguna Beach, Calif. The Tesla driver sustained minor injuries, local police reported.

The crash occurred in a marked exit lane where vehicles also park. Confusing lane markings may have come into play.
