Tesla's Autopilot Was Involved in Another Deadly Car Crash

The automaker says its semi-autonomous system was engaged when a Model X SUV hit a freeway barrier last week in California, killing the driver.

Tesla now has another fatality to hang on its semi-autonomous driving system. The company just revealed that its Autopilot feature was turned on when a Model X SUV slammed into a concrete highway lane divider and burst into flames on the morning of Friday, March 23. The driver, Wei Huang, died shortly afterwards at the hospital.

This is the second confirmed fatal crash on US roads in which Tesla’s Autopilot system was controlling the car. It raises now-familiar questions about this novel and imperfect system, which could make driving easier and safer but relies on constant human supervision.

In a blog post published this evening, Tesla says the logs in the car’s computer show Autopilot was on, with the adaptive cruise control follow distance set to the minimum. The car stays in its lane and keeps a fixed distance from the vehicle ahead, but the driver is supposed to keep his hands on the wheel and monitor the road, too. Take your hands off the wheel for too long, and you get a visual warning on the dashboard. Ignore that, and the system will get your attention with a beep. If you’re stubborn or incapacitated, the car will turn on its flashers and slow to a stop.
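
Tesla hasn’t published the exact timing of that escalation (it varies with speed and road conditions), but the behavior it describes amounts to a simple state machine. Here is a minimal sketch; the thresholds, names, and structure are hypothetical, not Tesla’s actual implementation:

```python
# Illustrative sketch of the hands-off warning escalation described above.
# All thresholds and names are hypothetical; Tesla has not published its
# logic, and the real timeouts vary with speed and road conditions.

from enum import Enum, auto

class AlertState(Enum):
    NONE = auto()
    VISUAL_WARNING = auto()    # dashboard prompt to retake the wheel
    AUDIBLE_WARNING = auto()   # beep if the visual prompt is ignored
    SAFE_STOP = auto()         # hazard flashers on, slow to a stop

# Hypothetical thresholds, in seconds of continuous hands-off driving.
VISUAL_AT = 30.0
AUDIBLE_AT = 45.0
STOP_AT = 60.0

def escalation_state(hands_off_seconds: float) -> AlertState:
    """Map time since hands were last detected on the wheel to an alert level."""
    if hands_off_seconds >= STOP_AT:
        return AlertState.SAFE_STOP
    if hands_off_seconds >= AUDIBLE_AT:
        return AlertState.AUDIBLE_WARNING
    if hands_off_seconds >= VISUAL_AT:
        return AlertState.VISUAL_WARNING
    return AlertState.NONE
```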

Based on data pulled from the wrecked car, Tesla says Huang should have had about five seconds and 150 meters of unobstructed view of the concrete barrier before the crash. His hands were not detected on the wheel for the six seconds prior to impact. Earlier in the drive, he had been given multiple visual warnings and one audible warning to put his hands back on the wheel.
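
Those two figures are consistent with each other and with freeway speed, as a quick back-of-the-envelope check (using only the numbers in Tesla’s post) shows:

```python
# Back-of-the-envelope check of the figures in Tesla's post:
# 150 meters of clear view over about five seconds implies freeway speed.
distance_m = 150.0      # unobstructed view of the barrier, per Tesla
time_s = 5.0            # time available before impact, per Tesla

speed_ms = distance_m / time_s          # 30.0 m/s
speed_mph = speed_ms * 2.23694          # ~67 mph

print(f"{speed_ms:.0f} m/s ≈ {speed_mph:.0f} mph")
```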

The car’s manual reminds Tesla drivers that Autopilot is a driver assistance tool, not a replacement for the driver, and that they retain responsibility for driving safely. (The big center screen conveys the same message when you engage Autopilot for the first time.) But critics say the ease with which Tesla’s system handles regular freeway driving can lull drivers into thinking it’s more capable than it is, allowing them to become distracted or take their eyes off the road.

Drivers need to be ready to grab the wheel if lane markings disappear or lanes split; a lane split may have been a contributing factor in this crash. Systems like Autopilot have known weaknesses: Tesla’s manual warns that Autopilot may not see stationary objects, a shortcoming highlighted when a Tesla slammed into a stopped firetruck near Los Angeles in January. Such systems are designed to discard radar data about things that aren’t moving, to prevent false alarms for every overhead gantry or street-side trash can.
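
To see why a stopped object gets filtered out: something stationary in the world closes on the car at exactly the car’s own speed, which makes it look like the same clutter as a gantry or a sign. A minimal sketch of that kind of filter, with invented names and an invented threshold (real automotive radar trackers are far more sophisticated):

```python
# Minimal sketch of why radar-based cruise systems can ignore stopped objects.
# The dataclass, names, and threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float           # distance to the detected object
    closing_speed_ms: float  # how fast the gap is shrinking

STATIONARY_TOLERANCE = 1.0   # m/s; returns this close to "road-stationary" get dropped

def targets_to_track(returns: list[RadarReturn], ego_speed_ms: float) -> list[RadarReturn]:
    """Keep only returns that appear to be moving relative to the road.

    An object stationary in the world closes at exactly the car's own speed,
    so it is indistinguishable from an overhead gantry or roadside clutter
    and gets filtered out -- including, fatally, a stopped vehicle or barrier.
    """
    return [
        r for r in returns
        if abs(r.closing_speed_ms - ego_speed_ms) > STATIONARY_TOLERANCE
    ]
```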

Autopilot was first enabled on Tesla’s cars, via an over-the-air software update, in October 2015. The system combines radar-controlled cruise control with automatic steering that keeps the car within painted lane lines. The first person known to die while using Autopilot was Joshua Brown, whose Model S crashed into a truck that turned across his path in Florida in May 2016. Neither he nor the car’s computers saw the white truck against the bright sky.

Federal investigators pored over that crash site and the vehicle logs, as they are now doing with this second fatality. The National Highway Traffic Safety Administration concluded that the system had been operating as intended, that it wasn’t defective, and that Tesla didn’t need to recall any cars. The crash, in other words, was Brown’s fault. The agency went further, saying that crashes dropped 40 percent in Tesla cars equipped with the Autosteer feature.

The National Transportation Safety Board was more damning, saying Tesla should bear some of the blame for selling a system that is too easy to misuse.

After Brown’s death, Tesla modified Autopilot to rely more on data from its radar, and less on the camera, to spot obstacles in the car’s path. It also sent out a software update that sharply curtailed the length of time a driver can let go of the wheel, and introduced brighter, flashing warnings. That length of time varies according to speed and road conditions, but can still be a few minutes.

Autopilot was groundbreaking when Tesla introduced it, and Elon Musk promises his cars are capable of even more, from changing lanes on their own to full self-driving. Other luxury carmakers have introduced similar systems with varying restrictions, and with far more modest promises. Cadillac’s Super Cruise, for example, uses an infrared camera to monitor the driver’s head position (so it knows when the driver is looking at the road) instead of relying on torque sensors in the steering wheel.

The federal investigations into Huang’s crash are ongoing and may not produce final reports for a year or more; the NTSB typically takes 12 to 18 months to finalize and publish its findings. In the meantime, Tesla used its blog post to point out some unusual circumstances of this accident. The barrier that Huang hit was supposed to be fronted by a crash attenuator, which crumples to absorb some of the impact. But the attenuator had been crushed in a previous accident and never replaced, the company says. “We have never seen this level of damage to a Model X in any other crash,” the blog post reads.

Coupled with Uber’s fatal crash in Arizona, in which one of its self-driving cars hit and killed a pedestrian pushing a bike, this incident marks the beginning of what is likely to be a difficult time for the autonomous vehicle industry. Engineers are convinced that taking the easily distracted human out of the driving equation will cut into the roughly 40,000 deaths on American roads each year. But right now, the systems aren’t sophisticated enough to operate without human oversight, and that oversight is difficult to ensure. That leaves everyone in an uncomfortable middle ground, a no-man’s-land with no obvious or immediate route out.

