Uber’s autonomous vehicle radar detected pedestrian Elaine Herzberg more than five seconds before the SUV struck her, according to a new report from the National Transportation Safety Board. Unfortunately, a series of poor software design decisions prevented the software from taking any braking action until 0.2 seconds before the fatal crash in Tempe, Arizona.
Herzberg’s death occurred in March 2018, and the NTSB released its initial report on the case in May of that year. This report made it clear that poorly written software, not faulty hardware, was responsible for the crash that killed Herzberg.
But the new report, released on Tuesday, marks the end of the NTSB’s 20-month investigation. It provides a lot more detail on how Uber’s software works and how it all went wrong in the final seconds before the crash that killed Herzberg.
A chronology of classification errors
Like most self-driving software, Uber’s software tries to classify every object it detects into one of several categories, such as car, bicycle, or “other”. Then, based on this classification, the software calculates a speed and a probable trajectory for the object. This system failed catastrophically in Tempe.
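As a rough sketch of this classify-then-predict design (all names here are illustrative assumptions, not Uber's actual code, which is not public), the pipeline might look like:

```python
from dataclasses import dataclass

# Hypothetical sketch of a classify-then-predict pipeline.
CLASSES = ("car", "bicycle", "pedestrian", "other")

@dataclass
class Detection:
    position: tuple  # (x, y) in meters
    velocity: tuple  # (vx, vy) in meters per second

def predict_trajectory(det: Detection, label: str,
                       horizon_s: float = 5.0, steps: int = 5):
    """Extrapolate future positions. In a real system the class label
    would select a motion model (cars follow lanes, bikes move slower,
    etc.); this toy version uses constant-velocity extrapolation."""
    dt = horizon_s / steps
    x, y = det.position
    vx, vy = det.velocity
    return [(x + vx * dt * i, y + vy * dt * i) for i in range(1, steps + 1)]

det = Detection(position=(0.0, 0.0), velocity=(1.5, 0.0))  # walking pace
path = predict_trajectory(det, "other")
print(path[-1])  # predicted position 5 seconds out
```

The key point is that the predicted path depends on the classification, which is why a misclassified object gets a wrong trajectory.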
The NTSB report includes a second-to-second timeline showing what the software “thought” as it approached Herzberg, who was pushing a bicycle down a multi-lane road away from any crosswalks:
- 5.2 seconds before impact, the system classified it as an “other” object.
- 4.2 seconds before impact, it was reclassified as a vehicle.
- Between 3.8 and 2.7 seconds before impact, the classification alternated several times between “vehicle” and “other”.
- 2.6 seconds before impact, the system classified Herzberg and her bicycle as a bicycle.
- 1.5 seconds before impact, it became “unknown”.
- 1.2 seconds before impact, it became a “bike” again.
Two things are remarkable about this sequence of events. First, at no point did the system classify her as a pedestrian. According to the NTSB, that is because “the system design did not take into account jaywalking pedestrians.”
Second, the constantly changing classifications prevented Uber’s software from accurately calculating her trajectory and realizing that she was on a collision course with the vehicle. You might think that if an autonomous driving system sees an object moving into the path of the vehicle, it would brake even if it didn’t know what kind of object it was. But that’s not how Uber’s software worked.
The system used the previously observed locations of an object to help calculate its speed and predict its future trajectory. However, “if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories,” the NTSB reports.
This meant in practice that, because the system couldn’t tell what kind of object Herzberg and his bike were, the system acted as if she wasn’t moving.
From 5.2 to 4.2 seconds before the crash, the system classified Herzberg as a vehicle and decided she was “static”, meaning she was not moving and therefore unlikely to move into the path of the car. A little later, the system recognized that she was moving but predicted she would stay in her current lane.
When the system reclassified her as a bicycle 2.6 seconds before impact, it again predicted that she would stay in her lane, a much easier mistake to make once the previous location data had been deleted. At 1.5 seconds before impact, she became an “unknown” object and was once again classified as “static”.
It wasn’t until 1.2 seconds before the crash, as she began to pull into the lane of the SUV, that the system realized an accident was imminent.
“Action suppression”
At this point it was probably too late to avoid a collision, but slamming the brakes could have slowed the vehicle enough to save Herzberg’s life. This is not what happened. The NTSB explains why:
“When the system detects an emergency situation, it initiates action suppression. This is a one-second period during which the [automated driving system] suppresses planned braking while the system verifies the nature of the detected hazard and calculates an alternative path, or the vehicle operator takes control of the vehicle.”
The NTSB says that, according to Uber, the company “implemented action suppression because of concerns that its developmental automated system might identify false alarms, causing the vehicle to perform unnecessary extreme maneuvers.”
As a result, the vehicle did not start applying the brakes until 0.2 seconds before the fatal crash, far too late to save Herzberg’s life.
Even after this one-second delay, according to the NTSB, the system did not necessarily apply the brakes at full force. If a collision could be avoided with hard braking, the system braked hard, up to a fixed maximum deceleration. But if a crash was unavoidable, the system applied less braking force, initiating a “gradual vehicle slowdown” while alerting the operator to take over.
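The behavior described above, a one-second suppression window followed by either a capped hard brake or a gentler slowdown with an operator alert, can be sketched as follows. All names and numeric values here are assumptions for illustration, not values from Uber's software:

```python
SUPPRESSION_WINDOW_S = 1.0  # braking suppressed while the hazard is verified
MAX_DECEL_MPS2 = 7.0        # assumed hard-braking cap; the real value is not public

def braking_command(time_since_hazard_s, speed_mps, distance_m):
    """Toy model of the reported decision logic."""
    if time_since_hazard_s < SUPPRESSION_WINDOW_S:
        return ("suppress", 0.0)      # action suppression: no braking yet
    # deceleration needed to stop exactly at the hazard: v^2 / (2 d)
    needed = speed_mps ** 2 / (2 * distance_m)
    if needed <= MAX_DECEL_MPS2:
        return ("hard_brake", needed)  # crash avoidable: brake hard, up to the cap
    # crash unavoidable: brake more gently and alert the operator
    return ("gradual_brake_and_alert", MAX_DECEL_MPS2 * 0.5)

print(braking_command(0.5, 17.0, 20.0))  # inside the suppression window: no braking
print(braking_command(1.2, 17.0, 20.0))  # 17^2/(2*20) = 7.2 m/s^2 exceeds the cap
```

The perverse consequence of this design is visible in the second call: precisely when the situation is most dire, the sketched system brakes *less*, mirroring what the NTSB found.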
A 2018 report by Julie Bort of Business Insider suggested a possible reason for these puzzling design decisions: the team was preparing a demonstration for Uber’s recently hired CEO, Dara Khosrowshahi. Engineers were urged to reduce the number of “bad experiences” for riders. Shortly thereafter, Uber announced that it was “disabling the car’s ability to make emergency decisions on its own, such as slamming the brakes or swerving hard.”
The swerves were eventually reactivated, but the restrictions on hard braking remained in place until the fatal accident in March 2018.
The Uber vehicle was a Volvo XC90 equipped with a sophisticated built-in emergency braking system. Unfortunately, before the 2018 crash, Uber automatically disabled Volvo’s collision-avoidance system whenever Uber’s own technology was active. One reason for this, the NTSB said, was that Uber’s experimental radar used some of the same frequencies as Volvo’s radar, creating a potential for interference.
Since the crash, Uber has redesigned its radar to operate at different frequencies than Volvo’s radar, allowing Volvo’s emergency braking system to stay on while Uber tests its own autonomous driving technology.
Uber also says it has redesigned other aspects of its software. There is no longer an “action suppression” period before braking in an emergency. And the software no longer discards past location data when an object’s classification changes.
Update: An Uber spokesperson sent us the following comment.
We regret the March 2018 crash involving one of our self-driving vehicles that took Elaine Herzberg’s life. In the aftermath of that tragedy, the team at Uber ATG adopted critical program improvements to further prioritize safety. We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once they are released after the NTSB’s board meeting later this month.