
The National Transportation Safety Board has released a preliminary report on the fatal Arizona collision between an Uber self-driving car and a pedestrian walking a bicycle.
The report is short and easy to read, so I’ll embed it here and post some follow-up thoughts below.
https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf
First and foremost, every vehicle fatality is a tragedy. Elaine Herzberg, the pedestrian killed in this collision, was a mother, a daughter, and a sister. Over 1 million automotive fatalities occur worldwide every year, 37,000 of them in the United States. In many cases, including this one, the tragedy is borne by both the deceased and the survivors.
While Uber ATG is a Udacity partner, I have not spoken with them about this incident and thus have no inside knowledge of what happened. The Uber ATG engineers are some of the best in the business. Uber has logged more self-driving miles than any company except Waymo. This is an important collision to study because it provides a window onto how top-notch engineers design an autonomous system, and what, if any, failures might still occur.
Parts of the system seem to have functioned exactly as intended, while other parts seem to have failed, although the nature of those failures is not yet clear.
Classification
The report states:
“…the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.”
It makes sense that Herzberg was picked up by radar and lidar, but not by cameras. In the video of the collision, the environment appears too dark for the cameras to see much of anything.
Cameras are the best sensor for classifying objects, and in the absence of camera data, it makes sense that the rest of the sensor system had difficulty classifying Herzberg. This is especially true because Herzberg was engaging in unusual behavior: walking a bicycle across the street, mid-block, outside of a crosswalk. That behavior is not super-rare, but it is less common than a bicycle riding down the street, a bicycle being walked through an intersection, or a pedestrian jaywalking without a bicycle. The combination of missing camera data and unusual behavior was presumably the driving factor behind the difficulty in classifying the object.
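To make that concrete, here is a minimal sketch, in Python, of one common fusion approach: averaging per-sensor class probabilities. Everything here (the class names, the probabilities, the averaging scheme) is an illustrative assumption of mine, not Uber’s actual pipeline, but it shows how losing the camera leaves the remaining sensors with thin margins between classes, consistent with the label flickering among “unknown object,” “vehicle,” and “bicycle” that the report describes.

```python
# A minimal sketch of probability-averaging sensor fusion. The class names
# and per-sensor probabilities are illustrative assumptions, not Uber's
# actual system.

SENSOR_BELIEFS = {
    # Radar and lidar give shape and motion cues but weak semantic labels.
    "radar":  {"unknown": 0.40, "vehicle": 0.35, "bicycle": 0.25, "pedestrian": 0.00},
    "lidar":  {"unknown": 0.30, "vehicle": 0.30, "bicycle": 0.30, "pedestrian": 0.10},
    # The camera normally supplies the strong semantic signal.
    "camera": {"unknown": 0.00, "vehicle": 0.00, "bicycle": 0.10, "pedestrian": 0.90},
}

CLASSES = ("unknown", "vehicle", "bicycle", "pedestrian")

def fuse(beliefs):
    """Average the available sensors' class probabilities and renormalize."""
    fused = {c: sum(b[c] for b in beliefs.values()) for c in CLASSES}
    total = sum(fused.values())
    return {c: p / total for c, p in fused.items()}

def top_class(fused):
    return max(fused, key=fused.get)

# With the camera, the pedestrian class dominates.
with_camera = fuse(SENSOR_BELIEFS)
print(top_class(with_camera))  # -> pedestrian

# Without the camera (too dark), the top classes are nearly tied, so small
# frame-to-frame noise can flip the label among unknown/vehicle/bicycle --
# the flickering classification the report describes.
no_camera = fuse({k: v for k, v in SENSOR_BELIEFS.items() if k != "camera"})
print(top_class(no_camera))  # -> unknown, by a thin margin over vehicle/bicycle
```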
Prediction
These classification difficulties are important because the vehicle’s prediction system relies on classification to predict what other objects on the road will do next. Imagine, for example, that a self-driving car classifies an object in the next lane as a cyclist riding with the flow of traffic. In that case, the prediction system is likely to predict that the cyclist will continue riding in that lane, with the flow of traffic, and that it is therefore safe for the self-driving vehicle to proceed and pass in the adjacent lane.
On the other hand, if the object is classified as a cyclist crossing the street, the prediction system may well predict that the future behavior of the cyclist will be to cross into the self-driving vehicle’s lane. Thus it is not safe to proceed.
The absence of an accurate classification of Herzberg as a pedestrian, while understandable, made it difficult to predict what she would do next.
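To illustrate how tightly prediction is coupled to classification, here is a toy Python sketch. The motion models and numbers are my own assumptions for illustration, not anything from the report: the same observed position and velocity produce different predicted paths, and therefore different planner decisions, depending solely on the class label.

```python
# A toy sketch of class-conditioned prediction. The motion models and numbers
# are illustrative assumptions, not from the report or Uber's system.

def predict_position(obj_class, position, velocity, horizon_s=3.0):
    """Crude predicted position after `horizon_s` seconds, by class label."""
    x, y = position    # lane-aligned frame: x = along our lane, y = across it
    vx, vy = velocity
    if obj_class == "cyclist_with_traffic":
        # Model: holds their lane, so only longitudinal motion continues.
        return (x + vx * horizon_s, y)
    if obj_class == "cyclist_crossing":
        # Model: lateral motion continues, carrying the cyclist into our lane.
        return (x + vx * horizon_s, y + vy * horizon_s)
    # Unknown object: no reliable motion model, so no usable forecast.
    return None

# One observation, three labels, three different planning outcomes.
position, velocity = (30.0, 3.5), (4.0, -1.4)  # 3.5 m to our left, drifting toward us

for label in ("cyclist_with_traffic", "cyclist_crossing", "unknown"):
    print(label, "->", predict_position(label, position, velocity))
# cyclist_with_traffic -> (42.0, 3.5): stays out of our lane, safe to proceed
# cyclist_crossing     -> (42.0, -0.7): ends up in our lane, must yield or brake
# unknown              -> None: the planner has nothing firm to act on
```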
Braking
The report goes on to state:
“At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”
This is the crux of the collision, and the description confuses me.
The phrase “emergency braking maneuvers are not enabled while the vehicle is under computer control”, in the context of the report, appears to refer to the Volvo factory-installed emergency braking system. The report states that Uber disabled this system while the vehicle was in self-driving mode. The disablement in and of itself strikes me as reasonable: you don’t necessarily want the brakes controlled simultaneously by two separate safety systems, designed by two different companies operating independently of each other. It would be interesting to know how other third-party self-driving car developers handle this.
However, the report also states that, “the self-driving system determined that an emergency braking maneuver was needed”. This appears to tie the Uber self-driving system to the Volvo emergency braking system, which the Uber system presumably cannot control, and which had been disabled anyway.
The report then goes on to state: “The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.” The juxtaposition of those two sentences is hard to comprehend, and makes me believe there must be more to the story. I am curious to read further reporting on how Uber designed and anticipated both emergency braking and intervention by vehicle operators.
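It can help to transcribe the report’s three sentences into code to see why the juxtaposition reads so strangely. The sketch below is my paraphrase of the report’s wording as a decision function; the structure and names are mine, not Uber’s software, and the manual-mode branch is my assumption based on the report’s statement that the Volvo system was disabled only in self-driving mode.

```python
# The report's description of the emergency-braking handoff, transcribed into
# a decision function. A paraphrase of the report's wording, not Uber's
# actual software.

def emergency_braking_response(under_computer_control: bool) -> dict:
    """What happens when the system decides emergency braking is needed."""
    if under_computer_control:
        return {
            # "emergency braking maneuvers are not enabled while the vehicle
            #  is under computer control, to reduce the potential for erratic
            #  vehicle behavior"
            "apply_emergency_brakes": False,
            # "The vehicle operator is relied on to intervene and take action."
            "rely_on_operator": True,
            # "The system is not designed to alert the operator."
            "alert_operator": False,
        }
    # Under manual control, the factory Volvo system is active (my assumption,
    # inferred from the report's statement about when it was disabled).
    return {"apply_emergency_brakes": True, "rely_on_operator": True, "alert_operator": True}

# 1.3 seconds before impact, the system determined braking was needed:
print(emergency_braking_response(under_computer_control=True))
# -> {'apply_emergency_brakes': False, 'rely_on_operator': True, 'alert_operator': False}
```

Laid out this way, the gap is plain: the operator is the fallback for a 1.3-second window, but receives no signal that the window has opened.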
Vehicle Operator
The same video that shows the collision also includes footage of the vehicle interior. That footage shows the vehicle operator looking down for much of the time leading up to the collision. Speculation at the time of the crash was that the operator had been distracted by a personal phone. However, according to the report:
“the vehicle operator stated that she had been monitoring the self-driving system interface. The operator further stated that although her personal and business phones were in the vehicle, neither was in use until after the crash, when she called 911.”
A few items of note here. The vehicle operator appears to have been tasked with several duties while operating the vehicle:
“the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.”
This may or may not be a problem. Most human drivers are habituated to monitoring vehicle diagnostic messages (e.g., the check engine light) while driving, although it’s not clear how frequent or cognitively taxing the diagnostic messages were in this case. Similarly, “tagging events of interest” could mean anything from simply tapping a button on the steering wheel to looking down and typing notes on a touchscreen keypad, which would be far more distracting.
Elaine Herzberg
The ultimate tragedy in this sequence of events is the death of Elaine Herzberg. Nonetheless, her actions leading up to the collision also raise questions.
“the pedestrian was dressed in dark clothing and that the bicycle did not have any side reflectors. The bicycle had front and rear reflectors and a forward headlamp, but all were facing in directions perpendicular to the path of the oncoming vehicle. The videos show that the pedestrian crossed in a section of roadway not directly illuminated by the roadway lighting.”
Herzberg also tested positive for methamphetamine and marijuana on autopsy, which may have contributed to her decisions and behavior.
At a much higher level, it depresses me that a homeless woman was struck and killed by a self-driving car, one of the most advanced technological innovations humans have ever created. I have been involved with efforts to end homelessness in the San Francisco Bay Area for six years, and this incident in Arizona really brings home the struggle, and failure, to provide for the most vulnerable people in our midst.
Hope
The goal of self-driving car engineers, and the hope of self-driving cars, is to bring automotive fatalities as close to zero as possible.
The silver lining of this collision, such as it is, is that this scenario, or close simulations of it, will go into the test suite of every self-driving car company. Every time the software is updated or improved, it will be checked against this scenario to make sure the vehicle successfully averts this type of collision. Unlike humans, self-driving cars learn from each other, and they get better and better over time.
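Here is a hedged sketch of what such a regression test might look like in a simulator-based suite. The scenario setup, the toy “perception,” and the stand-in vehicle dynamics are all invented for illustration; only the roughly 43 mph speed and the roughly 6-second window come from the report. Real test suites run full simulation stacks rather than anything this simple.

```python
# A hedged sketch of a simulator regression test for this scenario. The toy
# perception and vehicle dynamics are invented for illustration; only the
# ~43 mph speed and ~6 s window are taken from the report.

import unittest

class Pedestrian:
    def __init__(self, x, y, vy):
        self.x, self.y, self.vy = x, y, vy  # crossing laterally at vy m/s

class SimpleCar:
    """Stand-in ego vehicle: brakes once a crossing object nears its path."""
    def __init__(self, speed_mps, decel_mps2=7.0):
        self.x, self.speed, self.decel = 0.0, speed_mps, decel_mps2
        self.braking = False

    def step(self, ped, dt):
        # Toy perception: react to anything within 50 m ahead whose lateral
        # offset is under 8 m and which is moving toward our lane.
        if (ped.x - self.x) < 50.0 and abs(ped.y) < 8.0 and ped.vy < 0:
            self.braking = True
        if self.braking:
            self.speed = max(0.0, self.speed - self.decel * dt)
        self.x += self.speed * dt
        ped.y += ped.vy * dt

class TestDarkMidblockCrossing(unittest.TestCase):
    def test_vehicle_stops_before_crossing_point(self):
        car = SimpleCar(speed_mps=19.2)            # ~43 mph, per the report
        ped = Pedestrian(x=80.0, y=10.0, vy=-1.4)  # walking a bicycle across, mid-block
        for _ in range(600):                       # simulate 6 seconds at 10 ms steps
            car.step(ped, dt=0.01)
        # The regression assertions: ego must come to a stop short of the
        # pedestrian's crossing point.
        self.assertLess(car.x, ped.x)
        self.assertEqual(car.speed, 0.0)

if __name__ == "__main__":
    unittest.main()
```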
Hopefully this particular type of collision never happens again.