
23 March 2018

Uber’s fatal pedestrian accident triggers watershed moment

A 49-year-old woman, Elaine Herzberg, was struck and killed by a Volvo XC90 from Uber's self-driving test fleet in Tempe, Arizona, at around 10pm on Sunday. Herzberg was hit as she crossed the road, outside a dedicated crosswalk, according to Tempe police. Uber has suspended its fleet pending investigations into what is understood to be the first third-party death attributed to a self-driving car.

The car was being driven in autonomous mode, with a test driver behind the wheel ready to intervene; however, neither the car nor the driver avoided the collision. Tempe police have said that Herzberg emerged suddenly from the shadows, and that on reviewing the video footage, "it's very clear it would have been difficult to avoid this collision in any kind of mode."

It seems that Herzberg was at fault, although the car was traveling at 38mph in a 35mph zone, and neither the car nor the driver attempted to brake. The darkness will have hampered the test driver's view of the road, and because Herzberg was pushing a bicycle laden with bags or packages, there's a good chance the car's machine-vision systems could not classify what they were looking at (even if there had been time to take action).

However, video released by the police shows that while there was not enough time to avoid the collision, the Uber driver was consistently looking away from the road in the moments leading up to the crash. It is not clear what the driver was looking at – a control panel for the car, or a mobile phone – but it does not paint the driver in a good light. Similarly, it is not clear why none of the sensors on the vehicle spotted Herzberg as she walked across an otherwise empty road; they should have detected her.

Following the first fatal crash in a Tesla operating in its Autopilot mode – with strong evidence that the driver had ignored the safety instructions to keep their eyes on the road – there were calls for much stricter safety regulations; a second, similar death occurred in China too.

Not much actually changed in terms of legislation, but now that an apparently properly operated self-driving car has been involved in a fatal incident, this looks to be a watershed moment for the automotive industry. Uber was cleared of any wrongdoing in another Arizona crash last year, and rivals including Google and GM have been involved in other crashes.

For context, 37,461 people were killed in traffic incidents in the US in 2016, including around 6,000 pedestrians – the latter up 9% year-on-year, a rise thought to be attributable to smartphone usage. Globally, around 1.2m people are killed in traffic incidents annually. To date, in around four years of testing, this is the third known death attributable to a self-driving car; the two Tesla deaths seem attributable to driver inattention, and might have been avoided had the drivers been watching the road properly.

So Uber will be under intense scrutiny here, only a month after coming out of the strange Uber-Waymo lawsuit – settled abruptly, with only a $250m equity payment to Waymo, after all the hot air about theft of trade secrets. Uber has also played fast and loose with permits, having been ordered out of San Francisco by the California DMV in 2016 for not seeking the correct exemptions.

While there are no signs of wrongdoing by Uber so far, its current public image won’t do it any favors if a regulator decides to dole out punishments of some sort. In contrast, Waymo and the automakers like GM and Audi have squeaky-clean records when it comes to their self-driving programs.

Therefore, the industry is at a watershed moment, comparable to the Columbia disaster that effectively ended the US Space Shuttle program, or the Air France Flight 4590 crash that grounded Concorde and triggered the demise of supersonic passenger flight (the only fatal crash in Concorde's 27-year operational history).

It is unclear what sort of regulatory action could be pursued if Uber were found to be at fault. A lawsuit from Herzberg's family or health insurance provider is a likely outcome, but the police have been clear in their statements that Uber was not at fault, and Uber appears to have been operating well within the current legal requirements for testing – so it should be able to challenge any punishment on those grounds.

The California DMV publishes the disengagement reports from companies testing self-driving vehicles in the state. These reports provide a benchmark for the current performance of these vehicles, and while the vehicles often require manual intervention (when the autonomous system disengages and hands control back to the test pilot), their collective safety record is impressive.

In 2016, the US figure for fatalities per 100 million miles driven was 1.18. Waymo alone has driven over 5m miles with its fleet and has not been involved in a fatal incident – although its disengagement rate is one per 5,956 miles driven. Uber has traveled over 2m miles too, and Tesla claims that its Autopilot mode has covered over 222m miles. The lingering question is the potential severity of these disengagement events had there not been a human ready to intervene.
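For a sense of scale, a quick back-of-the-envelope calculation (an illustrative sketch only, using the figures quoted in this article; note that the 1.18 rate is per 100 million vehicle miles, the denominator NHTSA uses):

```python
# Back-of-the-envelope comparison using only the figures quoted in
# this article. Not an actuarial analysis.

US_FATALITIES_2016 = 37_461        # total US traffic deaths, 2016
RATE_PER_100M_MILES = 1.18         # fatalities per 100m vehicle miles

# Implied total miles driven in the US in 2016
total_us_miles = US_FATALITIES_2016 / RATE_PER_100M_MILES * 100e6
print(f"Implied US miles driven: {total_us_miles / 1e12:.2f} trillion")

# Deaths that would be expected over Waymo's ~5m autonomous miles
# if its cars were only as safe as the average human driver
waymo_miles = 5e6
expected = waymo_miles / 100e6 * RATE_PER_100M_MILES
print(f"Expected deaths at the human rate over 5m miles: {expected:.3f}")
```

Even at the human-driver rate, fewer than 0.1 deaths would be expected across Waymo's 5m miles – the autonomous fleets simply have not driven enough miles yet for their zero-fatality records to be statistically meaningful.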

It is not clear from the DMV data whether the average disengagement event represented the risk of a collision – a simple bumper scrape, a mild fender-bender, or a potential T-bone crash. Anecdotal evidence suggests that the cars often disengage at things that would not bother a human driver, and often for reasons that the test pilot can't comprehend. The systems are collectively very risk-averse, which is of course a good thing at this stage.

But moving forward, these disengagements will be worked on to remove the need for the handover. This will require a careful balancing of tolerances and risk until automakers and regulators are happy with the vehicles' decision-making – but it likely means more near misses and crashes.

There will be more injuries and deaths as the self-driving industry moves into the launch phase for these commercial offerings, with full autonomy being pegged for 2020. That might be an optimistic estimate, given the length of automotive development pipelines, but it seems that the technology available today is not far from a tolerable capability.