This marks the 14th time Google's autonomous cars have been involved in a crash in the past six years. Six of those accidents have occurred in the past five months. It all sounds frightening for motorists already leery about the day steering wheels will no longer be installed in their cars – and according to a University of Michigan survey, that's just about everybody.
What's not emphasized in the wave of media consternation is the latest crash also marks the 14th time a human driver has been responsible for causing an accident with the Google self-driving cars. That's right. Google's autonomous technology has been responsible for zero of the 14 accidents. In 13 of the accidents, drivers of the other cars involved were at fault. In the remaining one, a human Google driver rear-ended another car while operating in manual mode.
In the most recent crash, a Google Lexus SUV stopped at an intersection behind two other cars. It was rear-ended by another car traveling approximately 17 miles per hour, according to a report filed with the California Department of Motor Vehicles. This isn't an isolated case for Google. In 11 of the 14 crashes, the company's cars have been rear-ended.
Chris Urmson, director of Google's self-driving car program, sees those numbers as validation that autonomous technology is needed. "The clear theme is human error and inattention," he wrote Thursday on Medium. "We'll all take this as a signal that we're starting to compare favorably with human drivers."
He goes on to underscore a key Google talking point: the company believes rear-end crashes are underreported across the country. Citing a report from the National Highway Traffic Safety Administration, he says rear-end crashes may account for 55 percent of all traffic accidents. (NHTSA's official statistics paint a more conservative picture – rear-end crashes comprised 32.2 percent of reported crashes in 2013, the latest year for which data was available.)
Why are the 25 Google autonomous cars active on public roads prone to getting rear-ended? That's a worthwhile question. Maybe Urmson's right, and these types of fender-benders go vastly underreported. Maybe the fact that Google cars follow the letter of traffic law while most drivers don't complicates the way humans and algorithms interact on the road. Maybe the novelty of seeing a Google car on the road is in and of itself a distraction. We don't know the answer yet.
But instead of exploring those possibilities, much of the media coverage thus far has been little more than a rote accounting of each and every human-caused accident involving a self-driving car. There's a two-fold problem with that: it creates and reinforces the fears regular drivers have expressed about autonomous technology, and it completely ignores the greater problem. In 2013 alone, human drivers caused 1,806 fatalities in 1,831,000 rear-end crashes.
No one is advocating autonomous cheerleading. Scrutiny of self-driving cars is needed, and we've pointed out in the past that Google needs to be more transparent in releasing accident reports and describing how it plans to protect – or use – vehicle data. But scrutiny shouldn't come without context. Which should scare drivers more: the autonomous technology that hasn't rear-ended anyone in more than 1 million miles of testing, or our fellow human drivers?
The focus on the accidents without context is reminiscent of the hysteria surrounding a fire that occurred in a Chevy Volt lithium-ion battery three weeks after a government-sponsored crash test in 2011. Although no real-world example of that crash-then-fire sequence was ever recorded and not a single motorist was actually harmed by a Volt battery fire, NHTSA opened an investigation of the batteries. Breathless media coverage ensued, stoking imagined fears of Chevy Volt infernos around every corner.
Coverage largely ignored the greater problem – the 152,300 annual car fires and the 209 people typically killed by them. In the end, an exhaustive 135-page report from NHTSA concluded that "no discernible defect trend exists" and safety officials believed the Volt posed no greater risk for post-crash fires than cars driving around with gallons of highly flammable gasoline in their tanks. But the damage was done to the Volt's reputation, and you could argue the incendiary coverage played a key part in derailing Chevy's hopes for annually selling tens of thousands of Volts. The Volt never recovered.
It'd be a shame for autonomous technology to receive the same misguided treatment. Much in the same way the Volt could have transformed fuel efficiency, self-driving cars could transform traffic safety. Human error is responsible for more than 90 percent of all car crashes, according to NHTSA, and self-driving cars present the real possibility of preventing 30,000 deaths and hundreds of thousands of injuries every year in America alone.
We're still awaiting the first autonomous-caused car accident on public roads. That will be a day of real news.