
Elon Musk’s worst nightmare, Missy Cummings, is now tormenting Waymo and Cruise too

Her research shows that all autonomous cars are worse drivers than humans, not better

Missy Cummings flew fighter jets for the Navy. Now, as a leading expert on automation and AI, she's taking aim at self-driving cars. Chelsea Jia Feng/Insider

In 2021, an engineer named Missy Cummings drew the ire of Elon Musk on the social network then called Twitter. A professor at Duke University, Cummings had conducted research on the safety of self-driving cars, and the findings led her to issue some stark warnings about Tesla's driver-assistance tech. The cars, she wrote, had "variable and often unsafe behaviors" that required more testing "before such technology is allowed to operate without humans in direct control." On the strength of her research, Cummings was appointed as a senior advisor for safety at the National Highway Traffic Safety Administration, where she would help the agency regulate robot cars.

Tesla fans reacted with their usual equanimity and sense of perspective, by which I mean they absolutely lost it. Their insistence that Cummings would attempt to unfairly regulate their boy Elon soon prompted Musk himself to join the thread. "Objectively," he tweeted, "her track record is extremely biased against Tesla." In response, Musk's stans unleashed their full fury on Cummings — her work, her appearance, her motives. They accused her of conflicts of interest, signed petitions demanding her removal, and emailed death threats.

But the thing is, Musk's bros of war were messing with the wrong engineer. As one of the Navy's first female fighter pilots, Cummings used to fly F/A-18s. (Call sign: Shrew.) She wasn't intimidated by the dick-wagging behavior of a few people on Twitter with anime profile pics. She posted the worst threats on LinkedIn, hired some personal security, and kept right on fighting. "I'm like, are you really going to do this?" she recalls thinking. "I double down. The fighter pilot in me comes out. I love a good fight."

She didn't exactly win that particular engagement. A lot of whinging from Tesla pushed NHTSA to force Cummings to recuse herself from anything involving the company. But you know what they say about any landing you can walk away from. Cummings took a new gig at George Mason University and broadened her research from Tesla to the wider world of all self-driving vehicles. With companies like Cruise and Waymo unleashing fully roboticized taxis on the streets of San Francisco and other cities, the rise of the machines has begun — and Cummings is on the front lines of the resistance. In a controversial new paper, she concludes that the new robot taxis are four to eight times as likely as a human-driven car to get into a crash. And that doesn't count the way self-driving vehicles are causing weird traffic jams, blocking emergency vehicles, and even stopping on top of a person who had already been hit by a human-driven car.

"In the paper that really pissed all the Tesla trolls off, I actually say that this is not just a Tesla problem — that Tesla is the first one to experience the problems," Cummings tells me. "For years I have been telling people this was going to happen, that these problems would show up in self-driving. And indeed they are. If anyone in the self-driving car community is surprised, that's on them."

It turns out that serving in the Navy is a very good way to train for inbound ire from Muskovites. In her 1999 memoir, "Hornet's Nest," Cummings recalls how she loved flying jets, and says the excitement of getting catapulted off an aircraft carrier — or landing on one — never got old. But the environment was far from welcoming. Sexual harassment in the Navy was routine, and male colleagues repeatedly told Cummings she wasn't qualified to fly fighters simply because she was a woman. When she and another female officer showed up at a golf tournament on base, they were told to put on Hooters uniforms and drive the beer carts. Cummings declined. 

Flying tactical engines of destruction also provided Cummings with a firsthand lesson in the hidden dangers of machines, automation, and user interfaces. On her first day of training, two pilots were killed. On her last day, the Navy experienced the worst training disaster that had ever taken place aboard a carrier. In all, during the three years that Cummings flew, 36 people died in accidents.

In 2011, while conducting research on robot helicopters for the Navy, Cummings had an epiphany. Even surrounded by nothing but open air, those helos were far from perfect — and they relied on the very same sensors that self-driving cars use while operating right next to other vehicles and pedestrians. "When I got in deep on the capabilities of those sensors," Cummings says, "that's when I woke up and said, whoa, we have a serious problem in cars."

Some of the dangers are technical. People get distracted, self-driving systems get confused in complicated environments, and so on. But other dangers, Cummings says, are more subtle — "sociotechnical," as she puts it. What she calls the "hypermasculine culture in Silicon Valley" intertwines with Big Tech's mission statement to "move fast and break things." Both bro culture and a disruptive mindset, as she sees it, incentivize companies to gloss over safety risks. 

All of which makes it even tougher for women when they level the kind of critiques that Cummings has. "When Elon Musk sicced his minions on me, the misogyny about me as a woman, my name — it got very dark very quickly," she recalls. "I think the military has made a lot of strides, but I do think that what's happening in these Silicon Valley companies is just a reminder that we haven't come as far in our society as I thought we would have."

An example: Last month, the head of safety at Waymo touted a new study from his company on LinkedIn. The research was unpublished and had not undergone peer review. But Waymo used the study to argue that its robot cars were actually much less likely to get into crashes than cars driven by biological organisms like you and me.

A self-driving taxi from Waymo blocks traffic in San Francisco. "They got complacent," says Cummings. "They lost their safety culture." Terry Chea/AP

Cummings wasn't having it. She had her new results — also still in preprint — which showed self-driving taxis to be way more crash-prone. So she went on LinkedIn, too, and said so.

The response was familiar to her from her days in the Navy. Kyle Vogt, the CEO of Cruise, slid into the comments. "I'd love to help you with this analysis," he wrote to Cummings, questioning her number-crunching. "Would be great to connect and discuss this further."

Cummings responded in kind. "I'd love to help you with your understanding of basic statistics, use of computer vision, and what it means to be a safe and responsible CEO of a company," she wrote. "Call anytime."

Women, she figures, caught her vibe. "Every woman who read that was like: Mmm-hmm, you go," Cummings says. But men — friends in Silicon Valley — did not. They thought she had been too mean to Vogt. "He was just trying to help you," they told her.

"All the guys read it like: She's such a shrew!" Cummings says. But, ever the fighter pilot, she was unfazed. "That's how I got my call sign," she says. "So I live with it."

So who's right: Cummings, or the self-driven men of Waymo and Cruise and Tesla? It's hard to tell, for a simple reason: The data on the safety of robot cars sucks. 

Take Cummings' approach in her new paper. First she had to wrestle with NHTSA's nationwide data on nonfatal crashes by human drivers to get numbers she could compare with California, the only place where the robot cars run free. Then she had to piece together comparable nonfatal crash counts and miles traveled for Waymo and Cruise, which are tracked by divergent sources. Her conclusion: mile for mile, Cruise racks up eight nonfatal crashes for every one by a human driver, and Waymo four — comparable to the crash rates of the fatigued and overworked drivers at ride-hail services like Uber and Lyft.
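The mechanics of that comparison boil down to simple rate normalization: count the crashes, divide by the miles, and compare. Here is a minimal sketch in Python, in which every figure is a hypothetical placeholder chosen only so the output mirrors the ratios above, not a number taken from Cummings' paper:

```python
# Back-of-envelope rate comparison in the spirit of Cummings' method.
# Every number here is a hypothetical placeholder, NOT a figure from her paper;
# the inputs are picked only so the output mirrors the ratios reported above.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by vehicle miles traveled."""
    return crashes / miles * 1_000_000

# Hypothetical inputs: nonfatal crash counts and miles traveled.
human_rate = crashes_per_million_miles(crashes=6_000_000, miles=3.0e12)  # stand-in for NHTSA-style national data
waymo_rate = crashes_per_million_miles(crashes=60, miles=7_500_000)
cruise_rate = crashes_per_million_miles(crashes=80, miles=5_000_000)

for name, rate in (("Waymo", waymo_rate), ("Cruise", cruise_rate)):
    print(f"{name}: {rate:.1f} crashes per million miles, "
          f"{rate / human_rate:.0f}x the human baseline")
```

The hard part, as Cummings found, isn't the division. It's assembling crash counts and mileage figures that are actually comparable in the first place.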

The purveyors of robot taxis argue that Cummings is wrong for a bunch of reasons. Chiefly, they say, the numbers for human crashes are actually undercounts. (Lots of fender benders, for instance, go unreported.) Plus, crash numbers for the whole country, or even just California, can't be compared to those for San Francisco, which is way denser and hillier than the state as a whole. Looked at that way, Cruise argued in a recent blog post, its taxis have been involved in 54% fewer crashes than cars driven by humans. The company also maintains that ride-hail drivers get into one nonfatal crash for every 85,027 miles of driving — 74% more collisions than Cruise's robots.
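One way to sanity-check Cruise's framing is to invert it. If ride-hail drivers crash once per 85,027 miles, and that is 74% more often per mile than Cruise's robots, the robots' implied figure follows directly. The inference below is mine, not a number Cruise has published:

```python
# Unpacking Cruise's claim: ride-hail drivers crash once per 85,027 miles,
# which Cruise says is 74% more collisions per mile than its robots have.
# The implied robot figure below is my inference, not Cruise's stated number.

ride_hail_miles_per_crash = 85_027
relative_excess = 0.74                                   # "74% more collisions"

ride_hail_rate = 1 / ride_hail_miles_per_crash           # crashes per mile
cruise_rate = ride_hail_rate / (1 + relative_excess)     # robots' implied rate

print(f"Implied Cruise figure: one crash per {1 / cruise_rate:,.0f} miles")
# -> one crash per roughly 147,947 miles
```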

Cummings ain't buying it. A blog post isn't science; it's a press release. "Every company has a fiscal interest in getting a paper out that makes them look good, and in the case of Cruise it makes rideshare drivers look bad," she says. "So that's what they're doing." This is exactly the sort of sociotechnical culture that Cummings is criticizing — that she's uniquely qualified to criticize.

Other experts also discount Cruise's claims, coming as they do from folks who are incentivized to welcome our new robot overlords. "If we were to believe the numbers Cruise is putting out there for ride-hailing drivers, those drivers would be having on average two crashes per year," says Steven Shladover, a research engineer at UC Berkeley's Institute of Transportation Studies. "How many drivers have two crashes every year? That is pretty extreme."

But Shladover is also skeptical of the numbers crunched by Cummings. "Missy is assuming a human driver crash rate that's too low for San Francisco, and Cruise is showing a human crash rate that's too high," he says. "The reality is probably somewhere in between."

So maybe Cummings is right, and self-driving cars are a menace. Or maybe it's not quite as bad as her new paper suggests. Until robot cars have traveled for hundreds of millions of miles, there's no way to get a statistically significant, unequivocal conclusion. But the bottom line is: It shouldn't matter. When the data on a product or device's safety is equivocal, regulatory agencies are supposed to make and enforce rules that protect consumers, just as they do in other industries. If the data on robot cars is equivocal or incomplete, then those rules should keep them off the road. The burden of proof is on Waymo and Cruise and Tesla, not Missy Cummings. And if those companies want to put 2-ton robots on public streets, blogging about data benchmarks isn't the way to show people they're ready.
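That "hundreds of millions of miles" threshold isn't rhetorical flourish. A rough Poisson back-of-envelope shows how slowly statistical confidence accumulates. This sketch is illustrative, not a calculation from Cummings' paper, and the baseline crash rate it assumes is just a round number:

```python
# Rough statistical-power sketch: how many miles must robot cars log before
# their crash rate can be distinguished from a human baseline?
# The baseline rate below is an assumed round number, purely illustrative.

def miles_needed(baseline_rate_per_mile: float,
                 detectable_relative_diff: float,
                 z: float = 1.96) -> float:
    """Miles required for a ~95% confidence interval tight enough to
    resolve a given relative difference. Crash counts are modeled as
    Poisson, so the relative half-width of the interval on a count of
    n crashes is roughly z / sqrt(n)."""
    crashes_needed = (z / detectable_relative_diff) ** 2
    return crashes_needed / baseline_rate_per_mile

# Assumption: one injury crash per 2.5 million human-driven miles.
baseline = 1 / 2.5e6

for diff in (0.5, 0.2, 0.1):
    print(f"To resolve a {diff:.0%} difference: "
          f"{miles_needed(baseline, diff) / 1e6:,.0f} million miles")
```

At that assumed rate, resolving even a 20% difference between robot and human drivers takes roughly 240 million miles of robot driving. Fatal crashes, being far rarer, would take billions.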

"One of the big things I'm on about now, pulling from my aviation years, is that all these companies need a chief AI pilot," Cummings says. "They need to have somebody, one person, who stands up and says, 'I'm responsible.' We do that right now for aviation. That's why so many heads rolled with the problems that happened with the Boeing 737 Max. They got complacent. They lost their safety culture."

Cummings is a careful researcher. She's also, as one transport-safety researcher put it privately, "provocative." She is more than happy to strafe companies like Tesla and Waymo and Cruise, and to argue that tech bros need to be brought inside a stricter regulatory framework. In a sense, she's Elon Musk's worst nightmare. She has repeatedly risked her life to test the incredible capabilities — and the lethal limits — of human-machine interfaces. And she did it in an environment where the stakes are far higher than the battlefields of Twitter and LinkedIn. To her, the safety of self-driving cars is not an abstract question. It's a matter of life and death.

"I'm a tenured professor. My work speaks for itself. I'm trying to save your life, right?" Cummings says. "And there's the side of me where I'm like Don Quixote on steroids. There's no windmill I don't want to tilt at."

Adam Rogers is a senior correspondent at Insider.

