Fully driverless Waymo taxis are due out this year, alarming critics
By Timothy B. Lee
Waymo, Google's self-driving car project, is planning to launch a driverless taxi service in the Phoenix area in the next three months. It won't be a pilot project or a publicity stunt, either. Waymo is planning to launch a public, commercial service—without anyone in the driver's seat.
And to date, Waymo's technology has gotten remarkably little oversight from government officials in either Phoenix or Washington, DC.
If a company wants to sell a new airplane or medical device, it must undergo an extensive process to prove to federal regulators that it's safe. Currently, there's no comparable requirement for self-driving cars. Federal and state laws allow Waymo to introduce fully self-driving cars onto public streets in Arizona without any formal approval process.
That's not an oversight. It represents a bipartisan consensus in Washington that strict regulation of self-driving cars would do more harm than good.
"A very difficult task"
"If you think about what would be required for some government body to examine the design of a self-driving vehicle and decide if it's safe, that's a very difficult task," says Ed Felten, a Princeton computer scientist who advised the Obama White House on technology issues.
Under both Barack Obama and Donald Trump, the federal government has taken a hands-off approach to driverless car regulation. Instead of enacting new safety regulations for self-driving cars, Felten says, federal policies have tried "to make sure that vehicle safety regulations don't inadvertently make it more difficult to roll out self-driving vehicles."
Self-driving cars do need to comply with an existing set of safety regulations called the Federal Motor Vehicle Safety Standards. But that's not a big hurdle in practice. Waymo plans to address it by simply building its service using Chrysler Pacifica vans that are already FMVSS-compliant.
Meanwhile, Congress is considering legislation that would make it easier for companies to manufacture driverless vehicles that aren't fully FMVSS-compliant. This would allow GM to start making a car with no steering wheel as early as next year.
This hands-off regulatory approach drives some safety advocates crazy.
"I think it's stunning," says Cathy Chase, the head of the Advocates for Highway and Auto Safety, about the deregulatory trend.
Mary "Missy" Cummings, an engineering professor at Duke, agrees. "I don't think there should be any driverless cars on the road," she tells Ars. "I think it's unconscionable that no one is stipulating that testing needs to be done before they're put on the road."
But so far these advocates' demands have fallen on deaf ears. Partly that's because federal regulators don't want to slow the introduction of a technology that could save a lot of lives in the long run. Partly it's because they believe that liability concerns give companies a strong enough incentive to behave responsibly. And partly it's because no one is sure how to regulate self-driving cars effectively.
When it comes to driverless cars, "there's no consensus on what it means to be safe or how we go about proving that," says Bryant Walker Smith, a legal scholar at the University of South Carolina.
Other industries face stricter regulation
To get an idea of what a more robust regulatory framework might look like, it's helpful to look at how the federal government regulates other complex, safety-critical technologies. Cummings did just that in a recent paper that compares the regulation of airplanes, medical devices, and cars.
An aircraft manufacturer is expected to meet with the Federal Aviation Administration while an airplane is still on the drawing board. Together, they draw up a plan that includes "a project timeline, checklists for moving on to the next phases, means of compliance, testing plans, and other project management information," Cummings writes. The FAA stays involved throughout the development process, verifying that the agreed-upon tests are passed before allowing a new airplane onto the market.
While the FAA is involved in aircraft design from the outset, the Food and Drug Administration typically waits until a medical device is ready for testing on human patients before getting involved.
Prior to clinical trials for a first-in-class device, a device maker must submit detailed technical information to the FDA, including a "device description, drawings, components, specifications, materials, principles of operation, analysis of potential failure modes, proposed uses, patient populations, instructions, warnings, training requirements, clinical evaluation criteria and testing endpoints, and summaries of bench or animal test data or prior clinical experience." The device maker then conducts the tests, submits the data to the FDA, and waits for FDA approval before bringing the product to market.
Waymo won't have to do anything remotely like this. The company has had informal discussions with government officials at the federal, state, and local levels. But there is no formal process requiring the company to submit information about its technology and test results to regulators in Phoenix or Washington. The law simply doesn't require Waymo to prove that its driverless technology is safe before putting cars on the road.
The rigorous processes for approving airplanes and medical devices come at a high cost. Cummings says that it can take as much as eight years to bring a new airplane to market. A survey of medical device companies found that it took an average of four-and-a-half years for the FDA to sign off on a new medical device (the FDA's process is shorter than the FAA's partly because the agency doesn't get involved until after the product has been developed). And it cost $94 million, on average, to bring a new medical device to market.
Self-driving car advocates argue that slowing down the development of self-driving cars could ultimately cost more lives than it saves. In 2016, more than 37,000 people died in highway crashes, most of them caused by human error. Self-driving cars therefore have the potential to prevent thousands of highway deaths in the coming years.
Even safety advocates like Chase and Cummings don't necessarily want to see cars subjected to the kinds of comprehensive regulations imposed on aircraft and medical device makers. But they'd like to see the government take a more active role in testing self-driving cars—before they're allowed on public roads.
But Princeton's Ed Felten questions whether that's realistic. He points out that there are unique challenges to testing self-driving cars.
"The rate of accidents or fatalities per vehicle mile is required to be very, very low," he says. Human drivers get in a fatal accident about once per 100 million miles. So to determine whether a driverless system is as safe as a human driver, you have to figure out how it handles rare situations—situations that a typical driver might only encounter once in a lifetime.
"It requires a huge amount of simulation and road testing under controlled conditions in order to know if it's safe," Felten argues. "It's hard to imagine the government doing that."
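The scale problem Felten describes can be made concrete with a back-of-the-envelope calculation. One common approach is the statistical "rule of three": if zero events are observed in N trials, the 95 percent upper confidence bound on the event rate is roughly 3/N. The sketch below applies that rule to the article's figure of one fatal crash per 100 million human-driven miles; the fleet-size numbers at the end are illustrative assumptions, not anyone's actual test program.

```python
# Back-of-the-envelope: how many fatality-free test miles would a fleet
# need to log to show, with ~95% confidence, that its fatality rate is
# no worse than a human driver's? Uses the statistical "rule of three":
# zero events in N trials => 95% upper bound on the rate is about 3/N.

HUMAN_FATALITY_RATE = 1 / 100_000_000  # ~1 fatal crash per 100M miles

def miles_needed(target_rate: float, confidence_factor: float = 3.0) -> float:
    """Fatality-free miles required so the ~95% upper confidence bound
    on the system's fatality rate falls at or below target_rate."""
    return confidence_factor / target_rate

required = miles_needed(HUMAN_FATALITY_RATE)
print(f"{required:,.0f} fatality-free miles")  # 300,000,000

# Illustrative fleet math (assumed numbers): at an average of 40 mph,
# 300 million miles is 7.5 million hours of driving. A hypothetical
# fleet of 100 cars running around the clock would need over 8 years.
hours = required / 40
years_for_100_cars = hours / 100 / (24 * 365)
print(f"{years_for_100_cars:.1f} years for a 100-car fleet")
```

This is why Felten argues that simulation and controlled testing have to carry most of the load: demonstrating human-level safety purely through on-road statistics would take an impractically long time.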
Chase, for example, advocates giving driverless cars a "vision test" to demonstrate that they "can see and respond to what's happening on the roads." But it's not clear how much value this would have in practice. Having this kind of perception capability is certainly necessary for a fully self-driving car, but it's far from sufficient to prove that driving software will be as safe as a human driver.
And while Cummings told me that "there has never been any kind of real-world testing" of Waymo's cars, that doesn't seem quite fair to Waymo. Last year I traveled to Waymo's testing facility in California, where the company has put its cars through their paces in hundreds of controlled scenarios. I watched a Waymo car slow down as another car cut it off, stop as "movers" dropped a pile of boxes in its path, and yield to a car backing out of a driveway.
Maybe federal regulators would come up with some tests that Waymo's engineers haven't tried yet. But overall it's hard to believe that the feds would come up with a more rigorous battery of tests than the tests Waymo is already conducting in the California desert and elsewhere.
Ultimately, the only way to find out how a self-driving car will perform on real public streets is to test it on real public streets.
Waymo could be more transparent
One place where Waymo's critics absolutely have a point, however, is that Waymo hasn't been very transparent about its testing.
Consistent with the Trump administration's deregulatory approach, companies are encouraged, but not required, to file a report detailing the safety features of their self-driving cars. Waymo was the first company to file a report like this last year.
When I talked to Cathy Chase in August, she was scathing about safety reports filed by Waymo and other carmakers. "They look more like glossy marketing brochures, rather than providing data," she said.
"We need 'waymo' information," said Henry Jasny, a lawyer at the Advocates for Highway and Auto Safety, shortly after Waymo published its safety report last year.
In an emailed statement, Waymo argued that its 43-page report actually provided a wealth of information about the safety of its vehicles.
"We are committed to educating the public and policymakers about this new technology, which we believe could save thousands of lives, and the rigorous process we’ve put in place to test it," a spokeswoman wrote by email last week. She argued that Waymo's safety report "provides great detail about our testing programs, both in the text portions on testing and the appendices that specifically explain particular types of tests we conduct, that was designed to be informative for policymakers and the general public."
It's true that Waymo's safety report includes a long list of test scenarios Waymo has performed. For example, Waymo says that its cars can handle a scenario it labels "fully self-driving vehicle approaches lead vehicle decelerating."
But the safety report doesn't include the kind of detailed information that would allow for independent analysis of Waymo's testing process. What were the exact parameters of the test? How many times was it run? How did the vehicle perform? The report doesn't say.
And the same point applies with even more force to Waymo's testing on public roads. Waymo has conducted nine million miles of public road testing—a lot of them with safety drivers behind the wheel. But the public has very little information about how these tests have gone. That's especially true in Arizona, which (unlike neighboring California) does not require self-driving car companies to file regular reports on their testing activities.
In late August, for example, The Information published an article reporting that some Phoenix-area residents were frustrated at having to share the roads with Waymo vehicles that hesitated in situations where a human driver wouldn't. A Mountain View resident has posted a series of videos showing Waymo cars seeming to freeze up in situations that would not have confused a human driver.
Do these reports mean there are widespread problems with Waymo's software? Or are a few isolated incidents being blown out of proportion? It's hard to tell without comprehensive data about the real-world performance of Waymo's vehicles. Waymo undoubtedly has this kind of data; it just hasn't made it available to the public.
My own hunch is that Waymo will ultimately prove naysayers wrong. The company started developing self-driving technology long before its rivals, so it has had the luxury of taking a slow and steady path toward commercialization.
But testing is always going to be more rigorous and trustworthy with independent oversight. The public has every reason to be skeptical unless and until Waymo makes an affirmative case—backed up by comprehensive data and independent analysis—that its vehicles are safe. Current federal and state law doesn't require Waymo to make that case. But it might be in Waymo's interest to make it anyway.
"Providing more information about how their vehicles are performing would be to their benefit because ultimately you want people to trust the cars, and the way you imbue that trust is by proving they're safe," Chase says.
If Waymo launches a commercial service without releasing significant performance data or allowing independent review of its technology, that will set a precedent that will make it easier for other companies—perhaps less scrupulous companies—to do the same thing.
Under current rules, "anyone can put an autonomous vehicle on the road," Chase says. "Joe's Garage can build one, put it on the road, then there's a horrific crash."
The trustworthy company model
If formal FDA-style testing isn't realistic, what could regulators do instead? Bryant Walker Smith advocates what he calls a "trustworthy company" model for regulating self-driving cars. Instead of writing prescriptive, technology-focused standards for driverless cars, he says, regulators should focus on validating car companies' own processes for developing and testing driverless cars.
Smith would like to "have governments say: are these companies making a credible case? Are they candidly communicating? Does the company support their assertions?"
"Regulation is not just a rule or a prospective approval," Smith notes. "Regulation is all of the tools available to governments: investigations, inquiries, recalls, prosecutions for misrepresentations to governments."
In this model, the regulator's focus would not so much be on directly evaluating the technology, but instead on making sure that the companies building driverless cars have a corporate culture and a set of processes that prioritize safety. Regulators would press companies to make a thorough, evidence-backed case for the safety of their vehicles.
So far, Waymo hasn't really done this—at least not publicly. Ever since its launch almost two years ago, Waymo has emphasized the safety of its technology. "Safety is the core of Waymo’s mission and everything we do," a Waymo spokeswoman told me via email.
But the company hasn't released much data to back up its safety claims. We know Waymo has logged millions of miles on Arizona roads, but we know very little about how its vehicles have performed.
Waymo needs not just to build safe technology but also to convince the public that its technology is safe. Being more transparent about both the technology and its testing efforts could help.