Just before Christmas, a passenger using the Waymo One ride-hailing service found himself in a self-driving car that seemed to have lost its way.
The human safety driver – present just in case something goes wrong – had already taken control of the self-driving car once, when it took too long to make a left turn. After the car resumed driving itself, the passenger – whom we’ll refer to as “Carl” to protect his identity – realized he was cruising right past the destination he’d entered in the Alphabet-owned company’s Uber-style app.
Carl’s stop was off to the right, but there were no obvious openings that Waymo’s autonomous vehicle could take to move into the right lane. Instead of merging, it continued an extra quarter-mile down the street, eventually making a legal U-turn so it could take advantage of a protected left turn before dropping him off.
“It didn’t even turn on a turn signal and hope that someone let it in,” Carl told Futurism. “It just stayed in the left lane and went all the way down past my destination.”
On an early January trip with the same pickup and dropoff locations, the autonomous vehicle avoided the whole mess by taking the freeway.
Carl, who has used Waymo’s self-driving car services periodically over the past five months, shared his insights with Futurism on the condition of anonymity, as he had to sign a nondisclosure agreement to take part in an earlier, experimental version of the autonomous ride-hailing service.
Overall, Carl is optimistic and excited about self-driving technology. But he also emphasized that Waymo’s new service doesn’t yet fully live up to the marketing and hype. Back in December, when Waymo announced an Uber-style service called Waymo One that would shuttle passengers around Phoenix, Arizona, it made bold promises.
Passengers – who would initially come from an applicant pool of participants in Waymo’s “early rider” test program – would be able to call a self-driving minivan at any time of the day or night, and safety drivers would be there only “for riders’ comfort and convenience,” the company claimed back when it announced Waymo One. Eventually, the safety drivers might even chill in the back seat.
But Carl’s experiences using the services in the patch of downtown Phoenix where they’re allowed to operate tell a different story — one in which Waymo’s self-driving cars struggle to make left turns, change lanes, or merge onto the freeway. In general, they run into difficulty with tasks that require a human-like sense of intuitive timing and an understanding of other drivers’ intentions, not just the ability to spot signs, obstacles, and road markings, a skill at which Waymo’s vans have become adept.
And as for the driver chilling in the back seat? For now, a Waymo employee remains in the driver’s seat during every Waymo One trip and regularly takes over when the van gets confused. Safety drivers are instructed to intervene whenever they judge it safer to do so than to let the car continue driving itself.
A Waymo spokesperson declined to comment on the record about most specific incidents described in this story, instead corroborating some information on background.
For now, Waymo’s early rider program and Waymo One are impressive demos. But the services are still heavily mediated by human employees and don’t quite provide the driverless experience the company initially promised.
Glitches aside, Carl wants to live in a world where robots and autonomous cars handle dangerous tasks like driving, an activity that led to 37,461 crash-related deaths in America in 2016, and is optimistic that Waymo’s self-driving technology is on its way there. Carl was inspired by movies like “Minority Report,” where a centralized, well-managed network of autonomous cars seamlessly shuttles people around.
“It seems so safe,” he said. “I think that future is a long way off, but it’s something that I hope for.”
That future seemed to be especially far off back in March, when one of Uber’s autonomous vehicles struck and killed a pedestrian in a tragic crash that was later revealed to have involved a distracted safety driver. One might assume this sort of situation is exactly why Waymo embraces such a cautious approach – one that could be a sign of things to come for the oft-criticized self-driving car industry.
The technology that Waymo has and continues to develop is impressive, and Waymo One could still someday grow into the seamless, autonomous ride-hailing app of the future that the company first envisioned. The question of when that might happen comes down to how quickly Waymo can ensure that any new features are completely safe and reliable.
“I do trust the car,” said Carl. “I do trust that it’s going to get me there, and I trust that the person behind the wheel is going to keep me safe in the meantime.”
But with that trust comes an understanding that the autonomous vehicles drive very differently from people. For one thing, they are less aggressive than your typical Phoenix drivers, whom Carl characterized as notoriously antagonistic, and who have even reportedly taken to throwing rocks at self-driving cars.
Back in December, Ars Technica spoke to Michael Richardson, a member of the experimental Waymo early rider program who used the service in September and October. Like Carl, Richardson described a mostly-functional self-driving vehicle service that occasionally struggled with left turns and changing lanes.
“Our vehicles complete unprotected left turns in autonomous mode regularly,” a Waymo spokesperson told Ars in response to Richardson’s claims. “However, we’ve always said that unprotected lefts on high-speed roads are among the most difficult driving maneuvers. As our technology is new, we are going to be cautious because safety is our highest priority.”
Operating cautiously also often means sacrificing expediency. Several months after Richardson’s account, those problems persist in the Waymo One program, according to Carl.
Merging onto a highway is a nuisance for any driver. But if you signal and maybe even edge your car into the lane a little, someone will likely let you in. For now, Waymo’s vehicles are still struggling with those skills.
“The first time I took it, it got onto the freeway on-ramp,” said Carl, “but people are dicks.”
No one gave the car room to merge onto the freeway, and it never used its turn signal to alert drivers that it wanted to do so. Instead, Carl said, the car sped up to the freeway speed limit, seemingly intending to shift lanes when an opening appeared. When that didn’t happen, the car exited the freeway without switching lanes and tried again at the next ramp, where it finally found an opening. Carl said it was several months before he was in another Waymo vehicle that attempted to take the freeway, even when doing so would have been the obvious route.
“It didn’t try to slow down and find an opening or anything,” Carl said. “It just kind of passed everybody ’cause they were moving kinda slow. That was kind of weird in my opinion. It has never taken the freeway since.”
Based on Carl’s account, it would seem Waymo programmed its autonomous vehicles to treat turn signals as an indicator that the car is about to execute a turn or merge, not an indicator that it is looking for the opportunity to do so.
“If there’s other cars there, it’s not going to [merge],” Carl said. “And then it doesn’t even turn on the turn signal and wait for someone to give it an opening.”
However, according to Waymo’s account of its service, the company’s self-driving vehicles are supposed to, and regularly do, slow down and signal before changing lanes.
In situations that confuse the car, Carl said, the human driver sometimes takes over. During one trip, the driver had to briefly turn off autonomous mode because the car was too hesitant to make a left turn at an intersection where it had a green light but no separate light for left turns.
Another time, in mid-November, Carl used a Waymo car to get home late at night. When he got into the autonomous vehicle, the safety driver drove the whole time, not once switching on the car’s self-driving capabilities.
Aside from the minor annoyance of having a trip take longer than it should, these stories suggest the company’s self-driving cars could be inadvertently holding up traffic as well.
Both Carl and Richardson reported riding in a Waymo vehicle that got stuck behind a city bus that pulled over to load and unload passengers while still partially blocking the lane. Richardson told Ars that his safety driver took over to steer around the bus, but Carl’s vehicle sat and waited.
Carl told us that his vehicle stopped before crossing an intersection because, on the far side of the crossing, a bus had pulled into a bus stop. Meanwhile, a growing queue of irate drivers sat stuck behind the hesitant car, which was unwilling to go around the bus.
“People behind it were honking because they probably wanted to turn right — it was the right lane,” Carl told us. “And they were just honking at this car, like that makes a difference to the car.”
A stopped bus is a minor annoyance for a human driver, but briefly swerving around and continuing onward is second nature. That’s not the case for an autonomous vehicle programmed to treat lane dividers as impenetrable borders.
“If there’s an object obstructing the road and it’s safe to do so, the vehicle can steer around it provided that it’s within the law and safe to do so,” a Waymo spokesperson told Futurism. “But if it were something like flashing lights on a bus, you’re not supposed to pass those because there may be schoolchildren.”
Carl’s challenges with his Waymo experience weren’t limited to what happened outside the car. The rules the safety drivers are told to follow can make sharing the confined space of the test vehicles awkward.
Carl found it difficult to strike up a conversation with many of the safety drivers — something that passengers were told not to do during onboarding — and often felt awkward. Passengers were also told not to record video during rides or ask drivers technical questions.
It became even harder to chat with drivers when Waymo added a second employee to some vehicles. Waymo still runs its early rider program in parallel with Waymo One. In the experimental early rider program, Waymo tests out new features that may make it into Waymo One, including different safety driver arrangements. For several of Carl’s rides, a safety driver sat behind the wheel while a second employee with a laptop computer rode in the passenger seat.
In the moment, Carl wondered whether the drivers were being monitored, but this time he chose not to ask in case striking up a conversation might get the drivers in trouble.
But Carl once rode with a driver who was not only willing to chat, but who said he hadn’t heard that they weren’t supposed to. This driver, Carl said, explained that he talked with passengers every now and then, especially when the car was stuck in traffic or a dust storm rendered its computer vision system inoperable. Even then, Carl had to take the initiative to break the silence.
“I said ‘Hey, are you allowed to talk to me?'” Carl said, recapping the conversation he had with a driver. “And he’s like, ‘Yeah, why wouldn’t I be?’ I said, ‘Well, in the onboarding, it said I wasn’t supposed to talk to you,’ and he was kind of surprised by that. He said ‘Oh, well, we’re supposed to give the driverless experience, so I guess that’s why.'”
Waymo encourages passengers to push a button and chat with rider support instead of their drivers, in part to simulate a driverless experience but also so that Waymo can directly learn more about the passengers’ experience, gather data, and answer questions in real time. Further, safety drivers don’t have the authority to take the car anywhere off its predetermined path. Instead, Carl said passengers had to contact a technician in a central office in order to change their destination.
It would be easier for riders to pretend they were alone in the car, though, if the safety drivers weren’t constantly talking to the vehicle itself.
Carl told us that every time the car runs into trouble, whether it takes too long to make a left turn or struggles with an on-ramp, the driver records a voice log into one of the many radios in the vehicle. That way, the data collected as Waymo’s vehicles drive around is supplemented by a time-stamped record of the driver’s observations that the company can use to make improvements later on.
These voice logs, along with data collected from the ongoing early rider program, can be used to develop, test, and refine experimental features that may slowly but surely trickle into the commercial Waymo One service. And perhaps that pace is okay — it’s these incremental advances that will help bring us to a reality with ubiquitous self-driving cars.
The awkward human interactions, however, are here to stay for the foreseeable future.
For a while, Carl thought that when his driver spoke, he was merely commenting on the vehicle’s performance, not unlike how people chat about the weather during a long elevator ride. But every time Carl joined in what he thought was a conversation, the driver would immediately shut it down.
“One of the first times it was pulling up in front of my house, I walked up and it slammed on its brakes because it thought I was going to walk in front of it,” Carl told us. “So I opened the door and the guy said something about how it had slammed on its brakes. And I was like, ‘Oh, yeah. Sorry. That was my bad.’”
Immediately, without turning his head, the Waymo safety driver replied.
“Oh, I’m not talking to you.”