Recently Waymo released a video of their vehicle navigating an intersection where the traffic signal was off and an officer standing in the middle of the road was directing traffic. The car waits, then, when the officer makes a small hand gesture, it proceeds through the intersection.
Not a big deal for a human driver, but an interesting milestone for a robocar. I say this because I have attended dozens of talks by people who are skeptics about the progress of robocars, and almost every such talk has included a slide of this situation and asked how the car is going to handle it – implying that this is an intractable problem.
The video Waymo offers does show one of the easier variants of this problem. In this case, the light is off. Waymo cars have a detailed map showing the location of every traffic signal where they drive. The exact 3-D location of each hanging set of lights, not just a note that there is a signal. They are able to look at the intersection and know where the lights should be, and in particular, look for a green light. While they understand the yellow and red lights, the important thing is the green. If you don't see the green, you should not go. And the car does this as it approaches this intersection. Then it sees and tracks the cop.
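The "look for the green" rule above can be sketched in a few lines. This is a hypothetical illustration, not Waymo's code: the names `SignalHead`, `observed`, and `may_proceed_on_signal` are invented, and the real perception stack is vastly more involved. The point is the conservative logic: only an affirmatively seen green grants passage; a dark or unreadable signal at a mapped location means the car falls back to treating the intersection like a 4-way stop.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SignalHead:
    """Mapped 3-D position of one hanging light housing (hypothetical)."""
    x: float
    y: float
    z: float

def may_proceed_on_signal(observed, mapped_heads):
    """observed: mapping of SignalHead -> detected lamp color ('green',
    'yellow', 'red'), or None when no lit lamp is seen at that position.
    Conservative rule: only a positively detected green allows passage."""
    return any(observed.get(head) == "green" for head in mapped_heads)

heads = [SignalHead(10.0, 2.5, 5.1), SignalHead(12.0, -3.0, 5.0)]

# Green seen at one mapped head: the signal grants passage.
assert may_proceed_on_signal({heads[0]: "green"}, heads)

# All mapped heads dark: do not go on the signal; treat as a 4-way stop.
assert not may_proceed_on_signal({h: None for h in heads}, heads)
```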
Today's machine-learning-based vision systems are not yet good enough to trust your life to in general situations. But they are good at looking for specific things and identifying them -- in many cases today, such as recognizing specific traffic signs, they are actually superior to the average human. And Waymo has apparently trained theirs to be good enough to recognize the officer, and the various gestures and body language one will see from somebody directing traffic. It doesn't have to be perfect -- humans aren't perfect -- because the car is going slowly and can correct any rare mistake it might make. In fact, even if there is no cop there, the law says to treat such an intersection like a 4-way stop, with everybody proceeding with caution when it is their turn. The robots have been able to handle that for a long time.
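The "it doesn't have to be perfect" point can be made concrete with a sketch of caution-first gesture handling. Everything here is invented for illustration (`act_on_officer_gesture`, the labels, the threshold): the idea is simply that any low-confidence reading from the classifier collapses to "stay stopped and keep watching", so a rare misclassification costs hesitation, not safety.

```python
def act_on_officer_gesture(label, confidence, threshold=0.9):
    """label: classifier output such as 'wave_through' or 'halt' (invented
    names); confidence: the classifier's score in [0, 1]. Returns the
    behavior the planner should adopt."""
    if confidence < threshold:
        return "hold"            # unsure: wait, as at a 4-way stop
    if label == "wave_through":
        return "proceed_slowly"  # go, slowly enough to correct a rare mistake
    return "hold"                # 'halt' or anything unexpected keeps us stopped

assert act_on_officer_gesture("wave_through", 0.97) == "proceed_slowly"
assert act_on_officer_gesture("wave_through", 0.55) == "hold"   # low confidence
assert act_on_officer_gesture("halt", 0.99) == "hold"
```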
Waymo does not say what would happen if the light were still operating, and the officer was superseding it -- for example, waving at you to go when the light is showing red. That might be a harder problem they haven't solved yet. But it's not intractable, and that was the mistake of the skeptics. There are solutions to these problems. Building them is far from easy -- Waymo and other teams are spending billions to do it -- but it's not impossible. And no doubt having to turn left at such an intersection might be more challenging. But things are going well.
Any reasonably developed car would still handle this situation safely, refusing to enter the intersection if a person (cop or otherwise) is standing in it or if other vehicles are crossing. What's important here is the car's ability not to be overly timid once it identifies that the officer has signaled it to go.
Waymo and most other large teams have another fallback in a situation like this. If a car finds it can't understand what's going on, it signals a control center where human operators are standing by. They can look out through the cameras, understand the scene, and then give strategic advice to the car, like when to go or where to turn. They do not steer the car like a remote drone pilot; they just solve its strategic problem. At worst, the car at rare moments behaves like the hesitant human drivers we see all the time and sometimes honk at. As long as that's rare, it's a good solution to problems that are hard for software to solve. This does not appear to be what was done in this video.
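The fallback above is a simple decision flow, sketched here with invented names (`resolve_scene`, `ask_operator`, the confidence threshold): when the car can't resolve the scene on its own, it asks a human for strategic advice -- "go" or "wait" -- while the car itself keeps doing all the actual driving, braking, and obstacle avoidance.

```python
def resolve_scene(onboard_confidence, ask_operator, threshold=0.8):
    """onboard_confidence: how sure the car is about the scene, in [0, 1].
    ask_operator: callable returning strategic advice such as 'proceed'
    or 'wait'; only consulted when onboard confidence is low. The operator
    answers a question; the car still steers and handles safety itself."""
    if onboard_confidence >= threshold:
        return "proceed"       # the car understood the scene on its own
    return ask_operator()      # human reads the cameras and advises

# Confident car never bothers the control center.
assert resolve_scene(0.95, lambda: "wait") == "proceed"

# Confused car defers to the operator's strategic call.
assert resolve_scene(0.3, lambda: "wait") == "wait"
assert resolve_scene(0.3, lambda: "proceed") == "proceed"
```

The design choice worth noting is that the human supplies only high-level intent, so latency and dropped connections degrade the car to hesitation rather than to loss of control.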
(Disclosure: The author worked on the Google Car project several years ago, and holds a small amount of Alphabet stock.)