The future of Amazon’s logistics network will undoubtedly involve artificial intelligence and robotics, but it’s an open question at what point AI-powered machines will be doing a majority of the work. According to Scott Anderson, the company’s director of robotics fulfillment, the point at which an Amazon warehouse is fully, end-to-end automated is at least 10 years away. Anderson’s comments, reported today by Reuters, highlight the current pace of automation, even in environments that are ripe for robotic labor, like an Amazon warehouse.
As it stands today, robots in the workforce are proficient mostly at specific, repeatable tasks for which they are precisely programmed. Getting a robot to do something else takes expensive, time-consuming reprogramming. And robots that can perform multiple different tasks, and operate in dynamic environments that require them to see and understand their surroundings, are still firmly in the realm of research and experimental trials. Even the seemingly simple process of identifying an object and picking it up without ever having seen it before requires complex, sophisticated software and hardware that does not yet exist commercially.
So while a robot can help manufacture a microchip or the body of a Tesla vehicle, it’s not capable of the full range of tasks that warehouse work requires. At Amazon facilities and other companies’ fulfillment centers, the bulk of the labor is still done by human hands, because it’s difficult to train robots to see the world and use robotic grippers with the dexterity of human workers.
But as part of the ongoing deep learning revolution that’s accelerated the progress of AI research over the last decade, robots are starting to gain levels of vision and motor control that approach human sophistication. Amazon is one of the companies pioneering such robots, and it’s held an annual “picking challenge,” named after the warehouse term for picking up one object and moving it to another part of the logistics chain, to promote advances in the field.
A number of other companies and research labs have been making progress on that front, too. UC Berkeley has a robotics lab that’s made substantial progress in the field, and its new low-cost robot, Blue, a pair of humanoid arms controlled by a central system, can perform complex manual tasks like folding a towel thanks to an AI-powered vision system. Research lab OpenAI has similarly been using an AI training technique known as reinforcement learning to teach a robotic hand more precise and elegant movements, the types of motion a robot would need to replicate human work in a warehouse. Kindred, a San Francisco-based startup, makes a robotic arm called Kindred Sort, deployed in warehouses for the retailer Gap, that uses a mix of human piloting and automation to perform dynamic product picking.
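To give a sense of what reinforcement learning means in practice, here is a minimal Q-learning sketch: an agent learns, purely from trial-and-error reward, to move down a toy one-dimensional “aisle” to reach an item. The environment, rewards, and hyperparameters are illustrative assumptions for this article, not anything Amazon, OpenAI, or Kindred actually uses; real robotic RL works on continuous sensor and motor spaces and is vastly harder.

```python
import random

# Toy "warehouse aisle": states 0..4, the item to pick is at state 4.
# Purely illustrative; not a real robotics environment.
N_STATES = 5
ACTIONS = [-1, +1]  # move left or right along the aisle
GOAL = 4

def step(state, action):
    """Apply an action; reward 1.0 only when the goal is reached."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action index]
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Epsilon-greedy: mostly exploit the best-known action,
            # occasionally explore a random one.
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = 0 if q[state][0] >= q[state][1] else 1
            nxt, reward, done = step(state, ACTIONS[a])
            # Standard Q-learning update rule.
            q[state][a] += alpha * (reward + gamma * max(q[nxt]) - q[state][a])
            state = nxt
    return q

q = train()
# After training, the greedy policy should move right, toward the item.
policy = ["right" if q[s][1] > q[s][0] else "left" for s in range(N_STATES - 1)]
print(policy)
```

The agent is never told where the item is; it discovers the “move right” policy only from the reward signal, which is the core idea behind training robotic hands with reinforcement learning.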
According to Reuters, Amazon has 110 warehouses in the US, 45 sorting centers, and roughly 50 delivery stations, which together employ more than 125,000 full-time warehouse workers. But only a fraction of that work is performed by robots. Right now, robots are simply too imprecise and clumsy, and they require too much training to be deployed on factory floors outside very narrow use cases.
For instance, Amazon uses small, Roomba-shaped robots simply called “drives” mostly to deliver large stacks of products to human workers by following set paths around the warehouse. “In the current form, the technology is very limited. The technology is very far from the fully automated workstation that we would need,” Anderson told Reuters, which toured an Amazon warehouse in Baltimore earlier today.