Dave Coplin is trying to explain to me why people across two continents are suddenly allowing their employers to put microchips under their skin.
“I do this to my dog — why wouldn’t I do it to myself?” Coplin says. I’m not convinced, so he launches into an anecdote about a club on the Mediterranean party island of Ibiza where people could chip themselves and then use the chip to buy drinks. Coplin suspects this was because they weren’t wearing many clothes.
But chipping yourself because you’re half-naked and don’t have a pocket for your wallet is very different from allowing your employer to chip you. So, how did we get here?
Coplin, who heads a consultancy called the Envisioners, says there are real benefits for both employer and employee — if we can only get over our squeamishness. “If it adds value, I’m all for it,” he says. “Today we look at people doing it and it feels a bit weird, but in reality there is something inevitable about it.”
Patrick McMullan is president of Three Square Market in Wisconsin. After following experiments at Swedish incubator Epicenter in Stockholm, which has been experimenting with chipping since 2015, his company decided to develop the technology further. Naturally, as a supplier and a developer, McMullan has a chip implant himself — one roughly the size of a grain of rice implanted under the skin between his thumb and index finger. It’s based on near-field communication (NFC) technology — the same chips that are used in contactless credit cards or mobile payments. Implants are done quickly and simply with a syringe and very little blood.
One current limitation, McMullan says, is that because the chip is a passive device, there is no way it can be tracked. For now, that means the chip is for accessing the building, logging into computers, and paying for things from the canteen. But McMullan’s employees are on a mission “to change the world,” he says, and more than 70 of them so far have volunteered to be part of the experiment.
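In practice, the chip plays the role of a contactless badge: the reader scans the tag’s unique ID, and a backend system decides what that ID is allowed to do — open a door, unlock a workstation, charge a canteen purchase. A minimal sketch of that lookup (the UIDs and permission names below are hypothetical, not from Three Square Market’s actual system):

```python
# Sketch of badge-style access control for a passive NFC implant.
# The chip itself stores only a unique ID (UID); all policy lives on
# the server side — one reason a passive tag can't be tracked on its own.

ACCESS_TABLE = {
    # Hypothetical tag UIDs mapped to the actions each employee may perform.
    "04:A2:2B:19:66:80": {"door", "workstation", "canteen"},
    "04:7F:C1:03:12:55": {"door", "canteen"},
}

def authorize(uid: str, action: str) -> bool:
    """Return True if the scanned chip UID is permitted to perform the action."""
    return action in ACCESS_TABLE.get(uid, set())

print(authorize("04:A2:2B:19:66:80", "workstation"))  # True
print(authorize("04:7F:C1:03:12:55", "workstation"))  # False: no login rights
print(authorize("04:00:00:00:00:00", "door"))         # False: unknown chip
```

The design choice worth noting is that nothing sensitive lives on the implant: revoking access means deleting a row on the server, not cutting anything out of anyone’s hand.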
“I do this to my dog — why wouldn’t I do it to myself?”
The idea seems to be spreading. In addition to Three Square Market, at least 160 people have been chipped at Epicenter’s monthly “chipping parties.” Several staff members at CityWatcher.com, a surveillance company in Cincinnati, have gotten chips, as have some at a digital marketing company in Belgium called NewFusion. No doubt it’s good publicity, but chipping advocates genuinely believe this will become common practice over the next decade.
Chips will offer more benefits as the technology progresses, McMullan believes. “We are developing medical uses that will monitor vital signs. Doctors will be able to proactively treat patients rather than always react,” he says. McMullan believes the number of chipped employees worldwide will reach into the millions within a few years because the benefits of a sub-$100 chip are potentially huge.
McMullan sees no downside, despite obvious concerns that it feels perfectly dystopian to be intimately connected to your employer in a way that is hard to control or remove. Take his own idea of chips monitoring people’s health: There is an obvious advantage in future embedded technologies that could monitor cholesterol, blood sugar levels, or even just dehydration.
But what if someone had a chip to monitor alcohol intake as part of an agreement to quit? Would a surgeon be allowed to refuse to operate? Could an insurance company hike the patient’s premiums if they fell off the wagon? The question of what information could or should be gathered and where it could or should go will become far more complex as chips become more advanced and more widespread. And other experts have raised concerns about hacking, as well as known health problems already associated with similar chips used in pets.
“Obviously, privacy is a massive concern,” Coplin adds. “What will people do with the data? Who’s going to see it? In practical terms, it’s bad enough that I have to carry my phone around with me, and my wallet. If this gets around some of that, I’m up for it.”
Despite the concerns, many people seem to accept it’s going to happen — and quite rapidly. Lynda Shaw, PhD, a cognitive neuroscientist and author of Your Brain Is Boss, believes chipping is a natural progression that is likely to be more acceptable to young people.
“If you think of young men, when they’re teenagers, we often think of them as driving too fast, hotheaded,” Shaw explains. “In evolutionary psychology, that’s vital to have in society. In the old days, if a village’s crops failed, they would get the strongest young men to go and find food. They would go and find food by going beyond their usual areas and by being curious.” We may no longer be hunter-gatherers, Shaw’s theory goes, but young people will still test the boundaries, be curious, and do new things; it’s part of who they are.
In some ways, this is already an established technology, at least among people with health problems. We already use chips for cochlear implants and even for bypassing parts of the brain in the event of brain damage, Shaw points out. “Chipping the human body is not news, but there’s always the sinister side of us that says this is a bit too Orwellian,” she says. People might become worried about computer viruses living in their bodies or about what happens when and if the hardware becomes corrupted.
Rohit Talwar, futurist and CEO of the think tank Fast Future, sees chipping becoming widespread very quickly, particularly among tech companies that want to demonstrate they are forward thinking.
Chipping will also be used, Talwar says, among companies “who want very high security so people don’t get into systems or part of the building they shouldn’t, and who want to demonstrate to clients that they’re cutting edge in security terms. You might also see it being used as a way of enabling people to exchange money in canteens, vending machines — it will get rid of identity passes.”
Shaw sees benefits as well. If someone is ill and has a pacemaker or uses anticoagulant medication, making that information available with a quick scan could save their life. But she also points to darker implications for crime scenes. In regions where the crime rate is high and bodies turn up dismembered, Shaw notes that a criminal wouldn’t need the whole body to breach security, just the limb in which a chip had been embedded. “You could end up inadvertently inciting a more horrible crime than the one originally being contemplated,” she says.
Talwar’s view is that dystopia is in the eye of the beholder. A generation born as digital natives might see this as a natural evolution and plastic passes as old-fashioned, arcane, and certainly not able to capture the kind of information that a chip inside our bodies could capture about, say, health.
“Older generations may see this as terribly invasive,” Talwar says. “I was at an event last year where they were chipping people just for fun, and the lines were going down the corridor of people waiting to be chipped — for the story and for the experience.”
So, where is chipping going? Talwar sees it as part of an inevitable process; pioneers in the field have argued for some time that if humans are going to keep up with artificial intelligence, we will have to enhance our brains and bodies.
“This is just the start point of that process. You could easily predict your mobile phone memory being inserted into you, chips to accelerate your memory and your brain,” Talwar says. “We could see a massive acceleration in this as we move into enhancing and augmenting ourselves and stepping into the world of transhumanism.”
“You could end up inadvertently inciting a more horrible crime than the one originally being contemplated.”
Coplin sees chipping as part of a dialogue about how we relate to machines. He notes that one man in Australia who tried removing the chip from a travel card and embedding it in his hand failed because the terms and conditions said not to deface the card. “At the moment, it feels weird,” Coplin says, “but at the moment, I might have a device on my wrist that might have that technology. Why not a little further under my skin?”
Society has always contested the potential of technology and the changes it forces. A quarter-century ago, few people predicted how ubiquitous mobile phones would become — fewer still that we’d use them as cameras and music players. And now there are additional pressures on technologies.
“We’ve really lost trust with the people who handle our data—the banks, the Googles, the Facebooks,” Coplin says. “Until that trust is won back, we’re going to be very fearful of this kind of thing. And I think that’s a real shame because of the benefits we could have.”