Around the end of each year major dictionaries declare their “word of the year.” Last year, for instance, the most looked-up word at Merriam-Webster.com was “justice.” Well, even though it’s early, I’m ready to declare the word of the year for 2019.
The word is “deep.”
Why? Because recent advances in the speed and scope of digitization, connectivity, big data and artificial intelligence are now taking us “deep” into places and into powers that we’ve never experienced before — and that governments have never had to regulate before. I’m talking about deep learning, deep insights, deep surveillance, deep facial recognition, deep voice recognition, deep automation and deep artificial minds.
Some of these technologies offer unprecedented promise and some unprecedented peril — but they’re all now part of our lives. Everything is going deep.
Which is why it may not be an accident that one of the biggest hit songs today is “Shallow,” from the movie “A Star Is Born.” The main refrain, sung by Lady Gaga and Bradley Cooper, is: “I’m off the deep end, watch as I dive in. … We’re far from the shallow now.”
We sure are. But the lifeguard is still on the beach and — here’s what’s really scary — he doesn’t know how to swim! More about that later. For now, how did we get so deep down where the sharks live?
The short answer: Technology moves up in steps, and each step, each new platform, is usually biased toward a new set of capabilities. Around the year 2000 we took a huge step up that was biased toward connectivity, because of the explosion of fiber-optic cable, wireless and satellites.
Suddenly connectivity became so fast, cheap, easy for you and ubiquitous that it felt like you could touch someone whom you could never touch before and that you could be touched by someone who could never touch you before.
Around 2007, we took another big step up. The iPhone, sensors, digitization, big data, the internet of things, artificial intelligence and cloud computing melded together and created a new platform that was biased toward abstracting complexity at a speed, scope and scale we’d never experienced before.
So many complex things became simplified. Complexity became so fast, free, easy to use and invisible that soon, with one touch on Uber’s app, you could hail a taxi, direct a taxi, pay a taxi, rate a taxi driver and be rated by a taxi driver.
Over the last decade, these advances in the speed of connectivity and the elimination of complexity have grown exponentially: as big data got really big, as broadband got really fast, as algorithms got really smart and as 5G actually got deployed, artificial intelligence got really intelligent. So now, with no touch at all, just a voice command or machines acting autonomously, we can go so much deeper in so many areas.
Scientists and doctors can now find the needle in the haystack of health data as the norm, not the exception, and therefore see certain disease patterns that were never apparent before. Machines can recognize your face so accurately that the Chinese government can punish you for jaywalking in Beijing, using street cameras, and you will never encounter a police officer.
Indeed, with today’s facial recognition technology, I can dispense with the card reader at my office’s security gate and instead use each employee’s face as an ID. And cars can drive on their own.
DeepMind, the artificial intelligence arm of Google’s parent, developed an A.I. program, AlphaGo, that has now defeated the world’s top human players of the ancient strategy game Go — which is much more complex than chess — by learning from human play.
As The Times reported, DeepMind “showed yet another way that computers could be developed to perform better than humans in highly complex tasks” and to “mimic the way the brain functions.” DeepMind’s next breakthrough, AlphaZero, did not even need to learn from humans. It learned even faster by repeatedly playing against itself!
Today “virtual agents” — using conversational interfaces powered by artificial intelligence — can increasingly understand your intent when you call the bank, credit card company or insurance company for service, just by hearing your voice.
It means machines can answer so many more questions than nonmachines, also known as “humans.” The percentage of calls a chatbot, or virtual agent, is able to handle without turning the caller over to a person is called its “containment rate,” and these rates are steadily soaring. Soon, automated systems will be so humanlike that they will have to self-identify as machines.
Automation is also going deep, fast. The Times’s Kevin Roose quoted Mohit Joshi, the president of Infosys, a technology firm that helps other businesses automate their operations, as saying in Davos last week: “People are looking to achieve very big numbers. Earlier they had incremental, 5 to 10 percent goals in reducing their work force. Now they’re saying, ‘Why can’t we do it with 1 percent of the people we have?’”
But bad guys, who are always early adopters, also see the same potential to go deep in wholly new ways.
They can fake your face and voice so well that they can create a viral YouTube video of you saying racist things, or make it look like the president of the United States just announced a nuclear attack on Russia. They can fake a bank manager’s voice so convincingly that it can call your grandmother and ask her to transfer $10,000 to an account in Switzerland — and she’ll do it, and you’ll never catch them in time.
That’s why the adjective that so many people are affixing to all of these new capabilities to convey their awesome power is “deep.”
On Jan. 20, The London Observer looked at Harvard Business School professor Shoshana Zuboff’s new book, the title of which perfectly describes the deep dark waters we’ve entered: “The Age of Surveillance Capitalism.”
“Surveillance capitalism,” Zuboff wrote, “unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioral futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior.”
Unfortunately, we have not developed the regulations or governance, or scaled the ethics, to manage a world of such deep powers, deep interactions and deep potential abuses.
Two quotes tell that story: Last April, Senator Orrin Hatch was questioning Facebook C.E.O. Mark Zuckerberg during a joint hearing of the Senate Commerce and Judiciary Committees. At one point Hatch asked Zuckerberg, “So, how do you sustain a business model in which users don’t pay for your service?”
Zuckerberg, clearly trying to stifle a laugh, replied, “Senator, we run ads.” Hatch did not seem to understand that Facebook’s business model is to mine users’ data and then run targeted ads — and Hatch was one of Facebook’s regulators.
But then Zuckerberg was also clueless about how deep the powers of the Facebook platform had gone — deep enough that a few smart Russian hackers could manipulate it to help Donald Trump win the presidency.
When faced with evidence that fake news spread on Facebook influenced the outcome of the 2016 election, Zuckerberg dismissed that notion as a “pretty crazy idea.” It turns out that it was happening at an industrial scale and he later had to apologize.
Regulations often lag behind new technologies, but when technologies move this fast and cut this deep, that lag can be really dangerous. I wish I thought that catch-up was around the corner. I don’t. Our national discussion has never been more shallow — reduced to 280 characters.
This has created an opening and burgeoning demand for political, social and religious leaders, government institutions and businesses that can go deep — that can validate what is real and offer the public deep truths, deep privacy protections and deep trust.
But deep trust and deep loyalty cannot be forged overnight. They take time. That’s one reason this old newspaper I work for — the Gray Lady — is doing so well today. Many people, though not all, are desperate for trusted navigators.
Many will also look for that attribute in our next president, because they sense that deep changes are afoot. It is unsettling, and yet, there’s no swimming back. We are, indeed, far from the shallow now.