In an attempt to spot vulnerabilities in its system before bad actors exploit them, Facebook has hired a team of ex-intelligence officers, researchers, and media buyers, and set them loose on its products.
Facebook calls this group the "Investigative Operations Team" and has directed its members to find the worst possible things that can be done using Facebook, and to help the company prevent them.
The group, whose existence is being revealed for the first time in this story, is testing Facebook’s advertising systems, pages, Instagram, Messenger, and more. Facebook told BuzzFeed News the team is searching for troubling behavior in countries like Myanmar, examining keywords and other signals that could be used to promote violence. And it’s investigating Facebook’s merchant tools, attempting to spot problematic product sales.
“What we have now is a series of people who are truly looking for how could you possibly do something wrong,” Facebook Business Integrity Director Lynda Talgo told BuzzFeed News. Because the team is still being built out, she declined to specify the number of people on it, or the bad activity it’s found to date. “Their entire job is to look forward and figure out what’s coming around the corner.”
Facebook’s creation of this team is the latest sign of an ongoing mindset shift among Silicon Valley’s giants, who are realizing they can’t simply assume the best of their users and must prepare for the worst behavior imaginable. Google, for instance, has set up an “Intelligence Desk” at YouTube meant to detect controversial content before it mushrooms into crisis. And Twitter recently started limiting the visibility of tweets from people whose behavior indicates they might be trying to game its system.
“Speaking from my own experiences and my colleagues that I talk with in this field, it definitely seems like there’s a broader responsibility across the board that we all feel,” Talgo said.
The team originated last year, when company executives admitted internally what was painfully clear from the outside: The company's problem wasn't simply that it had missed a few bad actors determined to exploit its system, but that it had never anticipated such egregious abuses in the first place.
Facebook suffered from a “failure of imagination,” Rob Goldman, its advertising head, said in one meeting, according to Talgo. And Sheryl Sandberg, its chief operating officer, saw a gap in the company’s largely optimistic mentality, and told its leaders to hire people with different life perspectives than Facebook’s employees, many of whom have technical and analytical backgrounds. Facebook needed people who had seen some shit.
Of all the big tech companies, Facebook may need this shift in mentality the most. It has, at times, lived in a reality hole, seemingly oblivious to the bad elements of humanity exploiting its service right under its nose. It infamously missed a Kremlin-linked troll campaign that used its service throughout the 2016 US election to turn Americans against one another. It missed multiple discriminatory uses of its ad platform, from targeting that let landlords exclude people by race to options that let employers exclude people by age. It missed the manipulation of user-generated advertising categories to create bigoted ad targeting options, including “how to burn Jews.” And it missed Cambridge Analytica’s use of illicitly obtained Facebook data in its work during the 2016 election.
It’s a painful list for Facebook, and the Investigative Operations Team is a direct response to misused ad targeting and election interference, the company told BuzzFeed News.
While Facebook is still vague about the details of this team’s work, its creation indicates the company has learned a thing or two from the rolling crisis it’s endured over the past two years. “It was clear late last year that we needed to do more and do something different,” Talgo said. “And honestly it wasn’t clear before that.”
If you want to keep up with the latest on Facebook, Amazon, Apple, and Google, subscribe to Tech Giant Update, a BuzzFeed News newsletter by the author of this piece, Alex Kantrowitz.