On Wednesday, I learned a new way to get a news article erased from much of the internet.
If the article shows your home or apartment and says what city you’re in, and you don’t like it, you can complain to Facebook. Facebook will then ensure that nobody can share the article on its giant platform and, as a bonus, block you from sending it to anyone in Facebook Messenger.
I learned this rule from a cheerfully intense senior Facebook lawyer. The lawyer, who was supplied by Facebook’s public relations department on the condition that she speak only anonymously to discuss a specific case, was trying to explain why the service had expunged a mean-spirited New York Post article about a Black Lives Matter activist’s real estate purchases.
“The policy is superclear!” the lawyer told me over a Zoom call from her bright home office. But, she added, “I totally get why it sounds kind of crazy in this case.”
The policy sounds crazy because it could apply to dozens, if not hundreds, of news articles every day — indeed, to a staple of reporting for generations that has included Michael Bloomberg’s expansion of his townhouse in 2009 and the comings and goings of the Hamptons elites. Alex Rodriguez doesn’t like a story that includes a photo of him and his former fiancée, Jennifer Lopez, smiling in front of his house? Delete it. Donald Trump is annoyed about a story that includes a photo of him outside his suite at Mar-a-Lago? Gone. Facebook’s hands, the lawyer told me, are tied by its own policies.
Presumably, the only reason this doesn’t happen constantly is that nobody knows about the policy. But now you do!
I learned about this policy while trying to understand a rarely discussed front in Facebook’s rolling standoff with journalism: In cases of difficult news judgment, who decides what counts as news? Will Facebook defer to publishers’ decisions on, for instance, which celebrity’s home purchase is worth covering? Or will Facebook delete a publisher’s link just as quickly as it deletes an individual’s post that it has decided violated its rules?
The answer, the lawyer told me, is simple: Facebook alone decides. In the parts of its policy devoted to privacy and safety, Facebook pays no special deference to journalists and believes its “policy” team is better suited to make these decisions. Facebook alone will balance competing values like newsworthiness against privacy, or the old print belief in transparency against the digital aversion to “doxxing” — that is, publishing people’s identifying information against their will. And in the standoff with The Post this month, all you can do is choose your fighter: Mark Zuckerberg or Rupert Murdoch.
Facebook’s lawyer was earnestly explaining the policy to me to rebut The Post’s accusation that it was being “silenced” because of its Trumpy right-wing politics. This keeps happening to The Post. Facebook also blocked an article speculating (as many others have) that the coronavirus could have leaked from a lab, and it ensured that The Post’s reporting on emails from Hunter Biden couldn’t be widely shared across the social network. Blocking the Covid-19 op-ed, a Facebook spokeswoman, Sally Aldous, said, was a “bug.” The company’s action on the Hunter Biden story was the result of yet another policy, under which professional “fact checkers” — mostly junior journalists — have a week to rule on whether something is true or false while Facebook prevents the story from being shared widely. (The fact checkers were not, in this case, able to get to the bottom of an epically puzzling and messy story in a week.)
Facebook’s usual critics have been strikingly silent as the company has extended its purview over speech into day-to-day editorial calls. “We don’t have anyone who is closely plugged into that situation right now so we don’t have anything to say at this point in time,” a spokesman for the American Civil Liberties Union, Aaron Madrid Aksoz, said in an email. The only criticism came from the News Media Alliance, the old newspaper lobby, whose chief executive, David Chavern, called blocking The Post’s link “completely arbitrary” and noted that “Facebook and Google stand between publishers and their audiences and determine how and whether news content is seen.”
The Post’s editorial board wrote that Facebook and other social media companies “claim to be ‘neutral’ and that they aren’t making editorial decisions in a cynical bid to stave off regulation or legal accountability that threatens their profits. But they do act as publishers — just very bad ones.”
Of course, it takes one to know one. The Post, always a mix of strong local news, great gossip and spun-up conservative politics, is making a bid for the title of worst newspaper in America right now. It has run a string of scary stories about Covid vaccines, the highlight of which was a headline linking vaccines to herpes, part of a broader attempt to extend its digital reach. Great stuff, if you’re mining for traffic in anti-vax Telegram groups. The piece on the Black Lives Matter activist that Facebook blocked was pretty weak, too. It insinuated, without evidence, that her wealth was ill-gotten, and mostly just sneered at how “the self-described Marxist last month purchased a $1.4 million home.”
But then, you’ve probably hate-read a story about a person you disliked buying an expensive house. When Lachlan Murdoch, the co-chairman of The Post’s parent company, bought the most expensive house in Los Angeles, for instance, it received wide and occasionally sneering coverage. Maybe Mr. Murdoch didn’t know he could get the stories deleted by Facebook.
Facebook doesn’t keep a central register of news articles it expunges on these grounds, though the service did block a Daily Mail article about the Black Lives Matter activist’s real estate as well. Nor does it keep track of how many news articles it has blocked, though it regularly deletes offending posts by individuals, including photos of the home of the Fox News star Tucker Carlson, a Facebook employee said.
What Facebook’s clash with The Post really revealed — and what surprised me — is that the platform does not defer, at all, to news organizations on questions of news judgment. A decision by The Post, or The New York Times, that someone’s personal wealth is newsworthy carries no weight in the company’s opaque enforcement mechanisms. Nor, Facebook’s lawyer said, does a more nebulous and reasonable human judgment that the country has felt on edge for the last year and that a Black activist’s concern for her own safety was justified. (The activist didn’t respond to my inquiry but, in an Instagram post, called the reporting on her personal finances “doxxing” and a “tactic of terror.”)
The point of Facebook’s bureaucracy is to replace human judgment with a kind of strict corporate law. “The policy in this case prioritizes safety and privacy, and this enforcement shows how difficult these trade-offs can be,” the company’s vice president for communications, Tucker Bounds, said. “To help us understand if our policies are in the right place, we are referring the policy to the Oversight Board.”
The board is a promising kind of supercourt that has yet to set much meaningful policy. So this rule could eventually change. (Get your stories deleted while you can!)
For now, though, the deletion seems to be an instance of how the company finds itself constantly debating the literal interpretation of its own, made-up rules rather than exercising any form of actual judgment. That came up again this spring in an internal report finding that Facebook hadn’t cracked down on “Stop the Steal” splinter groups because they were all hovering below its “violation threshold.”
I should note that with the article about the activist’s house, Facebook waded into one of the trickiest areas of online speech, and one of the hardest calls for news organizations today. Journalists have always insisted on the right to print public information, which often includes things like home purchases and people’s real names. We have that legal right in the United States. But the internet has pioneered new forms of harassment — menacingly circulating a photograph of someone’s home, or tying their private social media accounts to their public ones — that are widely viewed as “doxxing.”
These are hard calls without simple answers, and the social consensus is shifting. In 2012, for instance, I approved publishing an article that gave details about Mr. Zuckerberg’s residence in San Francisco’s Mission District, along with a candid photograph of him at a favorite deli. I’d probably leave out some of those details now, for some of the reasons I wouldn’t publish that Post article about the activist. The internet has gotten darker, and alongside substantive safety concerns, doxxing is an ugly and ubiquitous form of harassment.
There’s something depressing about an internet in which you’re left choosing between Mr. Zuckerberg and Mr. Murdoch, as the Electronic Frontier Foundation’s director of strategy, Danny O’Brien, pointed out to me. It didn’t have to be that way.
“We’ve been thrown into a situation where all you can do is pick your billionaire monopolist,” Mr. O’Brien said, lamenting “a world in which you get to pick your gatekeeper, rather than the world we were promised — and which technology offers — of not picking a gatekeeper at all.”
But in this time of media consolidation, it seems healthy to decentralize decision-making where you can. At present, Mr. Zuckerberg is making every call, or instituting a new quasi-legal code that reduces journalists to mall cops, enforcing Facebook’s rules rather than acting on news judgment. Better that professional editors, with diverse and conflicting views, make their differing calls. That inevitably includes Mr. Murdoch’s editors.