Op-ed: Alex Jones is a crackpot—but banning him from Facebook might be a bad idea [Updated]

By Timothy B. Lee

Update: This article originally stated that Facebook had not banned most Alex Jones content from the site, which was true when we scheduled the post on Friday afternoon. But on Monday morning, shortly before this article published, Facebook removed four key Alex Jones pages, effectively banning him from the platform. I've updated the story to reflect this change.

Facebook and YouTube both have strict rules against posting content that is hateful or pornographic or that violates someone's privacy. But what if someone posts content that is just egregiously false? Right now, neither Facebook nor YouTube has rules banning this kind of content. And critics say that's a problem.

In recent weeks, the issue has come to a head over online provocateur, pundit, and conspiracy theorist Alex Jones. Jones gleefully flouts the rules of journalistic ethics, regularly making outrageous claims without a shred of evidence.

“When I think about all the children Hillary Clinton has personally murdered and chopped up and raped, I have zero fear standing up against her,” Jones said in one YouTube video.

Last month, Facebook and YouTube both removed a few of Jones' videos, including the one quoted above, citing rules against harassment, bullying, child endangerment, or hate speech. At the end of July, Facebook suspended Jones' account for 30 days, while YouTube recently took down four videos.

Then on Monday morning, Facebook banned four of the main pages Alex Jones uses to post content, effectively banning Jones from the platform. But notably, Facebook banned the pages for violating rules against violence and hate speech—not fake news. Facebook continues to insist that fake news, by itself, is not a basis for removing content from the platform.

"We don't remove false news from Facebook, but instead significantly reduce its distribution by showing it lower in the News Feed," Facebook says in its content guidelines.

For some perspective, I talked to two experts who have thought a lot about the problem of fake news online. And neither one of them was able to offer a compelling alternative to Facebook's current approach.

"All of these social media platforms will make all these speeches and press releases that they're really committed to ending disinformation on their platform, but they're not really," said Alice Marwick, a communications professor at the University of North Carolina, Chapel Hill. "If they are committed to disinformation, they do have to kick Alex Jones."

Yet she wasn't entirely comfortable with that approach. "Do they want to be the arbitrators of truth?" she asked. "We probably don't want that."

Booting someone like Jones from Facebook or YouTube altogether could easily turn him into a martyr among his paranoid fans. And policing content for factual accuracy could suck platforms into endless controversies over hot-button political issues. So as unsatisfying as Facebook's approach is, it might be the best of some bad options.

"I think that using algorithms to personalize and filter, as much as it's gotten a bad rap, is probably healthier than saying this thing is available or not available," argued James Grimmelmann, a law professor at Cornell University.

Alex Jones, troll and conspiracy theorist

Facebook and YouTube have helped Jones amass a huge online audience for his website Infowars. Jones has 2.4 million YouTube subscribers and 1.6 million followers on Facebook. He has built that audience by constantly peddling nonsense.

In one rant, Jones claimed that Hillary Clinton and Barack Obama are literal demons. "Obama and Hillary both smell like sulfur," as Jones put it in the final weeks of the 2016 election.

Also in 2016, Jones repeatedly touted "Pizzagate," a conspiracy theory that involved prominent Democrats running a child sex ring from the basement of a Washington, DC, pizzeria (as it turned out, the restaurant doesn't even have a basement). Pizzagate rumors online prompted one Alex Jones fan to drive to Washington, DC, and stage an armed confrontation.

Jones has promoted outrageous conspiracy theories unrelated to Democratic politicians, too. He has claimed that a number of recent mass shootings, including the 2012 shooting at Sandy Hook Elementary School, were false-flag operations organized by the government. That claim got Jones sued by some Sandy Hook parents.

Jones claims that the government "can create and steer groups of tornadoes" to use as weapons. He predicted that Lady Gaga would perform a satanic ritual at the 2017 Super Bowl halftime show.

So when Facebook executives met with a group of journalists last month to tout the platform's battle against fake news, Jones' name came up, according to CNN:

When asked by this reporter how the company could claim it was serious about tackling the problem of misinformation online while simultaneously allowing InfoWars to maintain a page with nearly one million followers on its website, [Facebook's] Hegeman said that the company does not "take down false news."

Other media outlets quickly piled on, pressuring Facebook to cut ties with Jones. YouTube faced pressure over hosting Jones as well. Soon afterward, both platforms blocked some of the most egregious videos from Jones' site—videos the platforms said violated pre-existing rules not directly related to fake news.

All the while, Facebook stuck to its guns when it came to the question of becoming the fake news police.

"We see pages on both the left and the right pumping out what they consider opinion or analysis—but others call fake news," the company tweeted. "We believe banning these pages would be contrary to the basic principles of free speech."



Facebook has tried to thread the needle by insisting that Jones' content is being removed for violating policies against hate speech—not policies against fake news. That's because Facebook really doesn't want to get into the business of deciding which news is fake.

There's a big risk that banning Jones—and others like him—is going to turn him into a martyr. He will undoubtedly find a home on another platform, and his total audience might grow as news organizations cover the controversy.

A major theme of Jones' show is that powerful institutions are conspiring to shut down free speech. Jones will undoubtedly seize on Facebook's crackdown as proof that his fears were justified.

If Facebook had banned Jones for posting fake news, rather than hate speech, it would have faced more pressure to censor other fringe content. Not only would that be a labor-intensive process—potentially requiring Facebook employees to personally fact-check millions of pieces of content—it would also come with a raft of ideologically fraught decisions that could alienate a large portion of Facebook's user base.

The line between news, commentary, and satire isn't always clear, either, as Grimmelmann points out in a recent essay. Some of Jones' segments—like his claims that Hillary Clinton and Barack Obama are literal demons and that Lady Gaga would perform satanic rituals at the Super Bowl—could easily be read as entertainment rather than serious news reporting. In fact, in a recent custody battle, Jones' own lawyers argued that Jones was merely "playing a character" in his programs.

Jones also loves to "raise questions" about the official explanation for tragedies like the Sandy Hook school shooting, insinuating that there was some kind of cover-up without putting forward a specific conspiracy theory. These kinds of tactics make it difficult to draw a line between someone merely offering commentary about a conspiracy theory and someone explicitly advocating it.

And then, of course, there's the question of politics.

"Most of the disinformation you're concerned about, which is political misinformation, is very partisan," Marwick said. "If you're shutting down a large number of channels that further conspiracy theories that appeal to people on the right, you're going to be accused of bias against conservatives."

The news media is ideologically asymmetrical. Mainstream outlets with high journalistic standards tend to lean a bit to the left. In reaction, conservatives have constructed explicitly partisan alternatives that don't always uphold those same standards. Hence there's a real danger that a war against fake news could be seen as a war on conservative media—alienating many conservative Facebook users in the process.

Platforms can fight fake news without banning it

It's helpful here to think of Facebook as two separate products: a hosting product and a recommendation product (the News Feed). Facebook's basic approach is to apply different strategies to these different products.

For the hosting product, Facebook takes an inclusive approach, removing only content that violates a set of clearly defined policies on issues like harassment and privacy.

With the News Feed, by contrast, Facebook takes a more hands-on approach, downranking content it regards as low quality.

This makes sense because the News Feed is fundamentally an editorial product. Facebook has an algorithm that decides which content people see first, using a wide variety of criteria. There's no reason why journalistic quality, as judged by Facebook, shouldn't be one of those criteria.

Under Facebook's approach, publications with a long record of producing high-quality content can get bumped up toward the top of the News Feed. Publications with a history of producing fake news can get bumped to the back of the line, where most News Feed users will never see them.
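To make this concrete, here is a minimal sketch, in Python, of what quality-weighted ranking might look like. Everything in it is an illustrative assumption: the PUBLISHER_QUALITY table, the engagement_score field, and the simple multiplicative weighting are hypothetical stand-ins, not Facebook's actual News Feed algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    publisher: str
    engagement_score: float  # hypothetical predicted engagement (clicks, likes, shares)

# Hypothetical per-publisher quality scores on a 0.0-1.0 scale.
# In practice these might come from fact-checker ratings or user surveys.
PUBLISHER_QUALITY = {
    "reliable-news.example": 0.9,
    "fake-news.example": 0.1,
}

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by predicted engagement weighted by publisher quality.

    Low-quality publishers aren't removed from the feed; their posts
    simply sink toward the bottom, where few users ever scroll.
    """
    def score(post: Post) -> float:
        quality = PUBLISHER_QUALITY.get(post.publisher, 0.5)  # neutral default
        return post.engagement_score * quality

    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("fake-news.example", engagement_score=0.95),
        Post("reliable-news.example", engagement_score=0.60),
    ]
    for post in rank_feed(feed):
        print(post.publisher)
```

In this toy example, the fake-news post has higher predicted engagement, yet it still lands at the bottom of the feed: nothing is deleted, it just gets seen less.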

This will obviously still anger some conservatives, but it doesn't really make sense to describe it as censorship. Users delegate content selection to Facebook when they choose to use the News Feed—they can hardly complain when Facebook steers them toward content it sees as high quality. Users who really want to see a particular fake news publisher's posts can still go directly to that publisher's Facebook page.

At the same time, downranking fake news in the News Feed will produce almost all of the same benefits as banning it outright. Relatively few people are going to browse directly to a particular publisher's page, and the ones who do would probably follow the publisher to another site anyway. So for now, downranking fake news in the News Feed is going to be just about as effective as banning it from the platform outright.

The same basic point applies to YouTube. YouTube operates both a video hosting service and a variety of algorithms for recommending which videos people should watch next. If YouTube stopped recommending Jones' videos and put them at the bottom of users' personalized feeds, that would go a long way toward limiting his audience—without creating the backlash that would come from banning him from the platform altogether.

But to do this well, both platforms might want to invest more in building the capacity to distinguish high-quality from low-quality content. That should include hiring experienced editors to help decide how to judge content quality. At Facebook and YouTube's scale, it's probably not feasible for human beings to look at every piece of content. But they can at least review the content that gets the most attention, and they can give the engineers who write the ranking algorithms feedback on how to make those algorithms better.