Facebook Removes 790 QAnon Groups to Fight Conspiracy Theory

By Sheera Frenkel

The social network also said it was restricting another 1,950 groups and 440 pages on Facebook, as well as more than 10,000 accounts on Instagram, related to the conspiracy movement.

Facebook’s action on Wednesday was its most sweeping against the QAnon conspiracy theory, which supporters promote with Q signs. Credit: John Rudoff/Anadolu Agency, via Getty Images

OAKLAND, Calif. — Facebook said on Wednesday that it had removed 790 QAnon groups from its site and was restricting another 1,950 groups, 440 pages and more than 10,000 Instagram accounts related to the right-wing conspiracy theory, in the social network’s most sweeping action against the fast-growing movement.

Facebook’s takedown followed record growth of QAnon groups on the site, much of it since the coronavirus pandemic began in March. Activity on some of the largest QAnon groups on the social network, including likes, comments and shares of posts, rose 200 to 300 percent in the last six months, according to data gathered by The New York Times.

“We have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior,” Facebook said in a statement, adding that it would also block QAnon hashtags such as #digitalarmy and #thestorm.

The actions, less than three months before November’s presidential election, underline how QAnon is increasingly causing alarm. Founded in 2017, QAnon was once a fringe phenomenon whose believers alleged, falsely, that the world was run by a cabal of Satan-worshiping pedophiles who were plotting against President Trump while operating a global child sex-trafficking ring.

But in recent months, the movement has become mainstream. Believers of Q, the shadowy central figure of QAnon, have shown up at political rallies. Some have committed violence in the name of the movement. And members of the group are rising in politics. Marjorie Taylor Greene, an avowed QAnon supporter from Georgia, won a Republican primary this month and may be elected to the House in November.

At a White House news conference on Wednesday, Mr. Trump was asked what he thought about QAnon’s theory that he is saving the world from a satanic cult of pedophiles and cannibals. Mr. Trump, who has shared information from QAnon accounts on Twitter and Facebook, said, “I haven’t heard that, but is it supposed to be a bad thing or a good thing?”

In response to the growing activity, tech companies have ramped up their measures to limit QAnon on social media, where the movement is deeply ingrained.

Last month, Twitter announced that it was removing thousands of QAnon accounts and said it was blocking trends and key phrases related to QAnon from appearing in its search and Trending Topics section. Reddit has also banned some of its forums for QAnon content, while the video app TikTok has banned several QAnon-related hashtags.

YouTube also regularly removes QAnon content; it has taken down “tens of thousands of Q-related videos, and terminated hundreds of Q-related channels for violating our community guidelines,” a YouTube spokesman said.

“There needs to be a real change in how these platforms think about conspiracy theories and the real-world harm they cause,” said Cindy Otis, vice president of analysis for Alethea Group, an organization that investigates disinformation. “Since the start of the pandemic, we have seen QAnon move much faster than the social media platforms to gain a following and push their content out.”

Facebook became increasingly concerned by QAnon’s presence in May, said two employees with knowledge of the efforts, who were not authorized to speak publicly.

That was when a video known as “Plandemic,” featuring a discredited scientist spreading a baseless conspiracy theory about the coronavirus, gathered steam on the social network, fueled by QAnon groups. New members also started flocking to the QAnon groups on Facebook.

QAnon activity also spilled out into the real world. In New York, a woman who had cited QAnon theories as a reason she wanted to “take out” the Democratic presidential nominee, Joseph R. Biden Jr., was arrested on May 1 with dozens of knives in her car. The group has been linked to more than a dozen violent incidents over the last year, including a train hijacking; last month, a QAnon supporter rammed a car into a government residence in Canada.

The spiking activity on its network, combined with real-world incidents, pushed Facebook to discuss policy changes to limit QAnon’s spread, the two employees said. But the conversations stalled because taking down QAnon-related groups, pages and accounts could feed into the movement’s conspiracy theory that social media companies are trying to silence them, the people said.

Marc-André Argentino, a Ph.D. candidate who is studying QAnon, said part of the problem was that QAnon had absorbed members of other conspiracy groups into its pantheon. Even if Facebook removed the groups, their members would likely find a foothold in other networks on the site.

“QAnon is a super conspiracy,” Mr. Argentino said. “Various other conspiracies have various places in the hierarchy under the QAnon narrative, so it draws in people in different ways and gives them one central home. There is no easy answer about what to do about QAnon.”

How effective the social media companies’ takedowns will be at limiting QAnon is unclear. YouTube sometimes surfaces QAnon videos as additional viewing after one QAnon video is watched. Members still post about the conspiracy theory on some parts of Reddit. And on TikTok, accounts promoting the false conspiracy have amassed hundreds of thousands of followers.

A TikTok spokeswoman said that QAnon content “frequently contains disinformation and hate speech which violates our community guidelines,” and that searches for dozens of hashtags related to QAnon were blocked. “We continually update our list as people work to break through our safeguards with misspellings and new phrases,” she added.

In a statement, Reddit said it had banned QAnon communities since 2018 but allowed some discussion of the group’s theories in broader forums “mainly because these conversations discount the theories as unsubstantiated and bogus.” Reddit said it would continue monitoring QAnon content.

On Wednesday, Facebook said it had taken down the QAnon groups as part of a new policy to clamp down on movements that discuss “potential violence.” Under that policy, Facebook said, it will also remove 980 groups, including some tied to the far-left antifa movement, as well as militia organizations and other protest groups.

The new policy also bars the groups from buying ads on the platform. QAnon supporters have sold merchandise on Facebook, including hats, T-shirts and banners, partly through Facebook ads.

Ms. Otis of Alethea Group said the actions by Facebook and others against QAnon had not come soon enough.

“It has taken far too many weeks, too many months, for the platforms to get their arms around what is happening,” she said.