Misinformation on Facebook is three times more popular than it was during the 2016 election, according to new research
Engagement with misleading websites on Facebook has tripled since the 2016 US presidential election. Total user interactions with articles from "deceptive outlets" increased by 242% between the third quarter of 2016 and the third quarter of 2020, according to a study published Monday by GMF Digital, the digital wing of the German Marshall Fund, a Washington, DC-based public policy think tank. Just 10 outlets, out of thousands, accounted for 62% of those interactions, GMF Digital found.

The researchers categorized outlets either as "False Content Producers," such as The Federalist, which publish information that is false, or as "Manipulators," such as Breitbart, which present claims that aren't backed by evidence. Since the third quarter of 2016, the study concluded, the number of articles from False Content Producers jumped by 102% and the number of articles from Manipulators increased by 293%.

"Disinformation is infecting our democratic discourse at rates that threaten the long-term health of our democracy," Karen Kornbluh, director of GMF Digital, said in a press release. "A handful of sites masquerading as news outlets are spreading even more outright false and manipulative information than in the period around the 2016 election."

Earlier this year, The Wall Street Journal reported that a team of Facebook employees had told senior executives that the algorithms on its platform were more divisive than unifying. Civil rights leaders have slammed Facebook for inadequately handling the spread of misinformation on its platform. Major brands have boycotted the platform, and celebrities, including Kim Kardashian, joined a daylong protest against Facebook last month called Stop Hate for Profit.
A Facebook spokesperson told Business Insider that engagement doesn't take into account what the majority of Facebook users actually see on the site, and that it doesn't reflect the progress Facebook has made in limiting misinformation since 2016. "Over the past four years we've built the largest fact-checking network of any platform, made investments in highlighting original, informative reporting, and changed our products to ensure fewer people see false information and are made aware of it when they do," the spokesperson said.

Third-party fact-checkers are responsible for verifying much of the content publishers post on Facebook, and they are part of "a three-part approach" Facebook takes in "addressing problematic content across" its apps, including Instagram and WhatsApp. Some groups, such as climate activists, say the program doesn't go far enough: many articles that falsely claim global warming doesn't exist escape the company's fact-checking policies because they are labeled as opinion pieces, which fall outside the fact-checkers' purview. And a recent study found that 84% of medical misinformation on the platform is never tagged with a warning.

Facebook's attempts to moderate misinformation, including posts from President Trump, have come into focus ahead of the presidential election, drawing parallels to how the company handled user data and moderation during the 2016 election cycle. In 2017, the company revealed in sworn testimony to Congress that Russian interference campaigns reached nearly 130 million Americans in the weeks before the 2016 election. The company recently took down two Russian networks with ties to groups that interfered in that election, The Washington Post reported.

With the election just 22 days away, President Trump has a lot of ground to make up: he trails Democratic nominee Joe Biden by an average of 10.5 percentage points nationally.
Trump is also falling behind Biden in several swing states he won in 2016, including Michigan, Wisconsin, and Pennsylvania. Early voting began in four states on September 18, and voters in eight more states will start visiting the polls this week.