Facebook co-founder and CEO Mark Zuckerberg testifies before the House Financial Services Committee in the Rayburn House Office Building on Capitol Hill, October 23, 2019, in Washington, DC. Chip Somodevilla/Getty Images

  • Engagement on Facebook posts from misleading websites spiked by 242 percent from the third quarter of 2016 to the third quarter of 2020, according to a new report from German Marshall Fund Digital.
  • Only 10 outlets, which researchers labeled as “False Content Producers” or “Manipulators,” were responsible for 62% of interactions. 
  • Facebook in the past has been slammed by civil rights leaders for inadequately handling the spread of misinformation on its platform.
  • Facebook’s attempts to moderate misinformation on the platform come into focus ahead of the US presidential election. 

Engagement with posts from misleading websites on Facebook has more than tripled since the 2016 US presidential election.

The total number of user interactions with articles from “deceptive outlets” increased by 242% between the third quarter of 2016 and the third quarter of 2020, according to a study published Monday by German Marshall Fund Digital, the digital wing of the Washington, DC-based public policy think tank.

Only 10 outlets, out of thousands, received 62% of those interactions, GMF Digital found. The researchers sorted outlets into two categories: “False Content Producers,” such as The Federalist, which publish information that is false, and “Manipulators,” such as Breitbart, which present claims that aren’t backed by evidence.

The study concluded that since the third quarter of 2016, the number of articles from False Content Producers jumped by 102 percent and the number of articles from Manipulators increased by 293 percent. 

“Disinformation is infecting our democratic discourse at rates that threaten the long-term health of our democracy,” Karen Kornbluh, director of GMF Digital, said in a press release. “A handful of sites masquerading as news outlets are spreading even more outright false and manipulative information than in the period around the 2016 election.”

Earlier this year, The Wall Street Journal reported that a team of Facebook employees told senior executives that the algorithms on its website were more divisive than unifying. Civil rights leaders have slammed Facebook for inadequately handling the spread of misinformation on its platform. Major brands have boycotted the platform, and celebrities, including Kim Kardashian, led a daylong protest against Facebook last month called Stop Hate for Profit.

A spokesperson from Facebook told Business Insider that engagement doesn’t take into account what the majority of Facebook users actually see on the site, and that it doesn’t reflect the progress Facebook has made in limiting misinformation since 2016.

“Over the past four years we’ve built the largest fact-checking network of any platform, made investments in highlighting original, informative reporting, and changed our products to ensure fewer people see false information and are made aware of it when they do,” the spokesperson said. 

Third-party fact-checkers are responsible for verifying much of the content publishers post on Facebook, and are part of “a three-part approach” that Facebook takes in “addressing problematic content across” its apps, including Instagram and WhatsApp. Some groups, like climate activists, say the program doesn’t do enough. Many articles that falsely claim global warming doesn’t exist escape the company’s fact-checking because they’re labeled as opinion pieces, which fall outside the fact-checkers’ responsibilities. And a recent study found that 84% of medical misinformation is never tagged on Facebook.

Facebook’s attempts to moderate misinformation on the platform, including posts from President Trump, come into focus ahead of the presidential election, drawing parallels to how the company handled user data and moderation during the 2016 election cycle.

In 2017, the company revealed in sworn testimony to Congress that Russian interference campaigns reached nearly 130 million Americans in the weeks before the 2016 election. The company recently took down two Russian networks with ties to groups that interfered with the 2016 election, the Washington Post reported. 

With the election just 22 days away, President Trump has a lot of ground to make up. He trails Democratic nominee Joe Biden by an average of 10.5 percentage points nationally, and he is falling behind Biden in several swing states he won in 2016, including Michigan, Wisconsin, and Pennsylvania.

Early voting began in four states on September 18, and voters in eight more states will start visiting the polls this week.
