
Helping Fact-Checkers Identify False Claims Faster

By Henry Silverman, Product Manager

We’ve made significant progress reducing misinformation through our partnership with some of the world’s leading fact-checkers. Today, we’re announcing a new pilot program built to leverage the Facebook community. It will allow fact-checkers to quickly see whether a representative group of Facebook users found a claim to be corroborated or contradicted. Our goal is to help fact-checkers address false content faster.

The program will have community reviewers work as researchers, finding information that can contradict the most obvious online hoaxes or corroborate other claims. These community reviewers are not Facebook employees; they will be hired as contractors through one of our partners. They will not make final decisions themselves. Instead, their findings will be shared with the third-party fact-checkers as additional context for those fact-checkers' own official reviews.

For example, if there is a post claiming that a celebrity has died and community reviewers don’t find any other sources reporting that news — or see a report that the same celebrity is performing later that day — they can flag that the claim isn’t corroborated. Fact-checkers will then see this information as they review and rate the post. 

We started exploring this idea earlier this year. Since then, we’ve worked with experts and partners across many fields to understand how we can better support our fact-checking partners in their effort to review content faster. 

Here’s how it will work: 

  • Our machine learning model identifies potential misinformation using a variety of signals. These include comments on the post expressing disbelief and whether the post is being shared by a Page that has spread misinformation in the past.
  • If there is an indication that a post may be misinformation, it will be sent to a diverse group of community reviewers.
  • These community reviewers will be asked to identify the main claim in the post. They will then conduct research to find other sources that either support or refute that claim, much as a person using Facebook might search for other news articles when they doubt a post's main claim. Fact-checking partners will then be able to see the collective assessment of community reviewers as a signal in selecting which stories to review and rate. A simplified sketch of this flow appears below.
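To make the three steps concrete, here is a minimal sketch in Python. It is an illustration only: the signal names, weights, review threshold, and majority-vote aggregation are all assumptions made for the example, not details Facebook has published.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical signal weights; the actual model and features are not public.
SIGNAL_WEIGHTS = {
    "disbelief_comments": 0.6,    # comments on the post expressing disbelief
    "repeat_offender_page": 0.9,  # shared by a Page that spread misinformation before
}
REVIEW_THRESHOLD = 1.0  # illustrative cutoff for routing a post to reviewers


@dataclass
class Post:
    post_id: str
    signals: dict  # signal name -> strength in [0, 1]
    reviewer_verdicts: list = field(default_factory=list)


def misinformation_score(post: Post) -> float:
    """Combine signals into a single score (a stand-in for the ML model)."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * strength
               for name, strength in post.signals.items())


def needs_community_review(post: Post) -> bool:
    """Step 2: flagged posts are routed to a pool of community reviewers."""
    return misinformation_score(post) >= REVIEW_THRESHOLD


def aggregate_verdicts(post: Post) -> str:
    """Step 3: collapse reviewer findings ('corroborated' / 'contradicted' /
    'unverified') into one signal surfaced to fact-checkers. Majority vote is
    an assumption; the actual aggregation rule is not described here."""
    if not post.reviewer_verdicts:
        return "no-review"
    verdict, _count = Counter(post.reviewer_verdicts).most_common(1)[0]
    return verdict


# Example: a post with strong disbelief signals, shared by a repeat-offender Page.
post = Post("p1", {"disbelief_comments": 0.8, "repeat_offender_page": 1.0})
if needs_community_review(post):
    # In the real program, people research the main claim; here we just
    # record their hypothetical findings.
    post.reviewer_verdicts = ["contradicted", "contradicted", "unverified"]
print(misinformation_score(post), aggregate_verdicts(post))
```

In the pilot itself, of course, the reviewers are people doing research, and their collective assessment is surfaced to fact-checkers as one signal among others rather than acted on automatically.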

To ensure the pool of community reviewers represents the diversity of people on Facebook, we're partnering with YouGov, a global public opinion and data company. YouGov conducted an independent study of community reviewers and Facebook users. They determined that the requirements used to select community reviewers produced a pool that is representative of the Facebook community in the US and reflects the diverse viewpoints of Facebook users, including political ideology. They also found that the community reviewers' judgments about whether claims were corroborated were consistent with what most people using Facebook would conclude.
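As a rough illustration of what "representative" can mean quantitatively, the sketch below compares a hypothetical reviewer pool's ideology distribution with the overall population's using total variation distance. The categories and numbers are invented, and this is not YouGov's methodology; it simply shows one standard way to measure how closely a sampled pool mirrors a population.

```python
# Hypothetical distributions over self-reported political ideology.
population = {"liberal": 0.33, "moderate": 0.34, "conservative": 0.33}
reviewer_pool = {"liberal": 0.31, "moderate": 0.36, "conservative": 0.33}


def total_variation_distance(p: dict, q: dict) -> float:
    """Half the L1 distance between two distributions over the same categories:
    0 means identical, 1 means completely disjoint."""
    return 0.5 * sum(abs(p[k] - q.get(k, 0.0)) for k in p)


tvd = total_variation_distance(population, reviewer_pool)
print(f"TVD = {tvd:.3f}")  # a value near zero suggests the pool mirrors the population
```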

We’re piloting this process in the US over the coming months, and we’ll closely evaluate how it’s working through our own research, help from academics, and feedback from our third-party fact-checking partners. We believe that by combining the expertise of third-party fact-checkers with a group of community-based reviewers, we can evaluate misinformation faster and make even more progress in reducing its prevalence on Facebook.
