Meta

Increasing Our Efforts to Fight False News

By Tessa Lyons, Product Manager

Over the last year and a half, we have been committed to fighting false news through a combination of technology and human review, including removing fake accounts, partnering with fact-checkers, and promoting news literacy. This effort will never be finished, and we have a lot more to do. Today, we’re announcing several updates as part of this work:

Expanding our fact-checking program to new countries. Since we first launched the third-party fact-checking program last spring, we’ve expanded to 14 countries and have plans to scale to more countries by the end of the year. These certified, independent fact-checkers rate the accuracy of stories on Facebook, helping us reduce the distribution of stories rated as false by an average of 80%.
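The mechanism described above can be pictured with a minimal sketch. This is purely illustrative and not Facebook's actual ranking code; the demotion factor, function name, and score scale are all assumptions chosen to match the "average of 80%" reduction cited.

```python
# Illustrative sketch only: once a certified fact-checker rates a
# story "false", its distribution score is demoted by a fixed factor.
# The 0.2 multiplier is an assumption matching an ~80% reduction.

FALSE_RATING_DEMOTION = 0.2  # assumed: retain ~20% of normal distribution


def ranked_score(base_score, rating):
    """Return the distribution score after applying a fact-check rating."""
    if rating == "false":
        return base_score * FALSE_RATING_DEMOTION
    return base_score


print(ranked_score(100.0, None))     # 100.0 (no rating: unchanged)
print(ranked_score(100.0, "false"))  # 20.0  (rated false: demoted ~80%)
```

In practice a ranking system would weigh many more signals; the point of the sketch is only that a fact-checker's rating feeds back into distribution rather than removing the story outright.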

Expanding our test to fact-check photos and videos. One challenge in fighting misinformation is that it manifests differently across content types and countries. To address this, we expanded our test to fact-check photos and videos to four countries. This includes photos and videos that are manipulated (e.g., a video edited to show something that did not really happen) or taken out of context (e.g., a photo from a past tragedy presented as part of a different, present-day conflict).

Increasing the impact of fact-checking by using new techniques. With more than a billion pieces of content posted every day, we know that fact-checkers can’t review every story one by one. So, we are looking into new ways to identify false news and take action on a bigger scale.
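One way to act at a bigger scale is to reuse a single fact-check across near-identical copies of the same debunked story. The sketch below is an assumed approach for illustration, not a description of Facebook's systems; the normalization rule and sample headline are hypothetical.

```python
# Hedged illustration: apply an existing fact-check rating to
# near-duplicate copies of a debunked headline, so one human review
# can cover many reposts. All names and data here are hypothetical.

import re


def normalize(headline):
    """Lowercase and strip punctuation so trivial edits still match."""
    return re.sub(r"[^a-z0-9 ]", "", headline.lower()).strip()


# Ratings already collected from fact-checkers, keyed by normalized text.
debunked = {normalize("Miracle cure FOUND in common fruit!!!"): "false"}


def rating_for(headline):
    """Reuse an existing rating when a new post matches a debunked story."""
    return debunked.get(normalize(headline))


print(rating_for("Miracle cure found in common fruit"))  # false
print(rating_for("An unrelated headline"))               # None
```

A production system would need fuzzier matching than exact normalized text (reworded copies, translated copies, images), which is exactly why scaling fact-checks is described above as an open area of work.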

Taking action against new kinds of repeat offenders. Historically, we have used ratings from fact-checkers to identify Pages and domains that repeatedly share false news. We then take action by reducing their distribution and removing their ability to monetize. To help curb foreign interference in public discourse, we are beginning to use machine learning to help identify and demote foreign Pages that are likely to spread financially motivated hoaxes to people in other countries.
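The repeat-offender flow above can be sketched in a few lines. This is a toy illustration under stated assumptions: the threshold of three ratings and the action names are invented for the example and are not Facebook's published policy.

```python
# Illustrative sketch only: count fact-checker "false" ratings per
# Page or domain, and apply enforcement actions once a repeat-offender
# threshold is crossed. Threshold and action names are assumptions.

from collections import Counter

REPEAT_OFFENDER_THRESHOLD = 3  # assumed number of "false" ratings

false_ratings = Counter()


def record_false_rating(page_id):
    """Record one 'false' rating; return enforcement actions, if any."""
    false_ratings[page_id] += 1
    if false_ratings[page_id] >= REPEAT_OFFENDER_THRESHOLD:
        return ["reduce_distribution", "remove_monetization"]
    return []


record_false_rating("page_a")
record_false_rating("page_a")
print(record_false_rating("page_a"))
# ['reduce_distribution', 'remove_monetization']
```

The machine-learning extension mentioned above would add predicted signals (e.g., a Page's likelihood of spreading hoaxes abroad) on top of this simple rating count, rather than waiting for repeated human reviews.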

Improving measurement and transparency by partnering with academics. In April, we announced a new initiative to help provide independent research about the role of social media in elections, as well as democracy more generally. The elections research commission is in the process of hiring staff and establishing the legal and organizational procedures necessary to become fully independent. In the coming weeks, the commission will release a website and then its first request for proposals to measure the volume and effects of misinformation on Facebook.

We’re currently working with the commission to develop privacy-protected data sets, which will include a sample of links that people engage with on Facebook. The academics selected by the commission will be able to study these links to better understand the kinds of content being shared on Facebook. Over time, this externally validated research will help keep us accountable and track our progress.