The Next Phase in Fighting Misinformation

By Henry Silverman, Operations Specialist

Over the last two years, we’ve greatly expanded our efforts to fight false news: we’re getting better at enforcing against fake accounts and coordinated inauthentic behavior; we’re using both technology and people to fight the rise in photo- and video-based misinformation; we’ve deployed new measures to help people spot false news and get more context about the stories they see in News Feed; and we’ve grown our third-party fact-checking program to include 45 certified fact-checking partners who review content in 24 languages. Overall, we’re making progress: multiple research studies suggest that these efforts are working and that misinformation on Facebook has been reduced since the 2016 US presidential election.

But misinformation is a complex and evolving problem, and we have much more work to do. With more than a billion things posted to Facebook each day, we need to find additional ways to expand our capacity. The work our professional fact-checking partners do is an important piece of our strategy. But there are scale challenges involved with this work. There simply aren’t enough professional fact-checkers worldwide, and like all good journalism, fact-checking — especially when it involves investigation of more nuanced or complex claims — takes time. We want to be able to tackle more false news, more quickly.

So today, we’re kicking off a new collaborative process with outside experts that will help us home in on new solutions to fight false news at scale. The goal of this process is to arrive at externally vetted, consistent approaches that have the potential to help us catch and reduce the distribution of greater quantities of misinformation more efficiently.

We know this won’t be easy. Whatever we do next, we need to find solutions that support original reporting, promote trusted information and allow people to express themselves freely. So the question is, how do we come up with a model that serves people by giving them a chance to see the content they want, while also cutting down on misinformation, without having Facebook be the judge of what is true? How do we ensure such a system complements our existing fact-checking programs, so that professional journalists can spend their time doing original reporting on the hardest cases? How can we build a system that can’t be gamed or manipulated by coordinated groups of people? How can we avoid introducing personal biases into these systems? And what additional safeguards do we need in place to protect civil rights and minority voices?

Those are some of the issues we’ll be exploring in the months to come.

A Collaborative Process

As we’ve worked to expand our misinformation efforts over the past two years, we’ve also been doing extensive research and talking to outside experts to identify additional approaches that might bolster our defenses. One promising idea we’ve been exploring would involve relying on groups of people who use Facebook to point to journalistic sources that can corroborate or contradict the claims made in potentially false content, as discussed in this video.

Facebook’s head of research for News Feed Integrity, Apala Sabde, and University of Michigan professor Paul Resnick, a consultant to Facebook’s misinformation team and one of the many experts we’re working with on this topic, discuss our early explorations into community-driven approaches to misinformation.
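To make the idea above more concrete, here is a minimal, purely illustrative sketch in Python of how community-submitted links to journalistic sources might be aggregated into a rough per-post signal. The class and function names, the stance labels, and the scoring scheme are all hypothetical assumptions for illustration; this is not Facebook’s actual methodology, and a real system would need far stronger protections against coordinated manipulation and bias.

```python
# Illustrative sketch only: a toy aggregation of community-submitted signals about
# a potentially false post. All names, labels, and thresholds are hypothetical.
from collections import Counter
from dataclasses import dataclass


@dataclass
class SourceSubmission:
    """A link to a journalistic source submitted by a community participant."""
    post_id: str
    source_url: str
    stance: str  # "corroborates" or "contradicts" the post's claim


def summarize_submissions(submissions, min_distinct_sources=3):
    """Return a rough per-post signal based on how many distinct sources
    corroborate vs. contradict the claim. Requiring several distinct sources
    is one simple guard against a single group dominating the signal."""
    by_post = {}
    for s in submissions:
        by_post.setdefault(s.post_id, []).append(s)

    summary = {}
    for post_id, subs in by_post.items():
        # Deduplicate by source URL so repeated submissions of one link count once.
        stance_by_source = {s.source_url: s.stance for s in subs}
        counts = Counter(stance_by_source.values())
        if len(stance_by_source) < min_distinct_sources:
            summary[post_id] = "insufficient_evidence"
        elif counts["contradicts"] > counts["corroborates"]:
            # Candidate for review by professional fact-checkers, not an automatic verdict.
            summary[post_id] = "flag_for_review"
        else:
            summary[post_id] = "not_flagged"
    return summary


if __name__ == "__main__":
    demo = [
        SourceSubmission("post-1", "https://example.org/report-a", "contradicts"),
        SourceSubmission("post-1", "https://example.org/report-b", "contradicts"),
        SourceSubmission("post-1", "https://example.org/report-c", "corroborates"),
        SourceSubmission("post-2", "https://example.org/report-d", "corroborates"),
    ]
    print(summarize_submissions(demo))
```

In this toy version, a post is only flagged when several distinct sources contradict it, and a flag routes the post to professional fact-checkers rather than acting as a verdict; that division of labor mirrors the goal of keeping such a system complementary to the existing fact-checking program.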

Over the next few months, we’re going to build on the explorations we’ve started around this idea, consulting a wide range of academics, fact-checking experts, journalists, survey researchers and civil society organizations to understand the benefits and risks of ideas like this. We’re going to share the details of the methodology we’ve been thinking about with these experts, to help them get a sense of where the challenges and opportunities are and how they can help us arrive at a new approach. We’ll also share updates from these conversations throughout the process, and find ways to solicit broader feedback from people around the world who may not be in the core group of experts attending these roundtable events.

Taking the fight against misinformation to the next level is an important task for us. With elections taking place around the world month after month, minimizing false news matters every day. We plan to move quickly with this work, sharing some of the data and ideas we’ve collected so far with the experts we consult so that we can begin testing new approaches as soon as possible.