Since 2016, we have used a strategy called “remove, reduce, and inform” to manage problematic content on Facebook. This involves removing content that violates our Community Standards, reducing the spread of problematic content that does not violate our standards, and informing people with additional information so they can choose what to click, read or share.
Our “reduce” work on Facebook is largely centered on News Feed and how we rank posts within it. When developing new “reduce” initiatives, we think about our goals in terms of the needs of three parties: people, publishers, and our community.
Responding to People’s Direct Feedback
We’re always listening to people’s feedback about what they do and don’t like seeing on Facebook and making changes to News Feed in response.
- Reducing content that is broadly disliked. People have told us they don’t like content that is spammy, so in recent years, we’ve reduced low-quality content such as clickbait, engagement bait, and web pages with little substance and disruptive ads. We will continue this work in many ways. In some cases, we’ll take direct action against new types of content that people alert us to, such as web pages that have broken links, load slowly, or are otherwise difficult to use. In other cases, we’ll use signals, like whether a domain’s Facebook traffic is highly disproportionate to its place in the web graph, to better target known types of low-quality content (see the sketch after this list).
- Creating more personalized experiences. We’re taking more steps to make sure people see what’s most important to them in News Feed. One of the ways we’ll understand this is by surveying some people on whether posts they see are worth their time. The answers will help us understand who wants to see more, or less, of different types of content. We’re also making it easier for people to customize their News Feed by providing even more ways for them to set preferences.
- Showing comments that add the most value. People have told us they want to have better conversations on Facebook. Last year, we worked to reduce bullying and offensive comments on public Pages, so that people felt more comfortable engaging with those they don’t know. This year, we’ll continue with those efforts while exploring ways to understand and promote the best comments.
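To make the web-graph signal above more concrete, here is a minimal sketch of how such a disproportionality check might be computed. The inputs, smoothing, and threshold are illustrative assumptions, not our production ranking code:

```python
# Illustrative sketch of a "disproportionate traffic" signal. The feature
# names, smoothing, and threshold are assumptions for exposition, not
# Facebook's actual ranking inputs.

def traffic_disproportionality(fb_click_share, web_graph_share, eps=1e-9):
    """Ratio of a domain's share of Facebook clicks to its share of
    inbound links in a web-graph crawl; values far above 1.0 suggest the
    domain's reach depends almost entirely on Facebook distribution."""
    return (fb_click_share + eps) / (web_graph_share + eps)

def flag_for_quality_review(fb_click_share, web_graph_share, threshold=50.0):
    """Flag a domain when its Facebook traffic is highly disproportionate
    to its standing on the open web."""
    return traffic_disproportionality(fb_click_share, web_graph_share) > threshold

# Example: a domain getting 0.1% of Facebook clicks but a vanishing share
# of web-graph inlinks would be flagged (ratio in the thousands).
print(flag_for_quality_review(0.001, 0.0000001))  # True
```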
Incentivizing Publishers to Invest in High-Quality Content
We recognize that getting broad distribution in News Feed influences what publishers produce and, ultimately, what our community sees. We want people to have interesting new material to engage with over the long term, so we’re working to set incentives that encourage the creation of this kind of content.
- Cracking down on unoriginality. Some publishers gain distribution by republishing content from other sources without adding material value. We began reducing the distribution of unoriginal content last October, and we are continuing to take action against it in a number of ways, including demonetizing it on Instant Articles and reducing the distribution of videos compiled from third-party creators’ content.
- Doubling down on penalties for repeat offenders. When publishers repeatedly post content that triggers our quality demotions, we expect them to change their behavior. If they don’t, we apply stricter demotions for a set period of time to prompt faster, more comprehensive corrective action. We will continue to introduce penalties for “repeat offenders” along with new quality demotions, to encourage publishers to comply with our standards (a sketch of this escalation follows this list).
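As a rough illustration of how escalating penalties might work, the sketch below tracks recent quality demotions per publisher and applies a stricter distribution factor once a strike limit is crossed. The window length, strike limit, and multipliers are assumptions for exposition, not our actual enforcement logic:

```python
# Illustrative sketch of escalating demotions for repeat offenders. The
# window length, strike limit, and multipliers are assumptions for
# exposition, not the production enforcement system.

import time

DEMOTION_WINDOW = 90 * 24 * 3600  # assumed 90-day look-back, in seconds
STRIKE_LIMIT = 3                  # assumed strikes before stricter demotion
BASE_MULTIPLIER = 0.5             # assumed standard demotion factor
STRICT_MULTIPLIER = 0.1           # assumed stricter factor for repeat offenders

strikes = {}  # publisher id -> timestamps of recent quality demotions

def record_quality_demotion(publisher, now=None):
    """Log a demotion and drop strikes that fall outside the window."""
    now = time.time() if now is None else now
    recent = [t for t in strikes.get(publisher, []) if now - t < DEMOTION_WINDOW]
    recent.append(now)
    strikes[publisher] = recent

def distribution_multiplier(publisher):
    """Factor applied to a publisher's distribution: clean publishers are
    untouched, demoted publishers are reduced, repeat offenders more so."""
    count = len(strikes.get(publisher, []))
    if count >= STRIKE_LIMIT:
        return STRICT_MULTIPLIER
    return BASE_MULTIPLIER if count > 0 else 1.0
```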
Fostering a Safe Community
There is some content that individual people might want to see, that doesn’t necessarily create bad incentives for publishers, but that we have decided is problematic for our community. We’ll make this content difficult to encounter for people who aren’t actively trying to see it.
- Reducing the spread of misinformation. In recent years, we’ve focused on combating misinformation through a combination of technology and human review. This year, we will explore new ways to leverage experts, as well as our Facebook community, in our fight against false news so we can expand the scale and speed of our work. We’ll particularly focus on combating misinformation in high-risk areas like health and finance (a simplified sketch of how automated detection and human review can fit together follows this list).
- Understanding borderline content. Some types of content, although they do not violate the letter of our Community Standards, are sensationalist or provocative and bring down the overall quality of discourse on our platform. As we said back in November, we’re working to understand this type of content.
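To illustrate how technology and human review can combine, here is a hypothetical routing sketch: a classifier flags suspicious posts for fact-checkers, and posts rated false are demoted. Every name and threshold here is an assumption, not a description of our real systems:

```python
# Hypothetical sketch of combining automated detection with human review.
# The names, verdict format, and threshold are illustrative assumptions;
# they do not describe Facebook's actual misinformation systems.

REVIEW_THRESHOLD = 0.8  # assumed classifier score for routing to review

def model_misinfo_score(text):
    """Stand-in for a trained classifier returning a 0.0-1.0 score for
    how likely a post is to contain misinformation."""
    return 0.0  # placeholder; a real system would run a model here

def route_post(post_id, text, fact_check_verdicts):
    """Demote posts fact-checkers rated false; queue suspicious posts for
    human review; leave everything else alone."""
    if fact_check_verdicts.get(post_id) == "false":
        return "demote"                 # reduce distribution in News Feed
    if model_misinfo_score(text) > REVIEW_THRESHOLD:
        return "send_to_fact_checkers"  # human review scales the system
    return "no_action"
```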
As we deliver on these and other plans, we’ll continue to rely on our core values to keep the central experience of News Feed intact as it evolves.
To learn more about our latest work across the remove, reduce, and inform strategy, see this recap of our April 2019 event.