People, Publishers, the Community

Since 2016, we have used a strategy called “remove, reduce, and inform” to manage problematic content on Facebook. This involves removing content that violates our Community Standards, reducing the spread of problematic content that does not violate our standards, and informing people with additional information so they can choose what to click, read or share.

Our “reduce” work on Facebook is largely centered on News Feed and how we rank posts within it. When developing new “reduce” initiatives, we think about our goals in terms of the needs of three parties: people, publishers, and our community.

Responding to People’s Direct Feedback
We’re always listening to people’s feedback about what they do and don’t like seeing on Facebook and making changes to News Feed in response.

Incentivizing Publishers to Invest in High-Quality Content
We recognize that getting broad distribution in News Feed influences what publishers produce, and ultimately, what our community sees. We want people to have interesting, new material to engage with in the long term, so we're working to create incentives that encourage publishers to produce this kind of content.

Fostering a Safe Community
There is some content that individual people might want to see, and that doesn't necessarily create bad incentives for publishers, but that we have decided is problematic for our community. We'll make this content difficult to encounter for people who aren't actively trying to see it.

As we deliver on these and other plans, we’ll continue to rely on our core values to keep the central experience of News Feed intact as it evolves.

To learn more about our latest work across our strategic remove, reduce, inform areas, see this recap of our April 2019 event.