- We’re sharing a round-up of the biggest changes and tests we’ve launched this year to give people who use Facebook more control over their News Feed and reduce negative experiences.
- We continually evaluate the effectiveness of News Feed ranking signals and give people insight into how content appears in their feeds.
In 2021, we made significant progress in providing greater transparency into how the News Feed ranking process works, what gets distributed and why. We also released new features that incorporate direct feedback from people who use Facebook, giving them more control over their feeds and reducing negative experiences.
Here are some of the biggest changes and tests we launched this year:
- February: new tests to reduce political content in News Feed in response to direct feedback
- March: the launch of Choose Who Can Comment, Favorites and the Feed Filter Bar, and an expansion of Why Am I Seeing This to suggested posts, to give people more context and control over the content they see and share in News Feed
- April: new tests to incorporate more direct feedback from people who use Facebook about the content they want to see more or less of in their News Feed
- May: new options for where and how people can choose to see reaction counts on Facebook and Instagram
- August: our first-ever Widely Viewed Content Report to share what content is seen by the most people in News Feed in the US
- September: our Content Distribution Guidelines that list content and behaviors that receive reduced distribution in News Feed because they are problematic or otherwise low quality
- November: our second Widely Viewed Content Report and new ways to make News Feed controls easier to find and use for people and advertisers
We continually evaluate the effectiveness of News Feed ranking signals and update or remove them when it makes sense. This month, we removed the transparent authorship signal because it did not have a significant effect on the news ecosystem. We still prioritize original reporting in News Feed and will continue to boost quality news by improving more impactful News Feed signals.
News Feed uses personalized ranking, which takes into account thousands of unique signals to understand what's most meaningful to you. Our aim isn't to keep you scrolling on Facebook for hours on end, but to give you an enjoyable experience that you want to return to. It's not in our interest to show you hateful or inflammatory content: our advertisers don't want their ads shown next to it, and our users tell us they don't want it. We're incentivized to reduce it. The prevalence of hate speech is now about 0.03% of content viewed, or roughly 3 views out of every 10,000, and it continues to drop. And because we use algorithmic ranking, nearly 90% of the content people in the US see is from friends, pages and groups they follow or are connected to.
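To make the idea of signal-based personalized ranking concrete, here is a minimal sketch in Python. The signal names, the hand-set weights and the weighted-sum scoring function are all invented for illustration; the actual system relies on thousands of signals and machine-learned predictions rather than anything this simple.

```python
# Hypothetical illustration only: a toy "personalized ranking" score that
# combines a handful of per-post signals with per-user weights. The signal
# names and weights are invented; real feed ranking uses thousands of
# signals and learned models, not a fixed weighted sum.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    # e.g. {"from_close_friend": 1.0, "predicted_comment": 0.3}
    signals: dict[str, float]


def rank_feed(posts: list[Post], user_weights: dict[str, float]) -> list[Post]:
    """Order posts by a weighted sum of their signals, a stand-in for
    'what's most meaningful to this person'."""
    def score(post: Post) -> float:
        return sum(user_weights.get(name, 0.0) * value
                   for name, value in post.signals.items())
    return sorted(posts, key=score, reverse=True)


# Example: a user whose weights favor posts from close friends.
feed = rank_feed(
    [Post("a", {"from_close_friend": 1.0, "predicted_comment": 0.2}),
     Post("b", {"from_followed_page": 1.0, "predicted_comment": 0.8})],
    user_weights={"from_close_friend": 2.0, "from_followed_page": 0.5,
                  "predicted_comment": 1.0},
)
print([p.post_id for p in feed])  # ['a', 'b'] for these toy numbers
```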
We understand concerns people have about the lack of transparency over how algorithmic ranking systems work, so we’ve introduced new measures to give people more control over, and insight into, how content appears in their News Feed. Heading into the new year, we’ll continue to share updates on our tests and features that aim to give you more control and connect you to meaningful content.