
Investments to Fight Polarization

By Guy Rosen, VP Integrity

Yesterday the Wall Street Journal published a story claiming that Facebook “shut down efforts to make the site less divisive” and “largely shelved” internal research on whether social media increases polarization. Unfortunately, this particular story willfully ignored critical facts that undermined its narrative.

The piece used a couple of isolated initiatives we decided against as evidence that we don’t care about the underlying issues, and it ignored the significant efforts we did make. It disregarded how our research, and research we continue to commission, informed dozens of other changes and new products. It also ignored other measures we’ve taken to fight polarization. As a result, readers were left with the impression that we are ignoring an issue we have in fact invested heavily in.

Here are just some of the steps we’ve taken over the past three years to address factors that can contribute to polarization:

Recalibrating News Feed

In 2018 we made a fundamental change to the way content is surfaced in people’s News Feed to prioritize posts from friends and family over news content. This was based on extensive research that found people get more meaningful conversations and experiences when they engage with people they know than when they passively consume content.

This was one of many changes we’ve made to Facebook’s News Feed to minimize the amount of divisive news content people see. We’ve reduced clickbait headlines. We’ve reduced links to misleading and spammy posts. And we’ve improved how comments are ranked so people see those that are more relevant and of higher quality.
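
This post doesn’t detail how such demotions are implemented, but as a rough illustration, here is a minimal sketch of multiplicative down-ranking. The signal names and demotion factors below are invented for the example, not our actual ranking values:

```python
# Hypothetical demotion multipliers, one per integrity signal.
# These values are illustrative assumptions only.
DEMOTIONS = {
    "clickbait": 0.5,    # assumed: halve the score of likely clickbait
    "spammy_link": 0.3,
    "misleading": 0.4,
}

def rank_score(base_score, signals):
    """Apply a multiplicative demotion for each integrity signal that fires."""
    score = base_score
    for signal, fired in signals.items():
        if fired:
            score *= DEMOTIONS.get(signal, 1.0)
    return score

# Example: a post flagged as clickbait drops from 10.0 to 5.0.
print(rank_score(10.0, {"clickbait": True, "spammy_link": False}))
```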

Building a Robust Integrity Team

Over the past four years, we’ve built a global team of more than 35,000 people working across the company on the safety and security of our services, including issues related to polarization. We’ve removed billions of fake accounts. We made it easier to see who is behind political ads. And we’ve updated our privacy settings and built new tools to give people more control over their information.

Restricting Recommendations

We’ve added more restrictions to the types of Pages and Groups that we recommend to people. Through our tools, we can detect many types of violating content posted in Groups before people report it to us. If Pages and Groups repeatedly share content that violates our Community Standards, or is rated false by fact-checkers, we reduce their distribution, remove them from recommendations, and remove the ability for Pages to monetize and advertise. We also remove entire Pages and Groups that repeatedly post violating content. When reviewing a Group to decide whether to take it down, we treat content violations by its admins and moderators, including member posts they have approved, as a stronger signal that the Group violates our standards.
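
A simplified sketch of how repeat-offender rules like these could be expressed in code; the strike thresholds, the 2x admin weight, and the action names below are illustrative assumptions, not our actual enforcement parameters:

```python
from dataclasses import dataclass

@dataclass
class Violation:
    # Includes member posts an admin or moderator approved.
    by_admin_or_moderator: bool

def enforcement_actions(violations):
    """Map a Page or Group's violation history to enforcement actions."""
    # Admin/moderator violations count as a stronger signal (assumed 2x weight).
    strikes = sum(2 if v.by_admin_or_moderator else 1 for v in violations)
    actions = []
    if strikes >= 2:  # hypothetical "repeatedly shares violating content" bar
        actions += ["reduce_distribution", "remove_from_recommendations",
                    "disable_monetization_and_ads"]
    if strikes >= 6:  # hypothetical removal threshold
        actions.append("remove_page_or_group")
    return actions

# Example: one admin violation plus one member violation triggers demotions.
print(enforcement_actions([Violation(True), Violation(False)]))
```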

Combating Hate Speech 

Another way we reduce polarization is by prohibiting hate speech on our platforms: speech that attacks people based on what we call protected characteristics, including race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity, and serious disease or disability. We’ve expanded our proactive detection technology to find such content faster and in more languages. Today almost 90% of the hate speech we remove is detected by our systems before anyone reports it to us.
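
In simplified form, proactive detection means scoring content with a classifier before anyone reports it, and the 90% figure is the share of removals that began that way. A minimal sketch under assumed data shapes, with a hypothetical classifier and threshold:

```python
def flag_proactively(posts, classifier, threshold=0.9):
    """Return posts a hate-speech model scores above a confidence bar."""
    return [p for p in posts if classifier(p["text"]) >= threshold]

def proactive_rate(removed_posts):
    """Fraction of removed posts detected before anyone reported them."""
    found_first = sum(1 for p in removed_posts if p["detected_before_report"])
    return found_first / len(removed_posts)

# Example: 9 of 10 removed posts were flagged before a user report.
removed = ([{"detected_before_report": True}] * 9
           + [{"detected_before_report": False}])
print(f"{proactive_rate(removed):.0%}")  # -> 90%
```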

Reducing Misinformation and Removing Content That Could Cause Physical Harm

Misinformation is often used to polarize, and fighting it has been a key focus.

In December 2016, we launched an independent fact-checking program to evaluate the accuracy of content posted to Facebook. Content our fact-checking partners rate false is labeled and down-ranked in News Feed. We’ve since expanded the program to Instagram and now have more than 60 fact-checking partners covering more than 50 languages around the world.
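
A simplified sketch of that label-and-down-rank flow; the rating values and the demotion factor here are illustrative assumptions, not our actual parameters:

```python
FALSE_RATINGS = {"false", "partly false"}  # hypothetical rating taxonomy
DOWNRANK_FACTOR = 0.2                      # assumed demotion multiplier

def apply_fact_check(post, rating):
    """Attach a warning label and demote distribution for false ratings."""
    if rating in FALSE_RATINGS:
        post["label"] = f"Rated {rating} by independent fact-checkers"
        post["feed_score"] *= DOWNRANK_FACTOR
    return post

# Example: a post rated false is labeled and its feed score cut to a fifth.
print(apply_fact_check({"feed_score": 8.0}, "false"))
```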

And we’ve updated our policies to remove misinformation that has the potential to contribute to imminent violence, physical harm, and voter suppression.

Finally, the Journal examined a rigorous process we instituted, called ‘eat your veggies,’ designed to vet new products before they shipped. The process was put in place in response to valid criticism, including from the media, that tech companies weren’t doing enough to anticipate unintended uses of their products.

We’ve taken a number of important steps to reduce the amount of content that could drive polarization on our platform, sometimes at the expense of revenue. This job won’t ever be complete because, at the end of the day, online discourse is an extension of society, and ours is highly polarized. But it is our job to reduce polarization’s impact on how people experience our products. We are committed to doing just that.


