Updating the Values That Inform Our Community Standards

By Monika Bickert, Vice President, Global Policy Management

Today, we’re expanding the values that serve as the basis for our Community Standards — the guidepost for what is and isn’t allowed on Facebook. For more than a decade, we’ve focused on giving people voice, making Facebook a safe place and applying our policies consistently and fairly around the world. Those values remain important to us. However, as we’ve grown and introduced new products, features and services, our Community Standards have become more expansive and nuanced. The values we’re publishing today reflect the policies we’ve developed over time and what we stand for as a company. 

Our commitment to giving people voice remains paramount. We also focus on authenticity, safety, privacy and dignity in writing and enforcing our Community Standards. We’ve updated the preamble to our Community Standards to reflect these values and included it below to help people understand the environment we want to foster on Facebook.  

Voice

The goal of our Community Standards is to create a place for expression and give people voice. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree with them or find them objectionable. In some cases, we allow content that would otherwise go against our Community Standards if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments.

A commitment to expression is paramount, but we recognize the internet creates new and increased opportunities for abuse. For these reasons, when we limit expression, we do it in service of one or more of the following values: authenticity, safety, privacy and dignity.

Our Community Standards apply to everyone around the world, and to all types of content. They’re designed to be comprehensive – for example, content that might not be considered hateful may still be removed for violating a different policy. 

We recognize that words mean different things or affect people differently depending on their local community, language or background. We work hard to account for these nuances while also applying our policies consistently and fairly to people and their expression.