What content is allowed on Facebook’s services?

Facebook builds services for you to express your voice.

Building community and bringing the world closer together depends on people’s ability to share their own voice, to be heard, and to exchange diverse views, experiences, ideas and information.

For people to express themselves, they need to be safe.

We don’t allow content that threatens the safety or dignity of people. We define this content in detail in our Community Standards. We find and remove it through a combination of technology and people.

That means we don’t remove things that are controversial or untrue.

We want people to talk openly about the issues that matter to them, which leads to the exchange of ideas and debate. The answer to misinformation can’t be less information – but more context.

But we do provide context so people can make their own decisions.

People need to decide what to believe for themselves, with as much of the story as possible. That’s why we partner with third-party fact-checkers to provide context when they’ve rated an article as false. We also take steps to stop misinformation from going viral.

Is this post allowed?

Determine which of these posts are allowed on Facebook’s services.

Yes, this is allowed.

This is a call for violence, but against a fictional character in a TV series. Context matters: these rules apply to real people, not to fictional characters.

No, this is not allowed.

Discussing medications and your health is okay, but the illegal sale of drugs is not allowed on our services.

Yes, this is allowed.

When people share an intent to die by suicide, we leave the post up so they can get help and support from their social network. We also provide support resources to the poster and their friends.

No, this is not allowed.

Dehumanizing statements about people based on a protected characteristic, such as religion, are not allowed.

No, this is not allowed.

Threats that could lead to death are never allowed. All people should feel safe on our services.

Yes, this is allowed.

We don't allow child nudity on the platform for safety reasons. However, this photo received a newsworthy exception for the following reasons: its global importance, its notoriety as an award-winning image, and the fact that the subject is now an adult and consented to its publication. Newsworthy exceptions are rare and cannot be applied at scale.

Keeping us accountable

User Review & Appeal

People can report content that we missed, or appeal a decision for reconsideration if they believe something has been taken down in error.

Oversight Board

A new model for oversight focused on independent decision-making. This global board of experts will review the most difficult and significant content decisions on Facebook and Instagram, make binding decisions on that content, and be able to issue policy recommendations.

Community Standards Enforcement Report

A twice-yearly report that measures the progress we've made in preventing and taking action on violating content.
