Our Latest Steps to Keep Facebook Groups Safe

By Tom Alison, VP of Engineering

People turn to Facebook Groups to connect with others who share their interests, but even if they decide to make a group private, they have to play by the same rules as everyone else. Our Community Standards apply to public and private groups, and our proactive detection tools work across both. That means even if someone doesn’t report an issue to us, our AI can detect potentially violating content and we can remove it. Today we’re sharing an update on our ongoing work to keep groups safe, including a number of changes to reduce harmful content and misinformation.

Over the last year, we removed about 1.5 million pieces of content in groups for violating our policies on organized hate, 91% of which we found proactively. We also removed about 12 million pieces of content in groups for violating our policies on hate speech, 87% of which we found proactively.

That’s what we do for posts within groups. When it comes to groups themselves, we will take an entire group down if it repeatedly breaks our rules or if it was set up with the intent to violate our standards. Over the last year, we took down more than 1 million groups for violating these policies. 

Stopping People Who Break Our Rules

We’re taking further steps to stop people who repeatedly violate our Community Standards from being able to create new groups. Our existing recidivism policy stops the admins of a group from creating another group similar to one we removed. Going forward, admins and moderators of groups taken down for policy violations will not be able to create any new groups for a period of time.

For members who have any Community Standards violations in a group, their posts in that group will now require approval for the next 30 days. This keeps their posts from being seen by others until an admin or moderator approves them. If admins or moderators repeatedly approve posts that violate our Community Standards, we will remove the group.
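As a rough illustration of how that approval rule could be expressed, here is a minimal sketch in Python. The data model, field names, and 30-day window handling are assumptions made for this example only; they do not reflect our actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Optional

APPROVAL_WINDOW = timedelta(days=30)  # posts require approval for 30 days after a violation

@dataclass
class Member:
    name: str
    # most recent Community Standards violation in this group, if any (assumed field)
    last_violation_at: Optional[datetime] = None

@dataclass
class Group:
    pending: List[str] = field(default_factory=list)    # posts held for admin/moderator review
    published: List[str] = field(default_factory=list)  # posts visible to members

def handle_new_post(group: Group, author: Member, post: str,
                    now: Optional[datetime] = None) -> None:
    """Hold the post for review if its author recently violated the rules in this group."""
    now = now or datetime.utcnow()
    if author.last_violation_at and now - author.last_violation_at <= APPROVAL_WINDOW:
        group.pending.append(post)      # hidden from other members until approved
    else:
        group.published.append(post)    # visible immediately
```

In this sketch, an admin or moderator approval step would simply move a post from the pending list to the published list.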

Helping Ensure Groups Have an Active Admin

Admins are at the heart of fostering the purpose and culture of their groups. Sometimes admins may step down or leave their groups. Our proactive detection continues to operate in these groups, but we know that active admins can help maintain the community and promote more productive conversations. So we now suggest admin roles to members who may be interested. A number of factors go into these suggestions, including whether people have a history of Community Standards violations. 

In the coming weeks, we’ll begin archiving groups that have been without an admin for some time. Moving forward, when a single remaining admin chooses to step down, they can invite members to become admins. If no invited members accept, we will suggest admin roles to members who may be interested. If no one accepts, we’ll archive the group.
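As a rough sketch of that succession order, consider the snippet below; the function, its inputs, and the example name are illustrative assumptions only, not how our systems are built.

```python
def resolve_admin_succession(invite_acceptances, suggestion_acceptances):
    """Illustrative succession flow when a group's last remaining admin steps down.

    invite_acceptances: members who accepted the departing admin's invitation.
    suggestion_acceptances: members who accepted a suggested admin role.
    """
    if invite_acceptances:            # step 1: an invited member takes over
        return f"new admin: {invite_acceptances[0]}"
    if suggestion_acceptances:        # step 2: a suggested member takes over
        return f"new admin: {suggestion_acceptances[0]}"
    return "archive group"            # step 3: no one accepted, so the group is archived

# Example: no invitee accepted, but one suggested member did
print(resolve_admin_succession([], ["dana"]))  # -> "new admin: dana"
```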

[Screenshot: “Become an admin” suggestion]

Removing Health Groups from Recommendations

Facebook Groups, including health groups, can be a positive space for giving and receiving support during difficult life circumstances. At the same time, it’s crucial that people get their health information from authoritative sources. To prioritize connecting people with accurate health information, we are starting to remove health groups from recommendations. People can still search for health groups or invite friends to join them.

For more information on the kinds of content we recommend, including groups, see our recommendations guidelines, which we recently made public.

Continuing to Combat Organizations and Movements Tied to Violence

This summer we continued to take action against groups tied to violence. We banned a violent US-based anti-government network connected to the boogaloo movement and removed 106 of their groups. We also expanded our policy to address organizations and movements that have demonstrated significant risks to public safety, including QAnon, US-based militia organizations and anarchist groups that support violent acts amid protests. 

We now limit the spread of these groups by removing them from recommendations, restricting them from search, and soon reducing their content in News Feed. We also remove these groups when they discuss potential violence, even if they use veiled language and symbols. For example, we removed 790 groups linked to QAnon under this policy.

Combating Misinformation in Groups

To combat misinformation across Facebook, we take a “remove, reduce, inform” approach that relies on a global network of independent fact-checkers. For Facebook Groups, this work includes the following (a simplified sketch of the flow appears after this list):

  • Removing groups that share content that violates our Community Standards. If admins or moderators repeatedly post or approve content that breaks our rules, we take down the whole group. 
  • Reducing the distribution of groups that share misinformation. Groups that repeatedly share content rated false by fact-checkers won’t be recommended to other people on Facebook. We rank all content from these groups lower in News Feed and limit notifications so fewer members see their posts.
  • Informing people when they encounter misinformation. We apply a label to content that’s been reviewed by fact-checkers, so people can see additional context. We also notify people before they try to share this content, and we let people know if something they shared is later rated false. Group admins are also notified each time a piece of content rated false by fact-checkers is posted in their group, and they can see an overview of this in the Group Quality tool.
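Here is the simplified sketch referenced above, written in Python. The thresholds, action names, and inputs are assumptions made for illustration; this post does not specify how many violations or false ratings trigger each step, and this is not our actual enforcement code.

```python
STRIKE_LIMIT = 3        # assumed threshold; the post does not give exact numbers
FALSE_RATING_LIMIT = 3  # assumed threshold for "repeatedly" sharing content rated false

def group_actions(violates_standards, prior_strikes, rated_false, prior_false_ratings):
    """Illustrative decision sketch of the remove / reduce / inform approach for one post."""
    actions = []
    if violates_standards:
        actions.append("remove post")                       # remove
        if prior_strikes + 1 >= STRIKE_LIMIT:
            actions.append("take down group")               # repeated violations
        return actions
    if rated_false:
        actions += ["label post with fact-check context",   # inform
                    "notify people who try to share it",
                    "notify group admins (Group Quality tool)"]
        if prior_false_ratings + 1 >= FALSE_RATING_LIMIT:   # reduce
            actions += ["rank group content lower in News Feed",
                        "exclude group from recommendations",
                        "limit notifications for the group"]
    return actions

# Example: a post rated false in a group that has already had several false ratings
print(group_actions(False, 0, True, 3))
```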

We know there is more to do to keep groups safe on Facebook, and we’ll keep improving our technology and policies to ensure groups remain places where people can connect and find support.


