How Do We Help Keep Private Groups Safe?

By Tom Alison, VP of Engineering

Private groups can be important places for people to come together around a range of personal topics, like identifying as LGBTQ or discussing the challenges of a rare health condition.

But being in a private group doesn’t mean that your actions should go unchecked. We have a responsibility to keep Facebook safe, which is why our Community Standards apply across Facebook, including in private groups. To enforce these policies, we use a combination of people and technology — content reviewers and proactive detection. Over the last few years, we’ve invested heavily in both, including hiring more than 30,000 people across our safety and security teams.

Within this, a specialized team has been working on the Safe Communities Initiative: an effort that started two years ago with the goal of protecting people using Facebook Groups from harm. Made up of product managers, engineers, machine learning experts and content reviewers, this team works to anticipate the potential ways people can do harm in groups and develops solutions to minimize and prevent it. As the head of Facebook Groups, I want to explain how we’re making private groups safer by focusing on three key areas: proactive detection, tools for admins, and transparency and control for members.

Using Proactive Detection to Moderate Groups

One of the main ways we keep people safe is by proactively identifying and removing posts and groups that break our rules. This is a major area of focus for the Safe Communities Initiative, and it applies to both public and private groups.

Increasingly, we can use AI and machine learning to proactively detect bad content before anyone reports it, and sometimes before people even see it. As content is flagged by our systems or reported by people, trained reviewers consider context and determine whether the content violates our Community Standards. We then use these examples to train our technology to get better at finding and removing similar content. Just as we used proactive detection in public, closed and secret groups before, this process will continue to apply to all public and private groups under our new simplified privacy model.
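To make that feedback loop concrete, here is a minimal sketch in Python of how a score-review-retrain pipeline could be wired together. Everything in it is an assumption for illustration: the thresholds, class names and placeholder classifier are invented, not Facebook’s implementation.

```python
from dataclasses import dataclass, field

# Hypothetical thresholds -- the real systems and their tuning are not public.
AUTO_ACTION_THRESHOLD = 0.95   # near-certain violations can be removed proactively
REVIEW_THRESHOLD = 0.60        # uncertain cases go to trained human reviewers

@dataclass
class Post:
    post_id: str
    text: str

@dataclass
class DetectionPipeline:
    """Sketch of the score -> review -> retrain loop described above."""
    review_queue: list = field(default_factory=list)
    training_examples: list = field(default_factory=list)

    def score(self, post: Post) -> float:
        """Placeholder for a learned classifier trained on past reviewer decisions."""
        return 0.0  # a real system would return a model's violation probability

    def handle(self, post: Post) -> None:
        s = self.score(post)
        if s >= AUTO_ACTION_THRESHOLD:
            # Removed before anyone reports it -- sometimes before anyone sees it.
            self.record_decision(post, violating=True)
        elif s >= REVIEW_THRESHOLD:
            # Flagged for a trained reviewer, who considers context.
            self.review_queue.append(post)

    def record_decision(self, post: Post, violating: bool) -> None:
        # Reviewer decisions become labeled examples used to retrain the
        # classifier, so it gets better at finding similar content.
        self.training_examples.append((post.text, violating))
```

The key design point the sketch captures is that human review does double duty: it resolves the uncertain cases, and at the same time produces the labeled examples that make the proactive detector better over time.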

Deciding whether an entire group should stay up or come down is nuanced. If an individual post breaks our Community Standards, it comes down, but with dozens, hundreds, or sometimes thousands of different members and posts, at what point should a whole group be deemed unacceptable for Facebook?

One big factor we look at is subject matter: Does the name or description of the group include hate speech or other content that we don’t allow? Another important factor is the actions of admins and moderators, since they set the tone for the group. In April, we updated our policies to look more closely at their behavior. If group leaders often break our rules, or if they commonly approve posts from other members who break our rules, those are clear strikes against the group as a whole. And if a group member repeatedly violates our standards, we’ll start requiring admins to review that member’s posts before anyone else can see them. If an admin then approves a post that breaks our rules, it counts against the whole group. These signals, combined with a number of other factors, help us determine whether the group should be taken down. If a group doesn’t cross that line, it stays up, but we’ll continue to remove individual posts that go against our Community Standards.
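Purely as an illustration, the decision described above can be thought of as a weighted strike score. The sketch below mirrors the factors named in this section, but the weights, threshold and scoring function are invented; the real process weighs many more signals.

```python
from dataclasses import dataclass

# Illustrative threshold and weights -- the actual factors and their
# weighting are not published; this only mirrors the signals named above.
TAKEDOWN_THRESHOLD = 10.0

@dataclass
class GroupSignals:
    name_or_description_violates: bool   # subject-matter check
    admin_violations: int                # rule-breaking posts by admins/moderators
    admin_approved_violations: int       # violating member posts an admin approved
    member_repeat_violators: int         # members now subject to post pre-approval

def group_strike_score(s: GroupSignals) -> float:
    """Combine group-level signals into a single score (a sketch, not the real model)."""
    score = 0.0
    if s.name_or_description_violates:
        score += TAKEDOWN_THRESHOLD        # a violating purpose alone can be decisive
    score += 2.0 * s.admin_violations      # leaders set the tone, so their strikes weigh more
    score += 2.0 * s.admin_approved_violations
    score += 0.5 * s.member_repeat_violators
    return score

def should_take_down(s: GroupSignals) -> bool:
    return group_strike_score(s) >= TAKEDOWN_THRESHOLD
```

Note how leader behavior is weighted more heavily than individual member behavior in the sketch, reflecting the point that admins and moderators set the tone for the group.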

Tools for Admins

Admins know their communities best, and we want to empower them to run meaningful groups. That’s why we built Group Quality, which gives admins an overview of the content Facebook has removed or flagged in their group for most Community Standards violations. We also added a section about false news found in the group. These tools give admins more clarity about how and when we enforce our policies and greater visibility into what is happening in their communities. This also means they’re more accountable for what happens under their watch.
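Conceptually, an overview like Group Quality is a per-group aggregation over an enforcement log. The sketch below shows that shape; the field names and categories are assumptions, not Facebook’s schema.

```python
from collections import Counter

def group_quality_summary(enforcement_log: list[dict]) -> dict[str, int]:
    """Roll an enforcement log up into the per-category counts an
    admin-facing overview could display. Field names are invented."""
    counts = Counter(entry["violation_category"] for entry in enforcement_log)
    return dict(counts)

# Example: two hate-speech removals and one false-news flag in a group.
log = [
    {"violation_category": "hate_speech"},
    {"violation_category": "hate_speech"},
    {"violation_category": "false_news"},
]
print(group_quality_summary(log))  # {'hate_speech': 2, 'false_news': 1}
```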

We help admins establish positive group norms with a dedicated rules section, so they can be clear about what is and isn’t allowed. Admins and moderators also have the option to share which rule a member broke when declining a pending post, removing a comment or muting a member.
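Attaching a rule to a moderation action is essentially a data-model choice. One hypothetical way to shape it, with all names and types invented for the example:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ModAction(Enum):
    DECLINE_POST = "decline_post"
    REMOVE_COMMENT = "remove_comment"
    MUTE_MEMBER = "mute_member"

@dataclass
class GroupRule:
    rule_id: int
    title: str  # e.g. "Be kind and courteous"

@dataclass
class ModerationEvent:
    action: ModAction
    target_id: str
    rule_broken: Optional[GroupRule] = None  # optionally shared with the member

    def member_message(self) -> str:
        """The explanation a member could see for the action."""
        if self.rule_broken:
            return f"A moderator took this action for breaking the rule: {self.rule_broken.title}"
        return "A moderator took this action."
```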

Transparency and Control for Members

We’re also committed to giving group members more transparency and control. When someone joins a group, they should know what kind of community they’re becoming a part of. That’s why, before people join a group, we let them see relevant details about it, like who the admins and moderators are and whether the group has gone by any other names in the past. People can also preview a group they’ve been invited to and choose to accept or decline the invitation.

Through the Safe Communities Initiative, we’ll continue to ensure that Facebook Groups can be places of support and connection, not hate or harm. There’s always more to do, and we’ll keep improving our technology, tools and policies to help keep people safe.


