
Our Work To Fight Online Predators

Takeaways

  • Child exploitation is a horrific crime, and predators are determined criminals who test app, website and platform defenses.
  • In addition to developing technology to tackle this abuse, we hire specialists dedicated to online child safety and we share information with our industry peers and law enforcement.
  • Predators don’t limit their attempts to harm children to online spaces, so it’s vital that we work together to stop predators and prevent child exploitation.

Update on January 30, 2024 at 8:10AM PT:

We’re providing an update on the efficacy of our work enforcing our child safety policies.

Originally published on December 1, 2023:

Preventing child exploitation is one of the most important challenges facing our industry today. Online predators are determined criminals who use multiple apps and websites to target young people. They also test each platform’s defenses, and they learn to quickly adapt. That’s why now, as much as ever, we’re working hard to stay ahead. In addition to developing technology that roots out predators, we hire specialists dedicated to online child safety and we share information with our industry peers and law enforcement.

We take recent allegations about the effectiveness of our work very seriously, and we created a task force to review existing policies; examine technology and enforcement systems we have in place; and make changes that strengthen our protections for young people, ban predators, and remove the networks they use to connect with one another. The task force took immediate steps to strengthen our protections, and our child safety teams continue to work on additional measures. Today, we’re sharing an overview of the task force’s efforts to date.

An Overview of Meta’s Child Safety Task Force

Meta’s Child Safety Task Force focused on three areas: Recommendations and Discovery, Restricting Potential Predators and Removing Their Networks, and Strengthening Our Enforcement.

Recommendations and Discovery

We make recommendations in places like Reels and Instagram Explore to help people discover new things on our apps, and people use features like Search and Hashtags to find things they might be interested in. Because we’re making suggestions to people in these places, we have protections in place to help ensure we don’t suggest something that may be upsetting or that may break our rules. We have sophisticated systems that proactively find, remove, or refrain from suggesting content, groups and pages, among other things, that break our rules or that may be inappropriate to recommend. Our Child Safety Task Force improved these systems by combining them and expanding their capabilities. This work is ongoing, and we expect it to take full effect in the coming weeks.
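To illustrate the kind of filtering described above, here is a minimal sketch of a recommendation candidate-filtering step. The scores, thresholds, and function names are hypothetical examples chosen for clarity; they are not Meta’s actual classifiers or systems.

```python
# Illustrative sketch only: a simplified candidate-filtering step of the kind
# described above. Field names, thresholds, and the classifier interface are
# hypothetical, not Meta's actual systems.
from dataclasses import dataclass
from typing import Callable, Iterable, List


@dataclass
class Candidate:
    """A piece of content, group, or page eligible for recommendation."""
    item_id: str
    violates_policy_score: float    # output of a policy classifier (0-1)
    non_recommendable_score: float  # "allowed but not suggestible" classifier (0-1)


def filter_recommendations(
    candidates: Iterable[Candidate],
    remove_threshold: float = 0.9,
    suppress_threshold: float = 0.5,
    enqueue_for_removal: Callable[[Candidate], None] = lambda c: None,
) -> List[Candidate]:
    """Drop candidates that appear to break the rules or are inappropriate to suggest.

    Items above `remove_threshold` are also queued for enforcement review;
    items above `suppress_threshold` are simply never recommended.
    """
    recommendable = []
    for c in candidates:
        if c.violates_policy_score >= remove_threshold:
            enqueue_for_removal(c)   # proactive removal path
            continue
        if c.non_recommendable_score >= suppress_threshold:
            continue                 # refrain from suggesting
        recommendable.append(c)
    return recommendable
```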

Restricting Potential Predators and Removing Their Networks

We’ve developed technology that identifies potentially suspicious adults by reviewing more than 60 different signals, such as whether a teen blocks or reports an adult, or whether someone repeatedly searches for terms that may suggest suspicious behavior. We already use this technology to limit potentially suspicious adults from finding, following or interacting with teens, and we’ve expanded it to prevent these adults from finding, following or interacting with one another.
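To make the signal-based approach concrete, here is a minimal, hypothetical sketch of how weighted behavioral signals could be combined into a risk score that gates interactions. The signal names, weights, and threshold are illustrative assumptions, not the actual signals or logic Meta uses.

```python
# Illustrative sketch only: aggregating behavioral signals into a risk score and
# gating interactions. Signal names, weights, and the threshold are hypothetical
# examples, not the actual signals or logic Meta uses.
from typing import Dict

# A few example signals of the kind described above (the real system reviews 60+).
SIGNAL_WEIGHTS: Dict[str, float] = {
    "blocked_by_teen": 3.0,
    "reported_by_teen": 4.0,
    "suspicious_search_terms": 2.5,
    "follows_many_teen_accounts": 1.5,
}

RESTRICT_THRESHOLD = 5.0  # hypothetical cutoff


def risk_score(signals: Dict[str, int]) -> float:
    """Combine observed signal counts into a single weighted score."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count for name, count in signals.items())


def may_interact(actor_signals: Dict[str, int],
                 target_is_teen: bool,
                 target_is_flagged: bool) -> bool:
    """Prevent a potentially suspicious adult from finding, following or
    interacting with teens, and from connecting with other flagged accounts."""
    if risk_score(actor_signals) < RESTRICT_THRESHOLD:
        return True
    return not (target_is_teen or target_is_flagged)
```

Under these illustrative weights, for example, an account blocked twice by teens would score 6.0 and would be prevented both from following teen accounts and from connecting with other flagged accounts.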

Strengthening Our Enforcement

The task force also made a series of updates to strengthen our reporting and enforcement systems, and found new ways to root out and ban potentially predatory accounts. In August 2023 alone, we disabled more than 500,000 accounts for violating our child sexual exploitation policies.