July 2021 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps.

Purpose of This Report

Over the past four years, we’ve shared our findings about coordinated inauthentic behavior (CIB) we detect and remove from our platforms. As part of our regular CIB reports, we’re sharing information about all the networks we take down over the course of a month to make it easier for people to see the progress we’re making in one place.

Summary of July 2021 Findings

Our teams continue to focus on finding and removing deceptive campaigns around the world — whether they are foreign or domestic. In July, we removed two networks: one from Myanmar and one from Russia. In this report, we’re also sharing an in-depth analysis by our threat intelligence team of one of these operations — a network from Russia linked to Fazze, part of a UK-registered marketing firm — to add to the public reporting on this network’s activity across more than a dozen platforms. We have shared information about our findings with industry partners, researchers, law enforcement and policymakers.

We know that influence operations will keep evolving in response to our enforcement, and new deceptive behaviors will emerge. We will continue to refine our enforcement and share our findings publicly. We are making progress rooting out this abuse, but as we’ve said before — it’s an ongoing effort and we’re committed to continually improving to stay ahead. That means building better technology, hiring more people and working closely with law enforcement, security experts and other companies.

Here are the new CIB networks we removed in July 2021:

  1. Myanmar: We removed 79 Facebook accounts, 13 Pages, 8 Groups, and 19 Instagram accounts in Myanmar that targeted domestic audiences and were linked to individuals associated with the Myanmar military. We found this activity after reviewing information about a portion of it shared by a member of civil society in Myanmar. Our investigation revealed some links between this operation and the activity we removed in 2018.
  2. Russia: We removed 65 Facebook accounts and 243 Instagram accounts from Russia that we linked to Fazze, a subsidiary of a UK-registered marketing firm, whose operations were primarily conducted from Russia. Fazze is now banned from our platform. This cross-platform operation targeted audiences primarily in India, Latin America, and, to a much lesser extent, the United States. We found this network after reviewing public reporting about an off-platform portion of this activity.

Learn More About Coordinated Inauthentic Behavior

We view CIB as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-government campaigns and 2) coordinated inauthentic behavior on behalf of a foreign or government actor.

Coordinated Inauthentic Behavior (CIB)
When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)
If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Continuous Enforcement
We monitor for efforts to re-establish a presence on Facebook by networks we previously removed. Using both automated and manual detection, we continuously remove accounts and Pages connected to networks we took down in the past.

See the full report for more information.