Meta

March 2020 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In the past year alone, we’ve taken down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.

Purpose of This Report

Over the past three years, we’ve shared our findings about coordinated inauthentic behavior we detect and remove from our platforms. As part of our regular CIB reports, we’re sharing information about all networks we take down over the course of a month to make it easier for people to see the progress we’re making in one place.

What Is CIB?

While we investigate and enforce against any type of inauthentic behavior — including fake engagement, spam and artificial amplification — we approach enforcement against these mostly financially motivated activities differently from how we counter foreign interference or domestic influence operations. We routinely take down less sophisticated, high-volume inauthentic behaviors like spam, and we do not announce these enforcement actions when we take them.

We view influence operations as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-state campaigns (CIB) and 2) coordinated inauthentic behavior on behalf of a foreign or government actor (FGI).

Coordinated Inauthentic Behavior (CIB)
When we find domestic, non-government campaigns in which groups of accounts and Pages seek to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)
If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Summary of March 2020 Findings

This month, we removed three networks of accounts, Pages and Groups engaged in coordinated inauthentic behavior. We reported the removal of one of these networks earlier this month, and we’re sharing the remaining two today. We have shared information about our findings with law enforcement, policymakers and industry partners.

Networks Removed in March 2020:

  1. NEW — France: We removed 51 Facebook accounts, 9 Pages and 9 Instagram accounts. This domestic-focused activity originated in the Sète region of France.
  2. NEW — Egypt: We also removed 81 Facebook accounts, 82 Pages, 1 Group and 76 Instagram accounts from Egypt. This activity targeted Arabic-speaking audiences, and some Pages focused specifically on Egypt and the Gulf region. Although the people behind this network attempted to conceal their identities and coordination, our investigation found links to Maat, a marketing firm in Egypt.
  3. Russia, Ghana, Nigeria: We removed a network of 49 Facebook accounts, 69 Pages and 85 Instagram accounts. This network was in the early stages of building an audience and was operated by local nationals — some wittingly and some unwittingly — in Ghana and Nigeria on behalf of individuals in Russia. It primarily targeted the United States. Our investigation linked this activity to EBLA, an NGO in Ghana, and to individuals associated with past activity by the Russian Internet Research Agency (IRA). (Originally announced on March 12, 2020)

See the detailed report for more information.