
March 2020 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In the past year alone, we’ve taken down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.

Purpose of This Report

Over the past three years, we’ve shared our findings about coordinated inauthentic behavior we detect and remove from our platforms. As part of our regular CIB reports, we’re sharing information about all of the networks we take down over the course of a month to make it easier for people to see the progress we’re making in one place.

What Is CIB?

While we investigate and enforce against any type of inauthentic behavior — including fake engagement, spam and artificial amplification — we approach enforcement against these mostly financially motivated activities differently from how we counter foreign interference or domestic influence operations. We routinely take down less sophisticated, high-volume inauthentic behaviors like spam, and we do not announce these enforcement actions when we take them.

We view influence operations as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-state campaigns (CIB) and 2) coordinated inauthentic behavior on behalf of a foreign or government actor (FGI).

Coordinated Inauthentic Behavior (CIB)
When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)
If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures, including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Summary of March 2020 Findings

This month, we removed three networks of accounts, Pages and Groups engaged in coordinated inauthentic behavior. We reported the removal of one of these networks earlier this month, and we’re sharing the remaining two today. We have shared information about our findings with law enforcement, policymakers and industry partners.

  • Total number of Facebook accounts removed: 181
  • Total number of Instagram accounts removed: 170
  • Total number of Pages removed: 160
  • Total number of Groups removed: 1
  • Continuous enforcement against networks we removed in the past: We monitor for efforts to re-establish a presence on Facebook by networks we previously removed for coordinated inauthentic behavior. As part of this continuous enforcement, using both automated and manual detection, we removed 214 Facebook accounts, 435 Instagram accounts, 83 Pages and 1 Group connected to networks we took down in the past. (Updated numbers on April 8, 2020 at 1:25PM PT: as noted in our original post, we have updated these figures to reflect the complete data for this reporting period.)

Note: as we continue to refine our enforcement reporting framework, we estimate that our automated systems may undercount the number of enforcement actions by no more than 10%.

Networks Removed in March 2020:

  1. NEW — France: We removed 51 Facebook accounts, 9 Pages and 9 Instagram accounts. This domestic-focused activity originated in the Sète region of France.
  2. NEW — Egypt: We also removed 81 Facebook accounts, 82 Pages, 1 Group and 76 Instagram accounts from Egypt. This activity focused on Arabic-speaking audiences, with some Pages focusing specifically on Egypt and the Gulf region. Although the people behind this network attempted to conceal their identities and coordination, our investigation found links to Maat, a marketing firm in Egypt.
  3. Russia, Ghana, Nigeria: We removed a network of 49 Facebook accounts, 69 Pages and 85 Instagram accounts. This network was in the early stages of building an audience and was operated by local nationals — some wittingly and some unwittingly — in Ghana and Nigeria on behalf of individuals in Russia. It primarily targeted the United States. Our investigation linked this activity to EBLA, an NGO in Ghana, and individuals associated with past activity by the Russian Internet Research Agency (IRA). (Originally announced on March 12, 2020)

See the detailed report for more information.


