Meta

February 2020 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In the past year alone, we’ve taken down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.

Purpose of This Report
Over the past three years, we've shared our findings about coordinated inauthentic behavior we detect and remove from our platforms. Starting this month, we will publish information about all networks we take down over the course of a month as part of regular CIB reports, making it easier for people to see the progress we're making in one place.

What is CIB?
While we investigate and enforce against any type of inauthentic behavior — including fake engagement, spam and artificial amplification — we approach enforcement against these mostly financially motivated activities differently than how we counter foreign interference or domestic influence operations. We routinely take down high-volume inauthentic behaviors like spam, which are much less sophisticated, and we do not announce these enforcement actions when we take them.

We view influence operations as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-state campaigns (CIB) and 2) coordinated inauthentic behavior on behalf of a foreign or government actor (FGI).

Coordinated Inauthentic Behavior (CIB)

When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity. We will share the removal of this activity as part of our monthly CIB report. However, if the accounts and Pages we remove are directly related to a civic event, pose imminent harm or involve a new technique or a significant new threat actor, we will share our findings at the time of enforcement and we'll also include them in our monthly report.

We continue to monitor for efforts to re-establish a presence on Facebook by networks we previously removed for CIB. As part of this continuous enforcement, using both automated and manual detection, we constantly remove accounts and Pages connected to networks we took down in the past. We will share these numbers as part of our monthly CIB reports moving forward.

Foreign or Government Interference (FGI)

If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it. We will announce the removal of this activity at the time of enforcement and will also include it in our monthly report.

Summary of February 2020 Findings

This month, we removed 5 networks of accounts, Pages and Groups engaged in foreign or government interference — which is coordinated inauthentic behavior on behalf of a foreign actor or government entity. We reported the removal of three of these networks earlier this month, and we’re reporting the remaining two today. We have shared information about our findings with law enforcement, policymakers and industry partners.

Networks removed in February 2020:

  1. India: We removed a network of 37 Facebook accounts, 32 Pages, 11 Groups and 42 Instagram accounts. This activity originated in India and focused on the Gulf region, the United States, the United Kingdom and Canada. Although the people behind this network attempted to conceal their identities and coordination, our investigation found links to aRep Global, a digital marketing firm in India.
  2. Egypt: We removed a network of 333 Facebook accounts, 195 Pages, 9 Groups and 1,194 Instagram accounts. This activity originated in Egypt and focused on countries across the Middle East and North Africa. Although the people behind this activity attempted to conceal their identities and coordination, our investigation found links to two marketing firms in Egypt, New Waves and Flexell, which were behind the activity we removed in August and October 2019. Both of these companies and individuals associated with them have repeatedly violated our Inauthentic Behavior policy and are now banned from Facebook.
  3. Russia: We removed a network of 78 Facebook accounts, 11 Pages, 29 Groups and 4 Instagram accounts. This activity originated in Russia and focused primarily on Ukraine and neighboring countries. Although the people behind this network attempted to conceal their identities and coordination, our investigation found links to Russian military intelligence services. (Originally announced on February 12, 2020)
  4. Iran: We also removed a small network of 6 Facebook accounts and 5 Instagram accounts that originated in Iran and focused primarily on the US. (Originally announced on February 12, 2020)
  5. Myanmar, Vietnam: Finally, we removed 13 Facebook accounts and 10 Pages operated from Myanmar and Vietnam, focused on Myanmar. Although the people behind this activity attempted to conceal their identities and coordination, our investigation found links to two telecom providers, Mytel in Myanmar and Viettel in Vietnam, as well as Gapit Communications, a PR firm in Vietnam. (Originally announced on February 12, 2020)

See the detailed report for more information.
