
May 2020 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In 2019 alone, we took down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections. You can find more information about our previous enforcement actions here.

Purpose of This Report

Over the past three years, we’ve shared our findings about coordinated inauthentic behavior we detect and remove from our platforms. As part of regular CIB reports, we’re sharing information about all networks we take down over the course of a month to make it easier for people to see progress we’re making in one place.

What Is CIB?

While we investigate and enforce against any type of inauthentic behavior — including fake engagement, spam and artificial amplification — we approach enforcement against these mostly financially motivated activities differently from how we counter foreign interference or domestic influence operations. We routinely take down less sophisticated, high-volume inauthentic behavior like spam, and we do not announce these enforcement actions when we take them.

We view influence operations as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-state campaigns (CIB) and 2) coordinated inauthentic behavior on behalf of a foreign or government actor (FGI).

Coordinated Inauthentic Behavior (CIB)
When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)
If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Continuous Enforcement
We monitor for efforts to re-establish a presence on Facebook by networks we previously removed for CIB. Using both automated and manual detection, we continuously remove accounts and Pages connected to networks we took down in the past.

Summary of May 2020 Findings

This month, we removed two networks of accounts, Pages and Groups. One of them — from Tunisia — focused on countries across Francophone Africa, and the other one targeted domestic audiences in the Kurdistan region of Iraq. We have shared information about our findings with law enforcement, policymakers and industry partners.

  • Total number of Facebook accounts removed: 254
  • Total number of Instagram accounts removed: 240
  • Total number of Pages removed: 770
  • Total number of Groups removed: 101

(Updated on June 15, 2020 at 4:15PM PT to reflect the latest enforcement number for Facebook accounts removed.)

Networks Removed in May 2020:

  1. Tunisia: We removed 446 Pages, 182 Facebook accounts, 96 Groups, 60 events and 209 Instagram accounts. This activity originated in Tunisia and focused on Francophone countries in Sub-Saharan Africa. This network used fake accounts to masquerade as locals in the countries they targeted, post and like their own content, drive people to off-platform sites, and manage Groups and Pages posing as independent news entities. Some Pages engaged in deceptive audience-building tactics, shifting their focus from non-political to political themes and making substantial name and admin changes over time. We found this network as part of our internal investigation, which linked this activity to Ureputation, a Tunisia-based PR firm.
  2. Iraq: We also removed 324 Pages, 72 Facebook accounts, 5 Groups and 31 Instagram accounts. (Updated on June 15, 2020 at 4:15PM PT to reflect the latest enforcement number for Facebook accounts removed.) This activity originated in the Kurdistan region of Iraq and focused on domestic audiences. This network used fake accounts — some of which had been previously detected and disabled by our automated systems — to post in Groups, impersonate local politicians and political parties, and manage Pages masquerading as news entities. We found this activity as part of our internal investigation into Pages we had previously removed for impersonation. Our investigation connected this activity to individuals associated with the Zanyari Agency, part of the intelligence services of the Kurdistan Regional Government in Iraqi Kurdistan.

We are making progress rooting out this abuse, but as we’ve said before, it’s an ongoing effort. We’re committed to continually improving to stay ahead. That means building better technology, hiring more people and working more closely with law enforcement, security experts and other companies.

See the detailed report for more information.


