
April 2020 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In 2019 alone, we took down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.

These efforts are led by a cross-disciplinary team focused on finding and disrupting both the most sophisticated influence operations that aim to manipulate public debate and high-volume inauthentic behaviors like spam and fake engagement. Over the past several years, our team has grown to over 200 people with expertise ranging from open-source research and threat investigations to cybersecurity, law enforcement and national security, investigative journalism, engineering, product development, data science and academic study of disinformation.

You can find more information about our previous enforcement actions here.

Purpose of This Report

Over the past three years, we’ve shared our findings about coordinated inauthentic behavior that we detect and remove from our platforms. As part of our regular CIB reports, we’re sharing information about all the networks we take down over the course of a month, so people can see the progress we’re making in one place.

What is CIB?

While we investigate and enforce against any type of inauthentic behavior — including fake engagement, spam and artificial amplification — we approach enforcement against these mostly financially motivated activities differently from how we counter foreign interference or domestic influence operations. We routinely take down less sophisticated, high-volume inauthentic behavior like spam, and we do not announce these enforcement actions when we take them.

We view influence operations as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-state campaigns (CIB) and 2) coordinated inauthentic behavior on behalf of a foreign or government actor (FGI).

Coordinated Inauthentic Behavior (CIB)
When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)
If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Continuous Enforcement
We monitor for efforts to re-establish a presence on Facebook by networks we previously removed for CIB. Using both automated and manual detection, we continuously remove accounts and Pages connected to networks we took down in the past.

Summary of April 2020 Findings

This month, we removed eight networks of accounts, Pages and Groups. Two of them — from Russia and Iran — focused internationally (FGI), and the remaining six — in the US, Georgia, Myanmar and Mauritania — targeted domestic audiences in their respective countries (CIB). We have shared information about our findings with law enforcement, policymakers and industry partners.

We know that people looking to mislead others — whether through phishing, scams, or influence operations — try to leverage crises to advance their goals, and the coronavirus pandemic is no different. All of the networks we took down for CIB in April were created before the COVID-19 pandemic began; however, we’ve seen the people behind these campaigns opportunistically use coronavirus-related posts, among many other topics, to build an audience and drive people to their Pages or off-platform sites. The majority of the networks we took down this month were still trying to grow their audience or had a large portion of the engagement on their Pages generated by their own accounts.

  • Total number of Facebook accounts removed: 732
  • Total number of Instagram accounts removed: 162
  • Total number of Pages removed: 793
  • Total number of Groups removed: 200

Networks Removed in April 2020:

  1. Russia: We removed 46 Pages, 91 Facebook accounts, 2 Groups, and 1 Instagram account. This network posted in Russian, English, German, Spanish, French, Hungarian, Serbian, Georgian, Indonesian and Farsi, focusing on a wide range of regions around the world. Our investigation linked this activity to individuals in Russia, the Donbass region in Ukraine and two media organizations in Crimea — NewsFront and SouthFront. We found this network as part of our internal investigation into suspected coordinated inauthentic behavior in the region.
  2. Iran: We removed 118 Pages, 389 Facebook accounts, 27 Groups, and 6 Instagram accounts. This activity originated in Iran and focused on a wide range of countries globally including Algeria, Bangladesh, Bosnia, Egypt, Ghana, Libya, Mauritania, Morocco, Nigeria, Senegal, Sierra Leone, Somalia, Sudan, Tanzania, Tunisia, the US, UK and Zimbabwe. Our investigation linked this activity to the Islamic Republic of Iran Broadcasting Corporation. We found this network as part of our internal investigations into suspected coordinated inauthentic behavior, based in part on some links to our past takedowns.
  3. US: We removed 5 Pages, 20 Facebook accounts, and 6 Groups that originated in the US and focused domestically. Our investigation linked this activity to individuals associated with the QAnon network known to spread fringe conspiracy theories. We found this activity as part of our internal investigations into suspected coordinated inauthentic behavior ahead of the 2020 election in the US.
  4. US: We removed 19 Pages, 15 Facebook accounts, and 1 Group that originated in the US and focused domestically. Our investigation linked this network to VDARE, a website known for posting anti-immigration content, and individuals associated with a similar website The Unz Review. We found this activity as part of our internal investigations into suspected coordinated inauthentic behavior ahead of the 2020 election in the US.
  5. Mauritania: We removed 11 Pages, 75 Facebook accounts, and 90 Instagram accounts. This network originated in Mauritania and focused on domestic audiences. We detected this operation as a result of our internal investigation into suspected coordinated inauthentic behavior linked to our past takedowns.
  6. Myanmar: We removed 3 Pages, 18 Facebook accounts, and 1 Group. This domestic-focused network originated in Myanmar. Our investigation linked this activity to members of the Myanmar Police Force. We found this network as part of our internal investigation into suspected coordinated inauthentic behavior in the region.
  7. Georgia: We removed 511 Pages, 101 Facebook accounts, 122 Groups, and 56 Instagram accounts. This domestic-focused activity originated in Georgia. Our investigation linked this network to Espersona, a media firm in Georgia. This organization is now banned from our platforms. We found this activity as part of our investigation into suspected coordinated inauthentic behavior publicly reported by a local fact-checking organization in Georgia, with some links to our past takedowns.
  8. Georgia: Finally, we removed 23 Facebook accounts, 80 Pages, 41 Groups, and 9 Instagram accounts. This domestic-focused activity originated in Georgia. Our investigation linked this network to individuals associated with United National Movement, a political party. We found this activity as part of our investigation into suspected coordinated inauthentic behavior in the region. Our assessment benefited from local public reporting in Georgia.
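The per-network figures above add up to the totals listed in the summary. The short Python sketch below tallies them as a sanity check; the network labels and the tuple layout are shorthand introduced here for illustration only and do not reflect any internal tooling.

    # Illustrative tally of the per-network removal figures listed above.
    # Assumed tuple layout for this sketch: name, Facebook accounts,
    # Instagram accounts, Pages, Groups.
    networks = [
        ("Russia",               91,  1,  46,   2),
        ("Iran",                389,  6, 118,  27),
        ("US (QAnon)",           20,  0,   5,   6),
        ("US (VDARE)",           15,  0,  19,   1),
        ("Mauritania",           75, 90,  11,   0),
        ("Myanmar",              18,  0,   3,   1),
        ("Georgia (Espersona)", 101, 56, 511, 122),
        ("Georgia (UNM)",        23,  9,  80,  41),
    ]

    labels = ["Facebook accounts", "Instagram accounts", "Pages", "Groups"]
    for column, label in enumerate(labels, start=1):
        total = sum(network[column] for network in networks)
        print(f"Total {label} removed: {total}")
    # Expected output: 732, 162, 793, 200, matching the summary above.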

We are making progress rooting out this abuse, but as we’ve said before, it’s an ongoing effort. We’re committed to continually improving to stay ahead. That means building better technology, hiring more people and working more closely with law enforcement, security experts and other companies.

See the detailed report for more information.


