January 2021 Coordinated Inauthentic Behavior Report

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps.

The Purpose of This Report

Over the past three and a half years, we’ve shared our findings about coordinated inauthentic behavior we detect and remove from our platforms. As part of our regular CIB reports, we’re sharing information about all of the networks we take down over the course of a month to make it easier for people to see the progress we’re making in one place.

What is CIB?

We view CIB as coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation. There are two tiers of these activities that we work to stop: 1) coordinated inauthentic behavior in the context of domestic, non-government campaigns and 2) coordinated inauthentic behavior on behalf of a foreign or government actor.

Coordinated Inauthentic Behavior (CIB)

When we find domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts, we remove both inauthentic and authentic accounts, Pages and Groups directly involved in this activity.

Foreign or Government Interference (FGI)

If we find any instances of CIB conducted on behalf of a government entity or by a foreign actor, we apply the broadest enforcement measures including the removal of every on-platform property connected to the operation itself and the people and organizations behind it.

Continuous Enforcement

We monitor for efforts to re-establish a presence on Facebook by networks we previously removed. Using both automated and manual detection, we continuously remove accounts and Pages connected to networks we took down in the past.

Summary of January 2021 Findings

Our teams continue to focus on finding and removing deceptive campaigns around the world — whether they are foreign or domestic. In January, we removed two networks of accounts, Pages and Groups. One network originated in Uganda and targeted domestic audiences in that country; the other originated primarily in Palestine and likewise focused on domestic audiences. We have shared information about our findings with industry partners, researchers and policymakers.

We are making progress rooting out this abuse, but, as we’ve said before, it’s an ongoing effort. We’re committed to continually improving to stay ahead. That means building better technology, hiring more people and working closely with law enforcement, security experts and other companies.

Networks removed in January 2021:

  1. Uganda: We removed 220 Facebook accounts, 32 Pages, 59 Groups and 139 Instagram accounts that originated in Uganda and targeted domestic audiences. Our investigation found links to the Government Citizens Interaction Center at the Ministry of Information and Communications Technology in Uganda. We found this network after reviewing information about a portion of its activity shared with us by researchers at the Atlantic Council’s Digital Forensics Research Lab. Given the impending election in Uganda, we moved quickly to take down this network in early January once we completed our investigation.
  2. Palestine: We removed 206 Facebook accounts, 178 Pages, 3 Groups and 14 Instagram accounts that targeted domestic audiences in Palestine. Our investigation found links to individuals in Palestine and the UAE, in addition to links between a small portion of this network and individuals associated with a recently created marketing firm in Belgium called Orientation Media. We found this network as part of our internal investigation into suspected coordinated inauthentic behavior in the region.

See the full report for more information.