How We’re Taking Action Against Vaccine Misinformation Superspreaders

By Monika Bickert, Vice President, Content Policy

In recent weeks, there has been a debate about whether the global problem of COVID-19 vaccine misinformation can be solved simply by removing 12 people from social media platforms. People who have advanced this narrative contend that these 12 people are responsible for 73% of online vaccine misinformation on Facebook. There isn’t any evidence to support this claim. Moreover, focusing on such a small group of people distracts from the complex challenges we all face in addressing misinformation about COVID-19 vaccines.

That said, any amount of COVID-19 vaccine misinformation that violates our policies is too much by our standards — and we have removed over three dozen Pages, groups and Facebook or Instagram accounts linked to these 12 people, including at least one linked to each of them, for violating our policies. We have also imposed penalties on nearly two dozen additional Pages, groups or accounts linked to these 12 people, such as moving their posts lower in News Feed so fewer people see them, or not recommending them to others. We’ve applied penalties to some of their website domains as well, so any posts containing their website content are moved lower in News Feed. The remaining accounts associated with these individuals are not posting content that breaks our rules, have posted only a small amount of violating content, which we’ve removed, or are simply inactive. In fact, these 12 people are responsible for only about 0.05% of all views of vaccine-related content on Facebook. This figure includes all vaccine-related posts they’ve shared, whether true or false, as well as URLs associated with these people.

The report upon which the faulty narrative is based analyzed only a narrow set of 483 pieces of content posted over six weeks in just 30 groups, some with as few as 2,500 members. These posts are in no way representative of the hundreds of millions of posts that people have shared about COVID-19 vaccines on Facebook in recent months. Further, there is no explanation of how the organization behind the report identified the content it describes as “anti-vax,” or how it chose the 30 groups included in the analysis. There is no justification for the claim that this data constitutes a “representative sample” of the content shared across our apps.

Focusing on these 12 individuals misses the forest for the trees. We have worked closely with leading health organizations since January 2020 to identify and remove COVID-19 misinformation that could contribute to a risk of someone spreading or contracting the virus. Since the beginning of the pandemic, we have removed over 3,000 accounts, Pages and groups across our entire platform for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation, and removed more than 20 million pieces of content for breaking these rules.

None of this is to suggest that our work is done or that we are satisfied. Tracking and combating vaccine misinformation is a complex challenge, made more difficult by the lack of common definitions of what constitutes misinformation, and by the reality that guidance from scientific and health experts has evolved and will continue to evolve throughout the pandemic. That’s why we’re continuing to work with external experts and governments to make sure we are approaching these issues in the right way, and making adjustments where necessary. In the meantime, we will continue doing our part to show people reliable information about COVID-19 vaccines from health experts and help people get vaccinated.
