Meta Asks Oversight Board to Advise on COVID-19 Misinformation Policies

By Nick Clegg, President, Global Affairs

Takeaways

  • We’re asking the Oversight Board whether our current COVID-19 misinformation policy is still appropriate.
  • Under this policy, we began removing false claims about masks, social distancing, vaccines and more.
  • Now that the COVID-19 situation has evolved, we’re seeking the Oversight Board’s opinion on whether we should change the way we address this type of misinformation through other means, like labeling or demoting it.

Update on June 16, 2023 at 6:00AM PT:

Today, we are releasing our response to the recommendations the Oversight Board made in its COVID-19 Misinformation Policy Advisory Opinion.

We will take a more tailored approach to our COVID-19 misinformation rules, consistent with the Board’s guidance and our existing policies. In countries that have a COVID-19 public health emergency declaration, we will continue to remove content that violates our COVID-19 misinformation policies, given the risk of imminent physical harm. We are consulting with health experts to understand which claims and categories of misinformation could continue to pose this risk. Our COVID-19 misinformation rules will no longer be in effect globally, as the global public health emergency declaration that triggered those rules has been lifted.

To learn more about our response to the Board, visit our Transparency Center.

Originally published on July 26, 2022 at 5:00AM PT:

Meta is asking the Oversight Board for advice on whether measures to address dangerous COVID-19 misinformation, introduced in extraordinary circumstances at the onset of the pandemic, should remain in place as many, though not all, countries around the world seek to return to more normal life.

Misinformation related to COVID-19 has presented unique risks to public health and safety for more than two years. To keep our users safe while still allowing them to discuss and express themselves on this important topic, we broadened our harmful misinformation policy in the early days of the outbreak in January 2020. Before then, Meta removed misinformation only when local partners with relevant expertise told us a particular piece of content (like a specific post on Facebook) could contribute to a risk of imminent physical harm. The change meant that, for the first time, the policy provided for removal of entire categories of false claims on a worldwide scale.

As a result, Meta has removed COVID-19 misinformation on an unprecedented scale. Globally, more than 25 million pieces of content have been removed since the start of the pandemic. Under this policy, Meta began removing false claims about masking, social distancing and the transmissibility of the virus. In late 2020, when the first vaccine became available, we also began removing further false claims, such as claims that the vaccines are harmful or ineffective. Meta’s policy currently provides for removal of 80 distinct false claims about COVID-19 and vaccines.

Meta remains committed to combating COVID-19 misinformation and providing people with reliable information. As the pandemic has evolved, the time is right for us to seek input from the Oversight Board about our measures to address COVID-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remain the right approach for the months and years ahead. The world has changed considerably since 2020. Guidance from public health authorities is now more readily available, and Meta’s COVID-19 Information Center has connected over two billion people across 189 countries to helpful, authoritative COVID-19 information.

The pandemic itself has also evolved. In many countries where vaccination rates are relatively high, life is increasingly returning to normal. But this isn’t the case everywhere, and the course of the pandemic will continue to vary significantly around the globe, especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.

Meta is fundamentally committed to free expression, and we believe our apps are an important way for people to make their voices heard. But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content from spreading. But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges like those of the pandemic. That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.

The Oversight Board was established to exercise independent judgment, operating as an expert-led check and balance for Meta, with the ability to make binding decisions on specific content cases and to offer non-binding advisory opinions on Meta’s policies. We are requesting an advisory opinion from the Oversight Board on whether Meta’s current measures to address COVID-19 misinformation under our harmful health misinformation policy continue to be appropriate, or whether we should address this misinformation through other means, like labeling or demoting it, either directly or through our third-party fact-checking program.
