
Charting a Way Forward on Online Content Regulation

By Monika Bickert, Vice President, Content Policy

Over the past decade the internet has improved economies, reunited families, raised money for charity and helped bring about political change. However, the internet has also made it easier to share harmful content like hate speech and terrorist propaganda.

Governments, academics and others are debating how to hold internet platforms accountable, particularly in their efforts to keep people safe and protect fundamental rights like freedom of expression.

Last year, Facebook CEO Mark Zuckerberg called for governments to work with online platforms to create and adopt new regulation for online content, noting, “It’s impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardized approach.”

Today, we’re publishing a white paper setting out some questions that regulation of online content might address. 

Charting a Way Forward: Online Content Regulation builds on recent developments on this topic, including legislative efforts and scholarship. 

Moving the Conversation Forward

The paper poses four questions that go to the heart of the debate about regulating content online:

  • How can content regulation best achieve the goal of reducing harmful speech while preserving free expression? By requiring systems such as user-friendly channels for reporting content or external oversight of policies or enforcement decisions, and by requiring procedures such as periodic public reporting of enforcement data, regulation could provide governments and individuals the information they need to accurately judge social media companies’ efforts.
  • How can regulations enhance the accountability of internet platforms? Regulators could consider certain requirements for companies, such as publishing their content standards, consulting with stakeholders when making significant changes to standards, or creating a channel for users to appeal a company’s content removal or non-removal decision. 
  • Should regulation require internet companies to meet certain performance targets? Companies could be incentivized to meet specific targets such as keeping the prevalence of violating content below some agreed threshold.
  • Should regulation define which “harmful content” should be prohibited on the internet? Laws restricting speech are generally implemented by law enforcement officials and the courts. Internet content moderation is fundamentally different. Governments should create rules to address this complexity — that recognize user preferences and the variation among internet services, can be enforced at scale, and allow for flexibility across language, trends and context. 

Guidelines for Future Regulation 

The development of regulatory solutions should involve not just lawmakers, private companies and civil society, but also those who use online platforms. The following principles are based on lessons we’ve learned from our work in combating harmful content and our discussions with others.

  • Incentives. Ensuring accountability in companies’ content moderation systems and procedures will be the best way to create the incentives for companies to responsibly balance values like safety, privacy, and freedom of expression.
  • The global nature of the internet. Any national regulatory approach to addressing harmful content should respect the global scale of the internet and the value of cross-border communications. It should aim to increase interoperability among regulators and regulations.
  • Freedom of expression. In addition to complying with Article 19 of the International Covenant on Civil and Political Rights (ICCPR) and related guidance, regulators should consider the impacts of their decisions on freedom of expression.
  • Technology. Regulators should develop an understanding of the capabilities and limitations of technology in content moderation and allow internet companies the flexibility to innovate. An approach that works for one particular platform or type of content may be less effective (or even counterproductive) when applied elsewhere.
  • Proportionality and necessity. Regulators should take into account the severity and prevalence of the harmful content in question, its status in law, and the efforts already underway to address the content.

If designed well, new frameworks for regulating harmful content can contribute to the internet’s continued success by articulating clear ways for government, companies, and civil society to share responsibilities and work together. Designed poorly, these efforts risk unintended consequences that might make people less safe online, stifle expression and slow innovation.

We hope today’s white paper helps to stimulate further conversation around the regulation of content online. It builds on a paper we published last September on data portability, and we plan on publishing similar papers on elections and privacy in the coming months.


