Meta

Defending Our Ability to Protect and Empower People on Our Apps

Takeaways

  • Today, we filed an amicus brief with the US Supreme Court urging it not to undermine the internet as we know it, so people can continue connecting with each other and millions of small businesses can grow and thrive.
  • In the case, Gonzalez v. Google, the Supreme Court is considering whether Section 230 protects the ability of companies like ours to recommend and filter content in ways that are useful.
  • Without this protection, millions of online companies would not be able to help keep people safe by reducing and blocking dangerous content, such as terrorist material, while continuing to provide social and economic opportunities to people and advertisers.

Next month, the US Supreme Court will hear arguments in a case that could make it much harder for millions of online companies like Meta to provide the type of services that people enjoy using every day — to facilitate deep connections with friends and family, to discover new places, interests and communities, and to help millions of small businesses grow and thrive. The case, Gonzalez v. Google, asks whether Section 230 protects the ability of an online service to sort, organize and recommend the growing number of posts, videos, photos, customer reviews and other content created and shared online by hundreds of millions of people every day.

Over the past quarter-century, Section 230 has enabled the internet to revolutionize the way we live. For example, it has helped companies like Spotify to introduce people to new music and to connect up-and-coming artists with new audiences; companies like Etsy to connect small businesses with new customers; and fundraising platforms like Kickstarter and GoFundMe to empower millions to contribute to causes they care about.

One way Meta helps people to build community is by using algorithms to recommend connections and content you might be interested in — for example, new Facebook groups you might want to join, pages you might like, or events you might want to attend — and by ranking content so that you are more likely to see the posts you care most about. This technology also helps protect our community by filtering, blocking, and reducing the spread of content that violates our policies or is otherwise problematic.

The sheer volume of user-generated content on the internet means that online companies have to make decisions about how to organize, prioritize, and deprioritize this content in ways that are useful to people and advertisers, while enforcing their policies against terrorism and other harmful content. Meta has invested billions of dollars to develop sophisticated safety and security systems that work to identify, block, and remove terrorist content quickly — typically before it is ever seen by any users. Section 230 was enacted to allow companies to do exactly this. Exposing companies to liability for decisions to organize and filter content from among the vast array of content posted online would incentivize them to simply remove more content in ways Congress never intended.

Here are the key arguments we make in the amicus brief filed today with the Court: