Meta

Requesting Oversight Board Guidance on Our Cross-Check System

Update on March 3, 2023 at 5:00AM PT:

Today, we are responding to the 32 recommendations the Oversight Board made last year as part of their Policy Advisory Opinion on our cross-check system. Of those recommendations, we are fully implementing 11, partially implementing 15, still assessing the feasibility of one and taking no further action on five.

This will result in substantial changes to how we operate this system in response to the feedback we’ve received from the board and other stakeholders we’ve engaged with over several years. In particular, we will make cross-check more transparent through regular reporting and fine-tune our criteria for inclusion on the list to better account for human rights interests and equity. We will also change cross-check’s operational systems to help reduce the backlog of review requests and decrease the time it takes to review cases.

Together with the steps we announced late last year, which are outlined below, these actions will improve this system to make it more effective, accountable and equitable. Our full responses to the Oversight Board’s recommendations are available here.

Holding Meta accountable for our content policies and processes, as well as our decisions, is exactly why the Oversight Board was established. We thank the board for the care and attention it gave this case. 

To learn more about the cross-check system, visit our Transparency Center.

Update on December 6, 2022 at 3:05AM PT:

Today, the Oversight Board published a Policy Advisory Opinion on our cross-check system, following our request last year.

We built the cross-check system to prevent potential over-enforcement (when we take action on content or accounts that don’t actually violate our policies) and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe, such as when journalists are reporting from conflict zones.

We have teams and resources devoted to improving the cross-check system, and we have continued that work over the past year.

Given the number of recommendations, we’ve agreed with the board to review and respond to them fully within 90 days. To learn more about the cross-check system, visit our Transparency Center.

Update on October 21, 2021 at 5:05AM PT:

The Oversight Board announced that it will review our cross-check system and provide recommendations on how we can improve it. You can read more in our Transparency Center.

Originally published on September 28, 2021 at 7:00AM PT:

Facebook will ask the Oversight Board — in the form of a Policy Advisory Opinion — for recommendations about how we can continue to improve our cross-check system. Specifically, we will ask the board for guidance on the criteria we use to determine what is prioritized for a secondary review via cross-check, as well as how we manage the program.  

Facebook reviews billions of pieces of content every day, has 40,000 people working on safety and security, and has built some of the most sophisticated technology to help with content enforcement. Despite that, we know we are going to make mistakes. The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk for a mistake. This could include activists raising awareness of instances of violence, journalists reporting from conflict zones or other content from high-visibility Pages and profiles where correct enforcement is especially important given the number of people who could see it.

We know the system isn’t perfect. We have new teams and resources in place, and we are continuing to make improvements, but more are needed. The Oversight Board’s recommendations will be a big part of this continued work.

We have already implemented one of the board’s recommendations related to cross-check from a previous case by describing the system in our Transparency Center. Recently, the board expressed interest in looking further into our cross-check system. This referral goes beyond the briefing we have already provided: we are proactively asking the board for its input through a formal and transparent process.

Holding Facebook accountable for our content policies and processes is exactly why the Oversight Board was established. Over the coming weeks and months, we will continue to brief the board on our cross-check system and engage with them to answer their questions. We welcome their recommendations and the independent oversight they provide. Per the bylaws, we will publicly respond to their recommendations within 30 days.