Meta

Race Data Measurement and Meta’s Commitment to Fair and Inclusive Products

Update on July 28, 2022 at 6:00 AM PT:

We’re following through on requests made by the civil rights community, academics and regulators to look at the experiences people from historically and systemically marginalized communities have on our platforms and how our technologies may impact them. Over the next few months, people using Instagram in the US may see a survey prompt asking them for their race or ethnicity. This survey is part of Meta’s broader, long-term effort to help ensure that our products are built responsibly and benefit the people who use them. For example, analysis we conduct with this information might help us better understand experiences different communities may have when it comes to how we rank content on Instagram.

We partnered with YouGov, which will serve as the survey administrator, and with Northeastern University (NU), Texas Southern University (TSU), the University of Central Florida (UCF) and Oasis Labs (OL) as third-party data facilitators. OL also helped develop the approach, which draws on secure multi-party computation (SMPC), a method that preserves privacy. We’ve partnered with TSU, a Historically Black College and University (HBCU), and UCF, a Hispanic-Serving Institution (HSI), because they primarily serve communities that have been historically and systemically marginalized. We are committed to understanding how people from marginalized communities experience our technologies and will continue to partner with HBCUs and HSIs.
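To give a sense of how SMPC can keep individual survey answers private while still allowing aggregate analysis, here is a minimal sketch of additive secret sharing, one common building block of SMPC. This is an illustrative example only, not the protocol Meta and its partners deployed; all names and values are hypothetical.

```python
import secrets

# Additive secret sharing over a prime field: a common SMPC building block.
# Illustrative sketch only -- not the deployed protocol.

PRIME = 2**61 - 1  # a Mersenne prime defining the arithmetic field


def share(value, n_parties):
    """Split `value` into n random shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]


def reconstruct(shares):
    """Recombine shares; any strict subset of shares reveals nothing."""
    return sum(shares) % PRIME


# Each respondent's answer (here, a 0/1 category indicator) is split among
# three facilitators; only the combined aggregate is ever revealed.
tally_shares = [0, 0, 0]
for answer in [1, 0, 1, 1]:  # hypothetical survey responses
    for i, s in enumerate(share(answer, 3)):
        tally_shares[i] = (tally_shares[i] + s) % PRIME

print(reconstruct(tally_shares))  # aggregate count: 3
```

Because the shares are uniformly random, no single facilitator learns anything about an individual response; only the reconstructed sum, an aggregate statistic, is revealed.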

Originally published on November 18, 2021 at 11:00 AM PT:

The intersection of technology and civil rights is an emerging space that requires further attention from the entire industry, especially given the perceived rise of digital discrimination and bias. As questions have been raised about technology’s potential effects on members of marginalized communities, we’ve heard the calls for more research.

Today, I want to explain more about our approach to understanding how people from marginalized communities experience Meta technologies. Some people have said that their opportunities are limited or that they’re having a different experience than others, but we don’t have the data to fully understand what may be happening and why. We can’t address what we can’t measure, so establishing a more accurate measurement framework is vital to creating more inclusive products, policies and operations across the company.

To do this right, we can’t work alone. We’re consulting with the civil rights community, privacy experts, academics, regulators and other organizations on the best way to measure these potential differences in people’s experiences. As we’ve explored options for measurement, experts reaffirmed that any work we do must take into account privacy, security and transparency. 

To start, we plan to introduce a framework for studying our platforms and identifying opportunities to increase fairness when it comes to race in the United States. We have explored using aggregate US Census and ZIP code data, which is an accepted way of measuring demographics in the US. However, this approach has limitations, so we plan to augment it with two methodologies that will produce more accurate insights, in a way that allows important measurement while honoring people’s privacy.
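The Census-based approach mentioned above can be sketched simply: map each ZIP code to published demographic proportions and average over a user population. The ZIP codes and proportions below are made up for illustration; real work would use actual Census tables, and the sketch also shows the approach’s limitation, since it yields only aggregate geographic estimates, never individual-level data.

```python
# Illustrative sketch of aggregate, ZIP-code-based demographic estimation.
# All ZIP codes and proportions below are hypothetical placeholders.

# Hypothetical Census-derived demographic proportions per ZIP code.
zip_demographics = {
    "00001": {"group_a": 0.60, "group_b": 0.40},
    "00002": {"group_a": 0.25, "group_b": 0.75},
}


def estimate_mix(user_zip_codes):
    """Estimate a population's aggregate demographic mix from its ZIP codes."""
    totals = {}
    for z in user_zip_codes:
        for group, p in zip_demographics.get(z, {}).items():
            totals[group] = totals.get(group, 0.0) + p
    n = len(user_zip_codes)
    return {g: t / n for g, t in totals.items()}


mix = estimate_mix(["00001", "00001", "00002"])
# For this made-up input: group_a is about 0.48, group_b about 0.52.
print(mix)
```

Every user from the same ZIP code contributes identical proportions, which is exactly why aggregate geographic data can miss real differences in individual experiences.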

Learn more about these methodologies in our technical paper. While this work will initially focus on race in the US, it will help us lay the groundwork for addressing concerns from other marginalized communities here and around the world, consistent with our corporate human rights policy.

I joined the company at the beginning of the year to establish a new civil rights organization as part of our commitment following an independent audit of our policies and practices. As civil rights expert Laura Murphy stated in the audit, Meta has a “responsibility to ensure that the algorithms and machine learning models that can have important impacts on billions of people do not have unfair or adverse consequences.” We know this journey will not be easy, but we remain committed to this work, to doing it thoughtfully, and to being transparent about our efforts.