An Update on Our Ads Fairness Efforts


  • Over the past year, we have worked with the Department of Justice to develop new technology to help distribute ads in a more equitable way on our apps. 
  • We’re now launching this technology in the US for housing ads and will expand it later this year to employment and credit ads. 
  • We’re sharing more about how this new system works, including the ways it helps enhance equitable distribution of ads and respects people’s privacy.

As part of our settlement with the Department of Justice (DOJ), which represented the US Department of Housing and Urban Development (HUD), we announced our plan to create the Variance Reduction System (VRS) to help advance the equitable distribution of ads on Meta technologies. After more than a year of collaboration with the DOJ, we have now launched the VRS in the United States for housing ads. Over the coming year, we will extend its use to US employment and credit ads. We have also discontinued the use of Special Ad Audiences, fulfilling another commitment in the settlement.

The Variance Reduction System in Action

The VRS uses new machine learning technology in ad delivery so that the actual audience that sees an ad more closely reflects the eligible target audience for that ad. After an ad has been shown to a large enough group of people, the VRS measures the aggregate demographic distribution of those who have seen it and compares it with the demographic distribution of the eligible target audience selected by the advertiser. To implement this technology in a way that respects people’s privacy, the VRS relies on a widely used measurement method called Bayesian Improved Surname Geocoding (BISG), informed by publicly available US Census statistics, to measure estimated race and ethnicity. This method is built with added privacy enhancements, including differential privacy, a technique that can help protect against re-identification of individuals within aggregated datasets.
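To give a rough picture of the measurement approach described above, here is a minimal sketch of how a BISG-style estimate can combine surname-based and geography-based probabilities, and how Laplace noise (a basic differential-privacy mechanism) can be added to aggregate counts. Every name, probability table, and parameter below is hypothetical; the production system is described in our technical paper.

```python
import math
import random

GROUPS = ["group_a", "group_b", "group_c"]

# Toy stand-ins for Census-derived tables (NOT real data):
# P(group | surname) and P(group | geography).
P_GROUP_GIVEN_SURNAME = {
    "smith": [0.70, 0.20, 0.10],
    "garcia": [0.10, 0.80, 0.10],
}
P_GROUP_GIVEN_GEO = {
    "tract_1": [0.50, 0.30, 0.20],
    "tract_2": [0.20, 0.60, 0.20],
}

def bisg_posterior(surname, tract):
    """Combine surname and geography probabilities (naive-Bayes style)."""
    unnorm = [s * g for s, g in zip(P_GROUP_GIVEN_SURNAME[surname],
                                    P_GROUP_GIVEN_GEO[tract])]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def laplace_noise(scale, rng):
    """Sample Laplace(0, scale) via the inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_group_counts(records, epsilon=1.0, seed=0):
    """Sum per-person posteriors into group counts, then add Laplace
    noise with scale = sensitivity / epsilon (sensitivity is 1 here,
    since one person shifts any count by at most 1)."""
    rng = random.Random(seed)
    counts = [0.0] * len(GROUPS)
    for surname, tract in records:
        for i, p in enumerate(bisg_posterior(surname, tract)):
            counts[i] += p
    return [c + laplace_noise(1.0 / epsilon, rng) for c in counts]
```

Because only noisy aggregate counts leave this function, no individual-level estimate needs to be stored or exposed, which is the privacy property the paragraph above describes.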

Throughout the course of an ad campaign, the VRS keeps measuring the audience’s demographic distribution and continues working to reduce the difference between the actual and eligible audiences.
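One simple way to picture the gap being reduced (this is not Meta’s actual metric; the VRS technical paper defines its own variance measure) is the total variation distance between the two audiences’ demographic distributions:

```python
def total_variation(p, q):
    """Half the L1 distance between two discrete distributions:
    0 means identical, 1 means completely disjoint."""
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical demographic shares, for illustration only.
eligible_audience = [0.40, 0.35, 0.25]  # advertiser's eligible audience
actual_viewers    = [0.55, 0.30, 0.15]  # people the ad was delivered to

gap = total_variation(eligible_audience, actual_viewers)
print(gap)  # 0.15
```

A system of this kind would recompute such a gap periodically as delivery continues and adjust delivery to drive it down.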

Learn more about this new technology in our technical paper and on our AI blog.

Our Work to Further Algorithmic Fairness

Meta embeds civil rights and responsible AI principles into our product development process to help advance our algorithmic fairness efforts while protecting privacy. 

The VRS builds on our longstanding efforts to help protect against discrimination, including restricting certain targeting options for campaigns that advertise housing, employment or credit. For example, we don’t allow advertisers that are based in, or trying to reach people in, the US, Canada and certain European countries to target their housing, employment or credit ads based on age, gender or ZIP code.

Across the industry, approaches to algorithmic fairness are still evolving, particularly as they relate to digital advertising. But we know we cannot wait for consensus to make progress on important concerns about the potential for discrimination, especially in housing, employment and credit ads, where the enduring effects of historically unequal treatment still shape economic opportunities. We will continue to prioritize this work as we collaborate with stakeholders to support industry-wide discussions on how to make digital advertising more fair and equitable.