An Update on Meta’s Civil Rights Progress

By Roy L. Austin, Jr., Vice President and Deputy General Counsel, Civil Rights

Takeaways

  • We’re sharing key highlights of our civil rights progress since 2021. 
  • These include launching civil rights training for employees across the company, building on our work to help ensure that our AI systems don’t spread content that is harmful or disrespectful to historically marginalized communities, and developing new technology to more equitably distribute ads on our apps. 
  • We are also announcing a new consent-driven, publicly available dataset to enable researchers to better evaluate the fairness and robustness of certain types of AI models. 

Today, we’re sharing an update on Meta’s civil rights work. Since our last update in 2021, we’ve continued to make progress when it comes to advancing civil rights throughout the company. The following are key highlights:

  • Helping Researchers Measure Fairness in our AI Models. Today, we’re releasing Casual Conversations v2, a consent-driven, publicly available dataset that enables researchers to better evaluate how fair and robust certain types of AI models are, with the goal of making them more inclusive. 
  • Launched Civil Rights Training for all Employees. We introduced civil rights training for employees to better equip them to identify and address civil rights issues in their day-to-day work. Since we launched the training in July 2022, it has been taken by more than 50,000 full-time employees. 
  • Responsible AI Associations. We continue our work to build more responsible AI systems that do not spread content that is harmful or disrespectful to historically marginalized communities. Last year, Meta assembled a cross-functional team to better understand what can cause potentially problematic associations within our systems, and it has already implemented some mitigations to reduce the chances of this happening on our platforms. By listening to people with lived experiences, subject matter experts and policymakers, we hope to proactively promote and advance the responsible design and operation of AI systems. 
  • Developed New Machine Learning Technology to More Equitably Distribute Ads on our Apps. We collaborated with the Department of Justice to develop and launch the Variance Reduction System (VRS), which is a new technology that will help distribute ads in a more equitable way on our apps. We launched VRS in the US for housing ads and will expand it to employment and credit ads. 
  • Providing Hate Crime Enforcement Trainings. We continue to collaborate with the James Byrd, Jr. Center to Stop Hate at the Lawyers’ Committee for Civil Rights Under Law to deliver hate crime training to law enforcement officers, prosecutors and public safety officials. We focus on how to use our products in a nondiscriminatory manner, underscore our prohibition against operating fake accounts and highlight the importance of engaging with marginalized communities and people impacted by hate. 
  • Developing More Equitable and Inclusive Experiences for the Metaverse. We continue to build for the metaverse with marginalized communities in mind. We have introduced more educational experiences, such as the work of VR for Good in producing The March 360 in Horizon Worlds, which allows people “to share the feeling of marching with the 250,000-plus people” who came to witness Dr. Martin Luther King deliver his “I Have a Dream” speech, and MLK: Now is the Time. We continue to engage with civil rights communities about governance and policy in the metaverse, and are developing more inclusive avatars that represent different cultures, religions and regions to help build recognition and a sense of belonging.
  • Improved our Penalty System to be More Effective and Equitable. As a result of feedback from the civil rights community and the Oversight Board, we recently announced updates to our penalty system to make it more effective and equitable. While we still remove violating content just as before, under our new system we focus more on helping people understand why we have removed their content, which has proven to help prevent re-offending. 
  • Established a Scholarship to Support the Next Generation of Civil Rights and Tech Leaders. In partnership with the Dorothy I. Height Education Foundation and the National Council of Negro Women, we are supporting the Project Height Scholarship Program, which will fund 26 scholarships for students with a career interest in civil rights and technology. 
  • Piloting Project Height, a Civil Rights Product Review Framework. We began piloting a framework that helps product teams proactively assess potential civil rights impacts and implications on the front-end of the product development process. 
  • Increased Transparency Across Social Issue, Electoral and Political Ads. As part of our ongoing commitment to transparency, we made detailed targeting information (such as location, demographics and interests) for social issue, electoral or political (SIEP) ads available to vetted academic researchers and updated our publicly available Ad Library to include a summary for SIEP ads. This follows feedback from the civil rights community on how more information about advertisers’ targeting choices is critical to understanding the impact of digital advertising on elections and social discourse.
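To make the fairness evaluation mentioned above concrete, here is a minimal illustrative sketch (not Meta's actual evaluation code) of the kind of check a consent-driven, demographically annotated dataset like Casual Conversations v2 can support: comparing a model's accuracy across annotated subgroups. The group labels, predictions and the `accuracy_gap` metric below are hypothetical toy examples.

```python
# Illustrative fairness check: does a model perform uniformly across
# demographic subgroups? All data below is a hypothetical toy example.
from collections import defaultdict

def accuracy_by_group(predictions, labels, groups):
    """Return per-group accuracy for a set of model predictions."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, group in zip(predictions, labels, groups):
        total[group] += 1
        if pred == label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

def accuracy_gap(per_group):
    """Largest accuracy difference between any two groups; 0 is most uniform."""
    scores = per_group.values()
    return max(scores) - min(scores)

# Toy example: binary predictions over two annotated subgroups.
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
labels = [1, 0, 0, 1, 0, 1, 1, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

per_group = accuracy_by_group(preds, labels, groups)
print(per_group)                 # {'A': 0.75, 'B': 0.5}
print(accuracy_gap(per_group))   # 0.25 — a smaller gap means more uniform performance
```

A real evaluation would use a model's outputs on the dataset itself and richer metrics (e.g. false positive rate gaps per group), but the core idea is the same: disaggregate performance by the attributes the dataset annotates, then measure the spread.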

Equally important is the progress we’ve made against the 117 recommendations and action items set out in Meta’s Civil Rights Audit. As of today, 97 of the 117 recommendations and action items from the audit have either been implemented or represent ongoing work. For example, we will continue to engage with outside experts on the subject of AI bias, since work on that issue must remain ongoing. Eleven recommendations and action items remain in progress; three are under evaluation; and six were declined. 

We remain the only major tech company to have a vice president of civil rights, a distinction that has contributed to our progress over the last two years. As we continue our work to advance civil rights across Meta’s apps, we look forward to sharing more.

