
Improving People’s Experiences Through Community Forums

By Brent Harris, VP, Governance

Takeaways

  • We’re announcing community forums as a new tool to help us make decisions that govern our technologies. 
  • Community forums bring together diverse groups of people from all over the world to discuss tough issues, consider hard choices and share their perspectives on a set of recommendations. 
  • Developing rules this way could improve the quality of governance by giving more people a voice in how we make decisions on some of the most difficult questions we face.

To help us navigate some of the most difficult questions we face across our technologies, we’re adding a new method for making decisions called community forums. These forums bring together diverse groups of people from all over the world to discuss tough issues, consider hard choices and share their perspectives on a set of recommendations that could improve the experiences people have across our apps and technologies every day.

As the next step, the Behavioral Insights Team (BIT), in consultation with the Deliberative Democracy Lab at Stanford’s Center on Democracy, Development and the Rule of Law, will host a virtual community forum on our behalf in December. The forum will be hosted on Stanford’s Online Deliberation platform and will include nearly 6,000 people from 32 countries as part of our broader effort to create programs that foster more inclusive decision-making and enable more people to have a voice in the development of our apps and technologies.

BIT will use Stanford’s Deliberative Polling method, a form of deliberative process that organizations and governments around the world have used for decades. While deliberative processes such as citizens’ assemblies and panels are a well-established way of informing decisions, it’s extremely rare for them to be convened at this global scale.

For this deliberation, we chose to focus on conduct in closed virtual spaces so that the forum could advise on policy and product development for virtual experiences such as Horizon Worlds. Participants will have the opportunity to speak with each other and to ask a range of experts questions about relevant topics, including security, privacy and social media, before making a series of non-binding recommendations. Ahead of these discussions, participants will have a chance to review educational materials, developed in consultation with Stanford’s Deliberative Democracy Lab and BIT, about virtual social experiences and how the Code of Conduct in Virtual Experiences is enforced in these spaces. These materials have also been vetted for neutrality by an external group of advisors with expertise in content moderation, virtual reality and freedom of expression.

After the deliberation, we’ll release the results publicly, along with a response and an explanation of any actions we’ll take. BIT and Stanford’s Deliberative Democracy Lab will also issue guidance assessing how the process went and how we might improve the program’s effectiveness as we evaluate the potential to hold more forums in the future.

We’ve seen encouraging signs already about the impact that these forums can have. For example, we previously worked with BIT to test whether community forums could be helpful for advising on what we should do about misleading climate content. The pilots we conducted — in Nigeria, India, Brazil, the US and France — gave us valuable insights into how people felt about this specific content moderation issue, demonstrated the value of this deliberative approach and affirmed people’s willingness to work together to reach a recommendation.

We are proud to explore this new approach to digital governance. These experiments are key to helping us discover more equitable and innovative ways to make complex decisions, and we’re excited about the opportunity to help build toward the metaverse with global voices at the very heart of the process.

