Additional Steps to Protect Myanmar’s 2020 Election

In August we shared an update on the work we’re doing to prepare for the November elections in Myanmar. Today, we’re announcing some additional steps aimed at protecting the integrity of that election on our platform. 

Demoting Likely Hate Speech 

As we announced in May, and in keeping with our commitment to the UN Guiding Principles on Business and Human Rights, Facebook is taking additional steps to combat hate speech in countries that are in conflict or at risk of conflict. 

To decrease the risk of problematic content going viral in Myanmar and potentially inciting violence or hatred ahead of or during the election, we will significantly reduce the distribution of content that our proactive detection technology identifies as likely hate speech. This content will be removed if determined to violate our policies, but its distribution will remain reduced until that determination is made.
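
To give a rough sense of what this kind of demotion-until-review logic can look like, here is a minimal illustrative sketch. The classifier threshold, the demotion factor, and field names such as hate_speech_score are assumptions for illustration, not a description of Facebook's actual systems.

```python
# Illustrative sketch only; the threshold, demotion factor, and field
# names are assumptions, not Facebook's actual implementation.
from dataclasses import dataclass

LIKELY_HATE_SPEECH_THRESHOLD = 0.8   # assumed classifier score cutoff
DEMOTION_FACTOR = 0.1                # assumed reduction in distribution

@dataclass
class Post:
    post_id: str
    hate_speech_score: float         # score from a proactive detection model
    base_distribution: float         # baseline ranking weight
    reviewed: bool = False
    violates_policy: bool = False

def distribution_weight(post: Post) -> float:
    """Return a ranking weight, demoting likely hate speech until review."""
    if post.reviewed and post.violates_policy:
        return 0.0                   # removed once confirmed as violating
    if post.hate_speech_score >= LIKELY_HATE_SPEECH_THRESHOLD:
        return post.base_distribution * DEMOTION_FACTOR
    return post.base_distribution
```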

Under our existing Community Standards, we remove certain slurs that we determine to be hate speech. To complement that effort, we are using technology to identify new words and phrases associated with hate speech in Myanmar, and are either removing posts containing that language or reducing their distribution. We are constantly revising and updating the list of Myanmar-specific prohibited words and phrases.
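
As a simplified picture of this phrase-list approach, the sketch below matches post text against a maintained list and returns an action. The placeholder entries, the normalization step, and the function names are hypothetical and shown only for illustration.

```python
# Hypothetical sketch of matching posts against a maintained list of
# Myanmar-specific prohibited words and phrases. The entries, actions,
# and normalization are assumptions for illustration only.
PROHIBITED_PHRASES = {
    "example_slur_1": "remove",      # placeholder entries, not real terms
    "example_phrase_2": "reduce",
}

def normalize(text: str) -> str:
    # Burmese script has no case distinction; collapse extra whitespace only.
    return " ".join(text.split())

def action_for_post(text: str) -> str:
    """Return 'remove', 'reduce', or 'none' based on the phrase list."""
    normalized = normalize(text)
    for phrase, action in PROHIBITED_PHRASES.items():
        if phrase in normalized:
            return action
    return "none"
```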

Improving Efforts to Reduce Repeat Offender Content 

In addition to our standard practice of removing accounts that repeatedly violate our Community Standards, we will also improve our efforts to temporarily reduce the distribution of content from accounts that have recently and repeatedly violated our policies. This includes providing additional information to those whose accounts are affected.
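
One way to picture a temporary, account-level reduction of this kind is sketched below. The violation window, strike threshold, and demotion factor are invented for the sketch and do not reflect Facebook's actual thresholds or policies.

```python
# Assumed illustration of temporarily reducing distribution for accounts
# with recent, repeated violations. Window length, strike threshold, and
# demotion factor are invented for this sketch.
from datetime import datetime, timedelta
from typing import List

RECENT_WINDOW = timedelta(days=30)   # assumed "recent" window
STRIKE_THRESHOLD = 3                 # assumed number of recent violations
ACCOUNT_DEMOTION_FACTOR = 0.5        # assumed distribution reduction

def recent_violations(violation_times: List[datetime], now: datetime) -> int:
    """Count violations that fall within the recent window."""
    return sum(1 for t in violation_times if now - t <= RECENT_WINDOW)

def account_distribution_multiplier(violation_times: List[datetime],
                                    now: datetime) -> float:
    """Reduce distribution for accounts with repeated recent violations."""
    if recent_violations(violation_times, now) >= STRIKE_THRESHOLD:
        return ACCOUNT_DEMOTION_FACTOR
    return 1.0
```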

Expanding Misinformation Labels to the Burmese Language 

As part of our efforts to tackle misinformation, we work with third-party fact-checkers around the world, including three partners in Myanmar who are certified by the International Fact-Checking Network, to provide people with additional context about the content they’re seeing on Facebook. For example, when people come across information rated false by our third-party fact-checkers, a screen warns them of this. Now, we’re expanding these warning screens to include the Burmese language.
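
As a simplified illustration of how such a warning screen could be localized, the sketch below selects a label based on the fact-check rating and the viewer's locale. The rating values, locale codes, and label strings are assumptions, and the Burmese text is shown only as a placeholder.

```python
# Simplified sketch of choosing a localized warning label for content
# rated false by third-party fact-checkers. Rating names, locale codes,
# and label text are illustrative; the Burmese string is a placeholder.
from typing import Optional

WARNING_LABELS = {
    "en": "False information. Checked by independent fact-checkers.",
    "my": "<Burmese translation of the warning label>",  # placeholder
}

def warning_screen(rating: str, locale: str) -> Optional[str]:
    """Return a localized warning label for false-rated content, if any."""
    if rating != "false":
        return None
    return WARNING_LABELS.get(locale, WARNING_LABELS["en"])
```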

Directing People to Authoritative Voting Information 

We recognize it is important for people to be able to access reliable information about elections and voting, especially during a campaign period. That is why we are helping to direct people to authoritative sources of election information, where they can learn how to check voter lists, as well as voting times and locations. This includes making the Facebook pages of the Union Election Commission, Vote MM and First Time Youth Voters for 2020 much more accessible to users.

Digital Literacy Training in Myanmar

We’ve also been working across Myanmar to train civil society organizations and reporters on journalist safety, media and digital literacy, as well as Facebook’s Community Standards and third-party fact-checking programs. As part of this effort, we’ve held a monthly television talk show on digital literacy called Tea Talks, which focuses on issues like online bullying and account security.

We’ve also introduced tools to newsrooms in Myanmar such as CrowdTangle, a public insights tool from Facebook that makes it easy to follow, analyze and report on what’s happening with public content on social media.

And in June, we held a month of webinars on election best practices with 50 people from 13 different news organizations in Myanmar, including ethnic media. 

Looking ahead, we will continue to support our news partners in Myanmar beyond the election. And we’ll continue to scale up our Community Standards enforcement efforts to meet the challenge of protecting elections and keeping people safe, now and in the future.