How Facebook Is Preparing for Myanmar’s 2020 Election

By Rafael Frankel, Director of Public Policy, Southeast Asia, Emerging Markets

Update on February 2, 2021 at 5:45PM PT:

This week, the military in Myanmar took control of the government in a coup and detained senior officials from the ruling National League for Democracy (NLD). We’re treating the situation in Myanmar as an emergency and reducing the distribution of content that likely violates our hate speech and incitement policies, as well as content that praises or supports post-election violence including a coup.


On November 8, Myanmar voters will go to the polls for the second democratic election in the country’s recent history. Beginning next week, political parties and candidates in the Southeast Asian nation will vie for national and regional leadership in what is anticipated to be a hotly contested campaign, complicated by the COVID-19 pandemic, as all elections are in 2020.

Despite the challenges that November presents, Facebook continues to focus on our responsibility to ensure the integrity of Myanmar’s election on our platform. We also recognize Myanmar’s complex social and political context and are sensitive to the tumultuous changes and serious violence that have taken place since the country’s last election in 2015.

This is why many teams at Facebook have worked over the past few years to better understand how our platform is used in Myanmar and how we can play a part in helping to prevent harm. We’ve also built a team that is dedicated to Myanmar. This includes people who spend significant time on the ground working with civil society partners who are advocating on a range of human and digital rights issues across Myanmar’s diverse, multi-ethnic society. Our goal is to understand and address current issues and those that are on the horizon. 

We remain committed to advancing the social and economic benefits of Facebook in Myanmar. Although we know that this work will continue beyond November, we acknowledge that Myanmar’s 2020 general election will be an important marker along the journey.

Today, we’re sharing some important updates on the work we’ve done, and will continue to do, in the lead-up to the vote, and on some of the progress we have made. This includes improving our ability to detect and remove hate speech and content that incites violence, our ongoing work to reduce the spread of harmful misinformation, the removal of inauthentic networks in Myanmar that seek to manipulate public opinion, and our engagement with key stakeholders in Myanmar to ensure that Facebook is responsive to local needs.

Preventing Voter Suppression

Facebook has expanded our misinformation policy in Myanmar so that we now remove misinformation that could lead to voter suppression or damage the integrity of the electoral process. Working with local partners, between now and November 22 we will remove verifiable misinformation and unverifiable rumors that are assessed as having the potential to suppress the vote or undermine electoral integrity.

For example, we would remove posts falsely claiming a candidate is a Bengali, not a Myanmar citizen, and thus ineligible.

Combating Hate Speech

We also recognize that certain types of content, such as hate speech, could lead to imminent offline harm and could also suppress the vote. We have a clear and detailed policy against hate speech, and we remove violating content as soon as we become aware of it.

To do this, we’ve invested significantly in proactive detection technology to help us catch violating content more quickly. We also use AI to proactively identify hate speech in 45 languages, including Burmese.
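
As an illustration of what proactive (pre-report) enforcement means in practice, here is a minimal sketch, assuming a hypothetical `classify` function that scores text for hate speech; Facebook’s actual models, thresholds and review pipeline are not public.

```python
from typing import Callable

# Minimal sketch of proactive (pre-report) enforcement. The `classify`
# function stands in for a multilingual hate-speech classifier; Facebook's
# actual models and thresholds are not public.
def moderate_post(text: str,
                  classify: Callable[[str], float],
                  threshold: float = 0.9) -> str:
    """Route a post based on a hate-speech probability score in [0, 1]."""
    score = classify(text)
    if score >= threshold:
        return "remove"        # high-confidence violation: act before any report
    if score >= 0.5:
        return "human_review"  # borderline: queue for human reviewers
    return "keep"
```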

We have continued to invest in improving this technology and our overall enforcement against hate speech as the election approaches. In the second quarter of 2020, we took action against 280,000 pieces of content in Myanmar for violations of our Community Standards prohibiting hate speech, of which we detected 97.8% proactively before it was reported to us. This is up significantly from Q1 2020, when we took action against 51,000 pieces of content for hate speech violations, detecting 83% proactively.
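
To make those figures concrete, the arithmetic below, using only the numbers quoted above, shows roughly how many pieces of content were caught proactively versus surfaced by user reports in each quarter.

```python
# Proactive rate = content found by automated systems before any user
# report, divided by all content actioned. Figures are from this post.
quarters = {
    "Q1 2020": {"actioned": 51_000, "proactive_rate": 0.83},
    "Q2 2020": {"actioned": 280_000, "proactive_rate": 0.978},
}

for name, q in quarters.items():
    proactive = q["actioned"] * q["proactive_rate"]
    reported = q["actioned"] - proactive
    print(f"{name}: ~{proactive:,.0f} caught proactively, "
          f"~{reported:,.0f} surfaced by user reports")
# Q1 2020: ~42,330 caught proactively, ~8,670 surfaced by user reports
# Q2 2020: ~273,840 caught proactively, ~6,160 surfaced by user reports
```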

Making It Easier to Understand Ads About Social Issues, Elections or Politics in Myanmar

We’re also introducing more transparency when it comes to issue, electoral and political ads, going far beyond the standard in print and broadcast media. As of this month, all of these ads in Myanmar must carry a “Paid for by” disclaimer showing the organization or person behind the ad. All ads about social issues, elections or politics are also stored in our searchable Ad Library for seven years. The library includes additional insights about these ads to help journalists, regulators, researchers, watchdog groups and others learn more about them and help hold advertisers and Facebook accountable.
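
For example, the Ad Library is queryable programmatically through the Graph API’s ads_archive endpoint. The sketch below shows one way a researcher might pull political ads that reached Myanmar; the access token is a placeholder, and the parameter and field names follow the public Ad Library API documentation and may have changed across API versions.

```python
import requests

# Query the public Ad Library API for political ads that reached Myanmar.
# YOUR_ACCESS_TOKEN is a placeholder; endpoint and field names follow the
# publicly documented Graph API and may differ across API versions.
resp = requests.get(
    "https://graph.facebook.com/v8.0/ads_archive",
    params={
        "search_terms": "election",
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['MM']",
        "fields": "page_name,ad_creative_body,funding_entity,ad_delivery_start_time",
        "access_token": "YOUR_ACCESS_TOKEN",
    },
)

for ad in resp.json().get("data", []):
    # "funding_entity" corresponds to the "Paid for by" disclaimer.
    print(ad.get("page_name"), "|", ad.get("funding_entity"))
```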

Making Pages More Transparent

We also want to make sure people are using Facebook authentically and understand who is speaking to them. To that end, we are working with two partners in Myanmar to verify the official national Facebook Pages of political parties. So far, more than 40 political parties have been given a verified badge. This places a blue tick on a party’s official Page, making it easier for users to distinguish a real, official political party Page from unofficial ones, which is important during an election campaign period.

Limiting the Spread of Misinformation

To provide people using the platform with additional context before they share images that are more than a year old and could be potentially harmful or misleading, we introduced an Image Context reshare product in Myanmar in June. Out-of-context images are often used to deceive, confuse and cause harm. With this product, users are shown a message when they attempt to share specific types of images, including photos that are over a year old and that may come close to violating Facebook’s guidelines on violent content. The warning that the image they are about to share could be harmful or misleading is triggered using a combination of artificial intelligence (AI) and human review.
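
The real triggering system is not public, but a toy sketch of the kind of check described above might look like the following, with `violence_score` standing in for an AI classifier’s output that human reviewers can confirm.

```python
from datetime import datetime, timedelta

ONE_YEAR = timedelta(days=365)

def should_show_context_warning(image_first_seen: datetime,
                                violence_score: float,
                                threshold: float = 0.8) -> bool:
    """Warn before resharing images that are over a year old and come
    close to violating violent-content guidelines.

    `violence_score` and `threshold` are hypothetical stand-ins for the
    AI-plus-human-review signal described in this post.
    """
    is_old = datetime.utcnow() - image_first_seen > ONE_YEAR
    is_borderline = violence_score >= threshold
    return is_old and is_borderline
```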

Messenger Forwarding Limits

We have also introduced a new feature that limits to five the number of times a message can be forwarded. These limits are a proven method of slowing the spread of viral misinformation that has the potential to cause real-world harm. This safety feature is available in Myanmar and, over the next few weeks, we will make it available to Messenger users worldwide.
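
As a rough illustration, a per-message forwarding cap could be enforced with a counter carried on each forwarded copy, as in this hypothetical sketch; Messenger’s actual implementation is not public.

```python
MAX_FORWARDS = 5  # the cap described in this post

class Message:
    def __init__(self, body: str, forward_count: int = 0):
        self.body = body
        self.forward_count = forward_count

def forward(message: Message) -> Message:
    """Create a forwarded copy, refusing once the cap is reached."""
    if message.forward_count >= MAX_FORWARDS:
        raise PermissionError("This message has reached its forwarding limit.")
    return Message(message.body, message.forward_count + 1)
```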

Third-Party Fact-Checking

We introduced our third-party fact-checking program in Myanmar in March as part of our ongoing integrity efforts to reduce the spread of misinformation and improve the quality of the news people find online. We now have three fact-checking partners in Myanmar – BOOM, AFP Fact Check and Fact Crescendo.

Preventing and Disrupting Interference

We’re constantly working to find and stop coordinated campaigns that seek to manipulate public debate across our apps. In 2019 alone, we took down over 50 networks worldwide for engaging in coordinated inauthentic behavior (CIB), including ahead of major democratic elections.

Since 2018, we’ve identified and disrupted six networks engaging in coordinated inauthentic behavior in Myanmar. These networks of accounts, Pages and Groups masked their identities to mislead people about who they were, what they were doing and where their content came from.


