
How Facebook is Preparing for Ethiopia’s 2021 General Election

Today, we’re sharing an update on our election integrity work ahead of Ethiopia’s general election on June 21st. This work will continue in the lead-up to, during and after the vote, and builds on our longstanding efforts to understand and address the way social media is used in Ethiopia. It includes our efforts to detect and remove hate speech and content that incites violence, our ongoing work to reduce the spread of misinformation, our efforts to improve digital literacy, and the steps we’re taking to make political advertising more transparent.

These efforts are informed by the conversations we’re having with human rights groups, NGOs, local civil society organizations, and regional experts within Facebook. They are being implemented by a team purpose-built to focus on the Ethiopian election. Because local understanding is critical to doing this work effectively, our team includes a number of people in and from Ethiopia, including experts in misinformation, disinformation, hate speech and elections.

Activating Our Elections Operations Center

Facebook opened its first Elections Operations Center in 2018, ahead of the elections held that year in the United States and Brazil. Since then, we’ve run operations centers for major elections around the world, and we’re doing the same for the upcoming election in Ethiopia.

At the onset of the COVID-19 pandemic, our Elections Operations Center transitioned from a physical to a virtual workspace. However, we’re still bringing together subject matter experts from across the company — including from our threat intelligence, data science, engineering, research, operations, policy and legal teams — so we can respond in real time to potential problems and abuses we see emerging in Ethiopia.

Tackling Hate Speech and Other Harmful Content

Our Community Standards — which set out what is and isn’t allowed on Facebook — cover a number of areas relevant to elections, including policies against harassment and incitement to violence, as well as detailed hate speech policies that ban attacks on people based on characteristics like ethnicity or religion. When we become aware of content that violates these rules, we remove it.

We’ve significantly improved and simplified our reporting tools to make it easier for Ethiopians to tell us when they see violating content, so we can investigate. To further broaden awareness of our policies in Ethiopia, we’ve run online ad and radio campaigns and held training sessions with activists, civil society organizations, small and medium-sized business owners, government agencies and members of the local media. We’ve also established dedicated reporting channels for specialized international and local human rights and civil society organizations to make sure we can quickly review problematic content they identify for possible violations, and we continue to work with local partners who provide us with feedback that we incorporate into our policies and programs.

Alongside these efforts to improve reporting, we’ve also invested in proactive detection technology that helps us catch violating content before people report it to us. We’re now using this technology to proactively identify hate speech in Amharic and Oromo, alongside over 40 other languages globally. 
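Facebook has not published the internals of these systems, but as a rough, hypothetical sketch of the general shape of proactive detection, the snippet below scores new posts in supported languages (here Amharic, “am”, and Oromo, “om”) with a classifier and queues anything above a threshold for human review before anyone reports it. The classifier, threshold value and language codes are assumptions for illustration only.

```python
# Illustrative sketch only: the real classifiers, thresholds and review tooling are not public.
from dataclasses import dataclass
from typing import Callable, List

SUPPORTED_LANGUAGES = {"am", "om"}  # Amharic and Oromo (assumed language codes)
REVIEW_THRESHOLD = 0.8              # hypothetical score above which a post is queued for review

@dataclass
class Post:
    post_id: str
    language: str
    text: str

def proactively_flag(posts: List[Post], classifier: Callable[[str], float]) -> List[str]:
    """Return the IDs of posts a (hypothetical) hate speech classifier flags for human review."""
    return [
        post.post_id
        for post in posts
        if post.language in SUPPORTED_LANGUAGES
        and classifier(post.text) >= REVIEW_THRESHOLD
    ]

# Toy usage with a stand-in classifier that returns a canned score.
print(proactively_flag([Post("p1", "am", "example text")], classifier=lambda text: 0.93))  # ['p1']
```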

Over the last few years, we’ve tripled the size of the global team working on safety and security to over 35,000 people and hired more content reviewers who are native speakers of Amharic, Oromo and Somali; we also have the capacity to review content in Tigrinya.

These investments are having an impact: between March 2020 and March 2021, we removed 87,000 pieces of hate speech in Ethiopia, about 89% of which were detected proactively.

We are taking additional temporary steps ahead of and during the election to reduce the distribution of content and comments that our proactive detection technology identifies as likely containing hate speech or violence and incitement, while our teams investigate it. This content will be removed if we determine it violates our policies.
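The two-step approach described above can be sketched roughly as follows; the actual ranking and review systems are not public, so the threshold, demotion factor and data structures here are invented purely for illustration.

```python
# Hypothetical sketch of "demote while under review, remove if a violation is confirmed".
DEMOTION_THRESHOLD = 0.7  # assumed detection score that triggers a temporary demotion
DEMOTION_FACTOR = 0.2     # assumed multiplier applied to the post's normal distribution

def apply_temporary_demotion(post_state: dict, detection_score: float) -> None:
    """Reduce a post's distribution while it awaits human review."""
    if detection_score >= DEMOTION_THRESHOLD:
        post_state["distribution_multiplier"] = DEMOTION_FACTOR
        post_state["pending_review"] = True

def resolve_review(post_state: dict, violates_policy: bool) -> None:
    """Remove confirmed violations; otherwise restore normal distribution."""
    post_state["pending_review"] = False
    if violates_policy:
        post_state["removed"] = True
    else:
        post_state["distribution_multiplier"] = 1.0

post = {"distribution_multiplier": 1.0, "pending_review": False, "removed": False}
apply_temporary_demotion(post, detection_score=0.85)
resolve_review(post, violates_policy=False)
```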

In addition to our standard practice of removing accounts that repeatedly violate our Community Standards, we are also continuing to reduce the distribution of content posted by accounts that have recently and repeatedly posted violating content in countries facing heightened risks. This means fewer people in Ethiopia will see content from these repeat offenders located in those countries.

Combating Misinformation and False News

Because we know it’s important for people to see accurate information on Facebook and Instagram, we are working to fight the spread of misinformation on our services in Ethiopia. We remove the most serious kinds of misinformation, such as content that is intended to suppress voting or that could cause violence or physical harm. For content that doesn’t violate these particular rules, we’ve partnered with independent third-party fact-checking partners in Ethiopia — PesaCheck and AFP — to ascertain whether something is misinformation or false news. When they review and rate a piece of content as false, we reduce its distribution so fewer people see it and add a warning label with more information for anyone who does see it. In general, when a warning screen is placed on a post, 95% of the time people don’t click past it.
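As a minimal sketch of that fact-checking flow (the rating labels, demotion factor and label text are assumptions, not Facebook’s actual values), the logic amounts to: if an independent fact-checker rates a piece of content false, reduce its distribution and attach a warning label.

```python
# Illustrative fact-checking flow; the rating names, reduction factor and label text are assumed.
FALSE_RATINGS = {"false", "altered", "partly false"}  # hypothetical rating labels

def apply_fact_check(post_state: dict, rating: str) -> None:
    """Demote and label a post that an independent fact-checker has rated false."""
    if rating.lower() in FALSE_RATINGS:
        post_state["distribution_multiplier"] = 0.2  # assumed reduction in reach
        post_state["warning_label"] = "False information. Checked by independent fact-checkers."

post = {"distribution_multiplier": 1.0, "warning_label": None}
apply_fact_check(post, rating="False")
print(post["warning_label"])
```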

Addressing Misleading, Outdated or Out of Context Imagery

We often see people try to deceive, abuse or cause harm by sharing news articles that are taken out of context, or outdated or misleading images accompanied by false claims and misinformation. To address this, we’ve temporarily expanded our misinformation policy in Ethiopia. We are removing out-of-context imagery that makes false allegations about the perpetrators, severity or targets of violence in Ethiopia. This is based on guidance from over 50 local partners and independent experts who have told us these specific claims could result in violence or physical harm. We’ve also launched tools that notify people when a news article they’re about to share is more than 90 days old. People will also see a message when they attempt to share specific types of images, including photos that are over a year old, warning them that the image they are about to share could be harmful or misleading. (Updated on August 16, 2021 at 9:39AM PT to further clarify how we are addressing out-of-context imagery in Ethiopia.)
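The age checks behind those notification tools are straightforward to express. The sketch below is a hypothetical illustration: the post only describes a 90-day rule for articles and “over a year old” for certain images, so the exact cutoffs and message text here are assumptions.

```python
# Hypothetical sketch of the stale-content notices; thresholds and wording are assumptions.
from datetime import date, timedelta
from typing import Optional

ARTICLE_NOTICE_AGE = timedelta(days=90)  # articles older than 90 days trigger a notice
IMAGE_NOTICE_AGE = timedelta(days=365)   # "over a year old" images; exact rule assumed

def share_notice(content_type: str, published: date, today: Optional[date] = None) -> Optional[str]:
    """Return a notice message if the content is old enough to warrant one, else None."""
    age = (today or date.today()) - published
    if content_type == "article" and age > ARTICLE_NOTICE_AGE:
        return "This article is more than 90 days old."
    if content_type == "image" and age > IMAGE_NOTICE_AGE:
        return "This image is over a year old and could be misleading out of context."
    return None

print(share_notice("article", date(2021, 1, 5), today=date(2021, 6, 21)))
```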

Improving the Transparency of Political Advertising

We believe political discussion and debate should be transparent to every voter, which is why over the past few years we’ve introduced a number of tools that provide more information about political ads on Facebook and Instagram. In March this year, we made these political ads transparency tools mandatory in Ethiopia. As a result, anybody who wants to run political ads in Ethiopia must now go through a verification process to prove who they are and that they live in Ethiopia. We then run additional checks to ensure compliance with our policies.

Political ads in Ethiopia will be labeled with a “Paid for by” disclaimer, so you can see who paid for them. We also put political ads that run in Ethiopia in our Ads Library so that everyone can see what ads are running, information about targeting, and how much was spent. This fully searchable archive will store these ads for seven years.

In addition to providing more transparency, earlier this year we also announced that we are rolling out new controls so that people can choose to see fewer social issue, electoral, and political ads. When people use these controls, they’ll no longer see ads that run with a “Paid for by” disclaimer. These changes mean that political advertising on Facebook and Instagram is now more transparent than other forms of election campaigning, whether that’s billboards, newspaper ads, direct mail, leaflets or targeted emails.
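To make that sequence concrete, here is a hedged sketch of the checks described above: a political ad can only run once the advertiser has verified their identity and Ethiopian residence, the ad carries a “Paid for by” disclaimer, and a copy is kept in a searchable archive for seven years. The field names, error handling and retention logic are illustrative assumptions, not Facebook’s actual systems.

```python
# Illustrative sketch of the political ads transparency checks; all names and fields are assumed.
from datetime import date, timedelta
from typing import Optional

AD_LIBRARY: list = []                # stand-in for the searchable Ads Library
RETENTION = timedelta(days=7 * 365)  # stored for seven years (approximated as 7 * 365 days)

def submit_political_ad(advertiser: dict, ad: dict, today: Optional[date] = None) -> dict:
    """Apply (hypothetical) transparency checks before a political ad can run in Ethiopia."""
    if not (advertiser.get("identity_verified") and advertiser.get("country") == "ET"):
        raise PermissionError("Advertiser must verify their identity and residence in Ethiopia.")
    ad["disclaimer"] = f"Paid for by {advertiser['name']}"
    ad["archive_until"] = (today or date.today()) + RETENTION
    AD_LIBRARY.append(ad)            # visible to everyone, along with targeting and spend information
    return ad

submit_political_ad(
    {"name": "Example Organization", "identity_verified": True, "country": "ET"},
    {"creative": "example creative", "spend_usd": 100},
)
```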

Supporting Digital Literacy

Finally, we’re also investing in digital literacy in Ethiopia through our work with local partners. We’ve partnered with the Center for African Leadership Studies to implement “My Digital World”, a series of live webinars through which we’ve engaged with more than 7,000 people in the country on topics such as online safety, privacy, digital citizenship, and news and media literacy. We’ve also rolled out a media literacy campaign aimed at educating and informing people on how to detect potential false news, and we ran billboard advertising campaigns across Addis Ababa, the first of their kind in Africa, focused on informing and educating people on how to stay safe online and use social media responsibly.