By Monika Bickert, Vice President of Global Policy Management
Update on July 17, 2018:
After watching the program, we want to give you some more facts about a couple of important issues raised by Channel 4 News.
Cross Check
We want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards. There are no special protections for any group — whether on the right or the left. ‘Cross Check’ — the system described in Dispatches — simply means that some content from certain Pages or Profiles is given a second layer of review to make sure we’ve applied our policies correctly.
This typically applies to high-profile, regularly visited Pages or pieces of content on Facebook so that they are not mistakenly removed or left up. Many media organizations’ Pages — from Channel 4 to The BBC and The Verge — are cross-checked. We may also cross-check reports on content posted by celebrities, governments, or Pages where we have made mistakes in the past. For example, we have cross-checked an American civil rights activist’s account to avoid mistakenly deleting instances of him raising awareness of the hate speech he was encountering.
To be clear, cross-checking something on Facebook does not protect the profile, Page, or content from being removed. It is simply done to make sure our decision is correct.
Britain First was a cross-checked Page. But the notion that this in any way protected the content is wrong. In fact, we removed Britain First from Facebook in March because its Pages repeatedly violated our Community Standards.
Minors
We do not allow people under 13 to have a Facebook account. If someone is reported to us as being under 13, the reviewer will look at the content on their profile (text and photos) to try to ascertain their age. If they believe the person is under 13, the account will be put on hold and the person will not be able to use Facebook until they provide proof of their age. Since the program aired, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else.
Originally published on July 16, 2018:
People all around the world use Facebook to connect with friends and family and openly discuss different ideas. But they will only share when they feel safe. That’s why we have clear rules about what’s acceptable on Facebook and established processes for applying them. We are working hard on both, but we don’t always get it right.
This week a TV report on Channel 4 in the UK has raised important questions about those policies and processes, including guidance given during training sessions in Dublin. It’s clear that some of what is in the program does not reflect Facebook’s policies or values and falls short of the high standards we expect.
We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again. For example, we immediately required all trainers in Dublin to do a re-training session — and are preparing to do the same globally. We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found.
We provided all this information to the Channel 4 team and included where we disagree with their analysis. Our Vice President for Global Policy Solutions, Richard Allan, also answered their questions in an on-camera interview. Our written response and a transcript of the interview can be found in full here and here.
It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true. Creating a safe environment where people from all over the world can share and connect is core to Facebook’s long-term success. If our services aren’t safe, people won’t share and, over time, will stop using them. Nor do advertisers want their brands associated with disturbing or problematic content.
How We Create and Enforce Our Policies
More than 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages: everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues — from bullying and hate speech to terrorism and war crimes. It’s why we developed our Community Standards with input from outside experts — including academics, NGOs and lawyers from around the world. We hosted three Facebook Forums in Europe in May, where we were able to hear from human rights and free speech advocates, as well as counter-terrorism and child safety experts.
These Community Standards have been publicly available for many years, and this year, for the first time, we published the more detailed internal guidelines used by our review teams to enforce them.
To help us manage and review content, we work with several companies across the globe, including CPL, the company featured in the program. These teams review reports 24 hours a day, seven days a week, across all time zones and in dozens of languages. When needed, they escalate decisions to Facebook staff with deep subject matter and country expertise. For specific, highly problematic types of content such as child abuse, the final decisions are made by Facebook employees.
Reviewing reports quickly and accurately is essential to keeping people safe on Facebook. This is why we’re doubling the number of people working on our safety and security teams this year to 20,000. This includes over 7,500 content reviewers. We’re also investing heavily in new technology to help deal with problematic content on Facebook more effectively. For example, we now use technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove terrorist propaganda and child sexual abuse images before they’ve even been reported.
We are constantly improving our Community Standards and we’ve invested significantly in being able to enforce them effectively. This is a complex task, and we have more work to do. But we are committed to getting it right so Facebook is a safe place for people and their friends.