
Partnering with Experts to Protect People from Self-Harm and Suicide

By Antigone Davis, Global Head of Safety

Update on September 29, 2021 at 11:50AM PT:

Information in this article may be outdated. For current information about our suicide and self-injury content detection technology, please visit our Safety Center. As described in the Safety Center, our algorithms are intended to help identify potential suicide and self-injury content and are not intended to diagnose or treat any mental health or other condition.

Originally published on February 7, 2019 at 9:54AM PT:

We care deeply about the safety of our community, and with the advice of experts we set policies for what is and isn’t allowed on the platform. For example, while we don’t allow people to celebrate or promote self-harm or suicide, we do let people share admissions of self-harm so their friends and family have an opportunity to reach out, offer support and provide help or resources. And when there’s a risk of imminent harm, we work with emergency responders who can help. Today, we’re sharing updates on how we enforce our policies against self-harm and suicide.

Since 2006, we’ve built our approach with experts in suicide prevention and safety. We seek their input on current research and best practices to ensure everyone’s safety is being considered. We constantly re-examine how we’re doing as we develop new products or see people using our services in new ways. In some cases, a single experience can cause us to pause and question whether we need to make changes. That is what we’ve done following the tragic death of a young girl by suicide in the UK. Bringing together more than a dozen experts from around the world, many of whom helped us develop our policies in the first place, we asked them how we could better weigh two important goals that are sometimes at odds: the opportunity to get help and share paths to recovery for people who might be in harm’s way, and the possibility that we may unintentionally promote self-harm or remove content that might shame or trigger the poster to self-harm. Four main themes emerged from this discussion:

First, these experts unanimously reaffirmed that Facebook should allow people to share admissions of self-harm and suicidal thoughts, but should not allow people to share content promoting it. They stressed the importance of giving room for people to share the challenges they are going through, including admitting thoughts or actions of self-harm. They said this content, though tragic and upsetting to some, often helps people connect with support and resources, aiding their recovery and saving lives.

But the experts also advised that some graphic images of self-harm, particularly cutting, can unintentionally promote self-harm even when they are shared in the context of admission or a path to recovery. As a result, based on their feedback, we will no longer allow graphic cutting images even in the context of admission, and we will begin enforcing this policy in the coming weeks. To learn more about this, visit the Instagram Info Center.

We also discussed whether other kinds of content — like healed cutting scars shared to tell the story of recovery or certain sad memes — might unintentionally promote self-harm. This is an area where there is incomplete and sometimes competing research and the experts suggested that we continue to monitor the latest findings. We’ll do so as we continue discussions around this topic.

Finally, the experts emphasized the importance of building products that facilitate supportive connections, finding more opportunities to offer help and resources, and importantly, avoiding shaming people who post about their self-harm thoughts or actions. We will continue to provide resources, including messages with links to helplines, and over the coming weeks we will explore additional steps or products we can provide within Instagram and Facebook.

We are grateful for our continued partnership with these experts on such important issues — their guidance is crucial as we work to better protect our community.

Experts Consulted [Updated on February 7, 2019 at 1:30PM PT to include additional expert]:

Australia: Orygen – Dr. Jo Robinson

Brazil: Instituto Vita Alere – Dr. Karen Scavacini; Safernet – Thiago Tavares and Juliana Cunha

Bulgaria: Safer Internet – Georgi Apostolov

Canada: Kids Help Phone – Alisa Simon

India: iCALL, Tata Institute of Social Sciences – Aparna Joshi

Mexico: University of Guadalajara, Neurosciences Department – Dr. Luis Miguel Sanchez-Loyo

Philippines: Philippine General Hospital, Child Protection Unit – Dr. Norie Balderrama

Thailand: Samaritans of Thailand – Trakarn Chensy

United Kingdom: Bristol University, Bristol Medical School – Dr. Lucy Biddle; Centre for Mental Health – Sarah Hughes; The Mix UK – Chris Martin; Samaritans – Jacqui Morrissey; Papyrus – Ged Flynn and Lisa Roxby

United States: Save.org – Dr. Dan Reidenberg; The National Suicide Prevention Lifeline – John Draper
