
Tightening Our Policies and Expanding Resources to Prevent Suicide and Self-Harm

By Antigone Davis, Global Head of Safety

As a global online community, we consider keeping people safe on our apps incredibly important. Since 2006, we’ve worked with experts from around the world to inform our policies, practices and products supporting those at risk of suicide or self-injury.

Today, on World Suicide Prevention Day, we’re sharing an update on what we’ve learned and some of the steps we’ve taken in the past year, as well as additional actions we’re going to take, to keep people safe on our apps, especially those who are most vulnerable.

Improving How We Handle Suicide and Self-Injury Related Content

Earlier this year, we began hosting regular consultations with experts from around the world to discuss some of the more difficult topics associated with suicide and self-injury. These include how we deal with suicide notes, the risks of sad content online and newsworthy depictions of suicide. Further details of these meetings are available on Facebook’s new Suicide Prevention page in our Safety Center. 

As a result of these consultations, we’ve made several changes to improve how we handle this content. We tightened our policy around self-harm so that graphic cutting images are no longer allowed, even when someone is seeking support or expressing themselves to aid their recovery, to avoid unintentionally promoting or triggering self-harm. On Instagram, we’ve also made this type of content harder to search for and kept it from being recommended in Explore. We’ve taken steps to address the complex issue of eating disorder content on our apps by tightening our policy to prohibit additional content that may promote eating disorders. Even with these stricter policies, we’ll continue to send resources to people who post content promoting eating disorders or self-harm, even if we take the content down. Lastly, we now display a sensitivity screen over healed self-harm cuts to help avoid unintentionally promoting self-harm.

Our engagement with experts has proven so valuable that we’re also hiring a health and well-being expert to join our safety policy team. This person will focus exclusively on the health and well-being impacts of our apps and policies, and will explore new ways to improve support for our community, including on topics related to suicide and self-injury. 

And for the first time, we’re also exploring ways to share public data from our platform on how people talk about suicide, beginning with providing academic researchers with access to the social media monitoring tool, CrowdTangle. To date, CrowdTangle has been available primarily to help newsrooms and media publishers understand what is happening on Facebook. But we are eager to make it available to two select researchers who focus on suicide prevention to explore how information shared on Facebook and Instagram can be used to further advancements in suicide prevention and support.

In addition to all we are doing to find more opportunities and places to surface resources, we’re continuing to build new technology to help us find and take action on potentially harmful content, including removing it or adding sensitivity screens. From April to June of 2019, we took action on more than 1.5 million pieces of suicide and self-injury content on Facebook and found more than 95% of it before it was reported by a user. During that same period, we took action on more than 800,000 pieces of this content on Instagram and found more than 77% of it before it was reported by a user. (Updated on September 10, 2019 at 11:30 AM PT to share more information on how we handle potentially harmful content.)

Becoming a Safer Forum for Difficult Conversations

Experts have told us that one of the most effective ways to prevent suicide is for people to hear from friends and family who care about them. Facebook has a unique role in facilitating those kinds of connections and we’re taking additional steps to support those who are discussing these sensitive topics, especially young people. 

To help young people safely discuss topics like suicide, we’re enhancing our online resources by including Orygen’s #chatsafe guidelines in Facebook’s Safety Center and in resources on Instagram when someone searches for suicide or self-injury content.

The #chatsafe guidelines were developed together with young people to provide support to those who might be responding to suicide-related content posted by others, or for those who might want to share their own experiences with suicidal thoughts, feelings or behaviors.

We’ll continue to invest in people, technology and resources so that we can do more to protect people on our apps. Visit our Suicide Prevention Resource page to learn more about what’s available.


