The next few months will see several major international sporting events, including the Men’s UEFA EURO 2024 in Germany and the Olympic & Paralympic Games Paris 2024. Across the world, athletes and their fans will be using our apps to connect around these moments. Most of these interactions will be positive, but unfortunately some individuals are likely to use them to be abusive towards others. Today, we are setting out the multiple protections we have put in place across our apps to help protect fans and athletes alike from abusive behavior.
Our Rules Against Abuse
Since 2016, we’ve invested more than $20 billion in safety and security and quadrupled the size of our global team working in this area to around 40,000 people, including 15,000 content reviewers who review content across Facebook, Instagram and Threads.
We have clear rules against bullying, violent threats and hate speech, and we don’t tolerate them on our apps. As well as responding to reports from our community, we use technology outside of private messages to prevent, detect and remove content that might break these rules. Where a piece of content is deemed to violate our policies, our technology will either flag it to our teams for review or remove it automatically where there is a clear violation. In the last quarter, we found and took action on 95% of the content that violated our hate speech policy before it was reported to us.
Features to Help Keep Athletes Safe
People can turn direct message (DM) requests off completely on Instagram, meaning they won’t receive messages from anyone they don’t follow. For those who don’t want to turn off DM requests completely, we’ve developed Hidden Words. When turned on, this feature automatically sends DM requests — including Story replies — containing offensive words, phrases and emojis to a hidden folder so you don’t have to see them. It also hides comments with these terms under your posts.
We also have a number of additional features on both Facebook and Instagram to moderate comments and limit unwanted interactions across your accounts:
- Our Limits feature, when activated, hides comments and DM requests on Instagram from people who don’t follow you or who only started following you recently. You can also choose to limit everyone who isn’t on your Close Friends list. When we detect that someone may be experiencing a rush of comments or DM requests, such as after a football match, we’ll prompt them to turn on Limits.
- When you restrict someone on Instagram, their comments on your posts and their tags of you will only be visible to that person. You can choose to view a comment by tapping “See Comment”; approve it so everyone can see it; delete it; or ignore it. You won’t receive any notifications for comments from a restricted account.
- We’ve seen that tags and mentions can also be used to target or bully others, so we have controls that allow you to manage who can tag or mention you on Instagram and Facebook. You can choose whether you want everyone, only people you follow or no one to be able to tag or mention you in a comment, caption or Story.
- Those managing professional Pages or profiles on Facebook can moderate comments via a feature called Moderation Assist. For example, you can automatically hide all new comments that contain images or links, and review hidden comments in your activity log to decide whether you want them to be visible.
- Blocking is a quick and effective way to stop someone from interacting with you on Facebook and Instagram. Now when you block someone on Instagram, you can also block any other accounts they may already have, or may create in the future, making it even harder for them to contact you.
- We have also recently improved the way we detect scam accounts on Instagram, and now show users a notice reminding them to be cautious when an account may be engaging in fraudulent activity. Improved detection also helps creators moderate fake followers: they can now remove spam followers in bulk, helping keep their accounts free from unwanted interactions.
Encouraging More Supportive Behavior on Instagram
On Instagram, we’re continuing to explore ways to prevent people from posting abusive content in the first place. We use artificial intelligence to detect when someone is trying to post a comment that might be offensive, and warn them it may break our rules. When we tested this intervention, people edited or deleted their comment 50% of the time after seeing these warnings. We have also introduced new nudges that encourage people to pause and rethink before replying to a potentially offensive comment. These nudges are live now for people whose apps are set to English, Portuguese, Spanish, French, Chinese or Arabic. When people send a DM request to a creator or public figure for the first time, we remind them to be respectful, so they remember there’s a real person on the other side of their message.
Helping Protect Women From Abuse
We developed many of our rules, like those against bullying and harassment, threats and hate speech, in consultation with women’s safety groups in order to address the types of harmful content that can disproportionately affect women. Under these rules we take action on a wide range of abusive content: violent or dehumanizing attacks against women based on their sex or gender identity, threats of physical or sexual violence, and attacks that use gendered or misogynistic language. We will remove content that breaks these rules during the Olympic & Paralympic Games, as we do all year round.
In addition to the measures outlined above, last year we released new features on Instagram to better protect people from unwanted images and videos in DMs — something we know can disproportionately impact women, especially those in the public eye:
- Before being able to message someone who doesn’t follow them, people must now send an invite to get their permission to connect. People can only send one invite at a time and can’t send another until the recipient accepts it.
- We’ll limit these message request invites to text only, so people can’t send any photos, videos or voice messages, or make calls until the recipient has accepted the invite to chat.
- These changes mean people won’t receive unwanted photos, videos or other types of media from people they don’t follow.
Working With Partners And Relevant Authorities
We regularly speak to athletes, football players, teams and sporting associations around the world, including UEFA and the IOC/IPC, to make sure they know about our latest safety policies and features, and we listen carefully to their feedback. We’re working closely with teams competing in the UEFA EURO 2024 to help their players turn on our safety tools, such as Hidden Words.
We cooperate with law enforcement in their investigations and respond to valid legal requests for information in accordance with our terms of service and applicable law.