Meta

Fighting Platform Abuse, Simplifying Privacy in Groups, and Protecting Information While Sharing Data

This is the first post in a new series about privacy improvements we’ve made. As I explained in the introduction to our Privacy Matters page, our privacy work is never finished, and we’re committed to communicating transparently about the steps we’re taking to make our systems better and more protective of people’s privacy. Below are several examples of our recent efforts.

Taking Action Against Platform Abuse

In addition to the proactive technical work we do to protect people’s privacy and prevent data misuse, we take legal action when needed to hold people accountable for abusing our platform.

Today we filed lawsuits in Spain and the US to enforce our Terms against using automated means to abuse our services. The defendants in Spain used automation software designed to evade Instagram’s restrictions against fake engagement, and the defendant in the US asked people to provide their Facebook login credentials and then used these credentials to scrape people’s data from Facebook. In both cases, we sent Cease and Desist letters and disabled accounts prior to filing lawsuits. We ultimately took legal action to hold these defendants accountable for violating our policies and trying to undermine the integrity of our services.

We also recently reached a settlement in our case against Rankwave, a South Korean data analytics company that ran apps on the Facebook platform and failed to verify its compliance with our policies. Rankwave agreed to an audit and a permanent injunction banning it from using Facebook in the future. This legal action was one of the first of its kind and demonstrates our commitment to enforcing our policies to better protect people.

More Consistency for Private Groups

When people are part of a group on Facebook, it’s important they know who can see their information and the content they share. Last year, we simplified the privacy settings for Groups, and based on feedback we’ve received, we recently started notifying people that we are removing the option for groups to change their privacy setting from Private to Public. This limitation already exists for groups with more than 5,000 members, and now we’ve expanded it to all private groups so our settings are clearer and more consistent.

Protecting Privacy in Our Data for Good Program

The COVID-19 pandemic has highlighted how data can help inform response efforts for major crises. In that vein, our Data for Good program aims to help researchers, nonprofits and other groups address these issues, but it's critical that we protect people's privacy while sharing data. That's why we put mechanisms in place to protect people's information when we share it and to give people control over their data. We recently highlighted how we make privacy central to this work.

For example, partners enrolled in the Data for Good program only have access to aggregate information from Facebook; we don't share any individual-level information. Some datasets are shared publicly, but these are formatted to help prevent re-identification, while preserving insights that are useful in responding to crises.
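The post doesn't describe the exact formatting Meta applies, but one common way to share only aggregates while reducing re-identification risk is small-cell suppression: release counts grouped by a coarse attribute and drop any cell covering too few people. A minimal sketch of that idea (the records, field names, and threshold here are hypothetical, not Meta's):

```python
from collections import Counter

# Hypothetical per-person records; only the aggregate ever leaves the system.
records = [
    {"region": "north"}, {"region": "north"}, {"region": "north"},
    {"region": "south"}, {"region": "east"},
]

MIN_CELL = 3  # suppress any aggregate cell covering fewer than 3 people

def aggregate_by_region(rows, min_cell=MIN_CELL):
    """Return region counts, dropping small cells that could single someone out."""
    counts = Counter(r["region"] for r in rows)
    return {region: n for region, n in counts.items() if n >= min_cell}

print(aggregate_by_region(records))  # → {'north': 3}
```

Because the "south" and "east" cells each cover only one person, they are withheld rather than published.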

We also use a differential privacy framework that further protects the privacy of individuals in aggregated datasets by ensuring no specific person can be re-identified. This allows us to make these datasets available publicly to help inform the public sector response to humanitarian crises like the COVID-19 pandemic, while protecting people’s privacy.
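Differential privacy works by adding calibrated random noise to aggregate statistics before release, so the published numbers are barely affected by the presence or absence of any single person. The post doesn't specify Meta's implementation; as an illustration only, here is a minimal sketch of the classic Laplace mechanism for releasing a noisy count:

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    # Adding Laplace noise with scale sensitivity/epsilon makes the released
    # count epsilon-differentially private: one person joining or leaving the
    # dataset changes the count by at most `sensitivity`, and the noise masks
    # a change of that size.
    return true_count + laplace_noise(sensitivity / epsilon)

# The exact count never leaves the system; only the noisy value is published.
noisy = dp_count(1000, epsilon=0.5)
```

Smaller values of epsilon mean more noise and stronger privacy; larger values preserve more accuracy. Across many cells of a dataset, the noise averages out, which is why aggregate insights survive even though no individual can be re-identified.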