On Sunday, February 16, the Financial Times published the following op-ed by Mark Zuckerberg.
Every day, platforms like Facebook have to make trade-offs between important social values: between free expression and safety, between privacy and law enforcement, and between creating open systems and locking down data.
There is rarely a clear “right” answer. Often, how decisions are made, and whether people see that process as legitimate, matters as much as the decisions themselves.
I don’t think private companies should make so many decisions alone when they touch on fundamental democratic values. That is why last year I called for regulation in four areas: elections, harmful content, privacy and data portability.
On Monday Facebook is publishing our second white paper setting out some questions regulation might address. We’ve also been working with governments — including in France and New Zealand — on what regulation could look like. A few themes kept coming up.
One is transparency. Governments often tell us it’s hard to design content regulation because they don’t have insight into how our systems work. Facebook already publishes more detailed reports about harmful content than any other major internet service, and we’ve shown regulators how our systems operate. We’re also looking at opening up our content moderation systems for external audit.
Then there are political ads. We believe advertising is more transparent on Facebook than television, print or other online services. We publish details about political and issue ads — including who paid for them, how much was spent, and how many people were reached — in our ads library.
But who decides what counts as political advertising in a democracy? If a non-profit runs an ad about immigration during an election, is it political? Who should decide — private companies, or governments?
Another theme is openness. I’m glad the EU is looking at making data sharing easier, because it enables people to build things that are valuable for society. International agencies use Facebook’s Data for Good programme to figure out which communities need help after natural disasters, and governments use our publicly available population density maps for vaccination campaigns.
Of course, you should always be able to transfer your data between services. But how do we define what counts as your data? If I share something with you, like my birthday, should you be able to take that data to other services, like your calendar app? Is that my data or yours?
We have to balance promoting innovation and research against protecting people’s privacy and security.
Without clear rules on portability, strict privacy laws encourage companies to lock down data and refuse to share it with others in order to minimise regulatory risk.
Lastly, we need more oversight and accountability. People need to feel that global technology platforms answer to someone, so regulation should hold companies accountable when they make mistakes.
Companies like mine also need better oversight when we make decisions, which is why we’re creating an independent Oversight Board so people can appeal Facebook’s content decisions.
Tech companies should serve society. That includes at the corporate level, so we support the OECD’s efforts to create fair global tax rules for the internet.
I believe good regulation may hurt Facebook’s business in the near term, but it will be better for everyone, including us, over the long term.
These are problems that need to be fixed and that affect our industry as a whole. If we don’t create standards that people feel are legitimate, they won’t trust institutions or technology.
Of course, we won’t agree with every proposal. Regulation can have unintended consequences, especially for small businesses that can’t do sophisticated data analysis and marketing on their own. Millions of small businesses rely on companies like ours to do this for them.
If regulation makes it harder for them to share data and use these tools, that could disproportionately hurt them and inadvertently advantage larger companies that can do this work themselves.
Still, rather than relying on individual companies to set their own standards, we would all benefit from a more democratic process. That is why we’re pushing for new legislation, and why we support existing US proposals aimed at preventing election interference, such as the Honest Ads Act and the Deter Act.
To be clear, this isn’t about passing off responsibility. Facebook is not waiting for regulation; we’re continuing to make progress on these issues ourselves.
But I believe clearer rules would be better for everyone. The internet is a powerful force for social and economic empowerment. Regulation that protects people and supports innovation can ensure it stays that way.