This post was originally published in The New York Times.
When does a company become too big or too successful to exist? Chris Hughes, a co-founder of Facebook, argues that Facebook should be dismantled because “big” poses a risk to society. In my view — and that of most people who write about technology’s impact on society — what matters is not size but rather the rights and interests of consumers, and our accountability to the governments and legislators who oversee commerce and communications.
Mr. Hughes is right that companies should be held accountable for their actions. If people were writing the rules for the internet from scratch today, they wouldn’t want so many important social, political and ethical questions left in the hands of private companies. But the challenges he alludes to, including election interference and privacy safeguards, won’t evaporate if Facebook or any other big tech company is broken up. Fixing these problems requires significant resources — and strong new rules.
We employ 38,000 people globally, and every day more than 2 billion people use Facebook, Instagram or one of our other products. More than 90 million small businesses thrive with the help of our platforms because they can reach a global audience for the first time. And nonprofits of every size raise money and promote their causes across about 200 countries and in every time zone. We’re proud of that.
But with great success comes great responsibility. While we operate under more regulation now than at any point in the history of the company, we believe more should be done. Mark Zuckerberg has been in Paris this week meeting with regulators and with President Emmanuel Macron of France to discuss the impact of technology and the need for legislative solutions.
We concentrate on four key areas: reducing the amount of harmful content that people post; protecting democratic elections; supporting unified rules for data privacy; and making it easier for people to move their data between services. In all these areas, we believe the rules should be written by governments, reflecting their own principles rather than those of private companies like Facebook.
In recent months we’ve also been working with American regulators on how we might introduce significant improvements to our approach on privacy. We are in the unusual position of asking for more regulation, not less.
Mr. Hughes maintains that lawmakers merely marvel at Facebook’s explosive growth and have overlooked their own responsibility to protect the public through more competition.
This argument holds dangerous implications for the American technology sector, the strongest pillar of the economy. And it reveals misunderstandings of Facebook and the central purpose of antitrust law.
The first misunderstanding is about Facebook itself and the competitive dynamics in which we operate. We are a large company made up of many smaller pieces. All of our products and services fight for customers. Each one has at least three or four competitors with hundreds of millions, if not billions, of users. In photo and video-sharing, we compete against services like YouTube, Snapchat, Twitter, Pinterest and TikTok, an emerging competitor.
In messaging, we’re not even the leader in the top three markets — China, Japan and, by our estimate, the United States — where we compete with Apple’s iMessage, WeChat, Line and Microsoft’s Skype. Viewed globally, which is the context in which social media must be understood, China alone has several large social media companies, including powerhouses like Tencent and Sina. It will seem perverse to people in Europe, and certainly in China, to see American policymakers talking about dismantling one of America’s biggest global players.
In this competitive environment, it is hard to sustain the claim that Facebook is a monopoly. Almost all of our revenue comes from digital advertising, and most estimates say Facebook’s share is about 20% of the United States online ad market, which means 80% of all digital ads happen off our platforms.
The second misunderstanding is of antitrust law. These laws, developed in the 1800s, are not meant to punish a company because people disagree with its management. Their main purpose is to protect consumers by ensuring they have access to low-cost, high-quality products and services and, especially in the case of technology, to rapid innovation. That is exactly where Facebook puts its attention: building the best products, free for consumers, and funded by advertisers.
What antitrust law isn’t about is size alone. In Facebook’s case, our size has not only brought innovation; it has also allowed us to make a huge investment in protecting the safety and security of our services.
Over the past two years we’ve focused heavily on blocking foreign adversaries from trying to influence democratic elections by using our platforms. We’ve done the same to protect against terrorism and hate speech and to better safeguard people’s data. And the resources that we will spend on security and safety this year alone will be more than our overall revenues at the time of our initial public offering in 2012. That would be pretty much impossible for a smaller company.
Big in itself isn’t bad. Success should not be penalized. Our success has given billions of people around the globe access to new ways of communicating with one another. Earning money from ads means we can provide those tools to people for free. Facebook shouldn’t be broken up — but it does need to be held to account. Anyone worried about the challenges we face in an online world should look at getting the rules of the internet right, not dismantling successful American companies.