
What the Wall Street Journal Got Wrong

By Nick Clegg, Vice President of Global Affairs

A lot has been said about Facebook this week. A series of articles published by the Wall Street Journal has focused on some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens. These are serious and complex issues, and it is absolutely legitimate for us to be held to account for how we deal with them. But these stories have contained deliberate mischaracterizations of what we are trying to do, and attributed egregiously false motives to Facebook’s leadership and employees.

At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company. This impugns the motives and hard work of thousands of researchers, policy experts and engineers at Facebook who strive to improve the quality of our products, and to understand their wider (positive and negative) impact. It’s a claim that could only be made by cherry-picking quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there were only ever one right answer.

With any research, there will be ideas for improvement that are worth pursuing and ideas whose trade-offs against other important considerations outweigh the benefit of the proposed fix. The fact that not every idea a researcher raises is acted upon doesn’t mean Facebook teams are not continually considering a range of different improvements. At the same time, none of these issues can be solved by technology companies alone, which is why we work in close partnership with researchers, regulators, policymakers and others.

But none of that collaborative work is helped by taking a deliberately lop-sided view of the wider facts. For example, to suggest that misinformation has somehow overwhelmed our COVID-19 vaccine response ignores the most important fact: that vaccine hesitancy among Facebook’s US users has declined by about 50% since January. The Journal article goes on to discuss at length how pro-vaccine posts are undermined by negative comments, once again burying a crucial point: that health organizations continue posting because their own measurements show how their posts on our platforms effectively promote vaccines, despite negative comments.

Similarly, it is simply not the case that the research community has settled its view on the intersection between social media and well-being. The truth is that research into the impact social media has on people is still nascent and evolving, and social media itself is changing rapidly. Some researchers argue that more evidence is needed to understand social media’s impact on people. Each study has limitations and caveats, so no single study is going to be conclusive. We need to rely on an ever-growing body of multi-method research and expert input.

What would be really worrisome is if Facebook didn’t do this sort of research in the first place. The reason we do it is to hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media. These are often complex problems where there are no easy answers — notwithstanding the wish to reduce them to an attention-grabbing newspaper headline.

Facebook understands the significant responsibility that comes with operating a global platform. We take it seriously, and we don’t shy away from scrutiny and criticism. But we fundamentally reject this mischaracterization of our work and the impugning of the company’s motives. I wish there were easy answers to these issues, and that the choices we make didn’t come with difficult trade-offs. That is not the world we live in. We will continue to invest in research into these serious and complex issues. We will continue to ask ourselves the hard questions. And we will continue to improve our products and services as a result.
