
You and the Algorithm: It Takes Two to Tango

By Nick Clegg, VP of Global Affairs and Communications

Originally published on Medium.

In a recent article for The Atlantic, Adrienne LaFrance compared Facebook to a Doomsday Machine: “a device built with the sole purpose of destroying all human life.” In the Netflix documentary The Social Dilemma, the filmmakers imagine a digital control room where engineers press buttons and turn dials to manipulate a teenage boy through his smartphone. In her book The Age of Surveillance Capitalism, the Harvard social psychologist Shoshana Zuboff paints a picture of a world in which tech companies have constructed a massive system of surveillance that allows them to manipulate people’s attitudes, opinions and desires.

In each of these dystopian depictions, people are portrayed as powerless victims, robbed of their free will. Humans have become the playthings of manipulative algorithmic systems. But is this really true? Have the machines really taken over?

It is alleged that social media fuels polarization, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality, eroding the public sphere and the understanding of common facts. And, worse still, this is all done intentionally in a relentless pursuit of profit.

At the heart of many of the concerns is an assumption that in the relationship between human beings and complex automated systems, we are not the ones in control. Human agency has been eroded. Or, as Joanna Stern declared in the Wall Street Journal in January, we’ve “lost control of what we see, read — and even think — to the biggest social-media companies.”

Defenders of social media have often ignored or belittled these criticisms — hoping that the march of technology would sweep them aside, or viewing the criticisms as misguided. This is a mistake: Technology must serve society, not the other way around. Faced with opaque systems operated by wealthy global companies, it is hardly surprising that many assume the lack of transparency exists to serve the interests of technology elites and not users. In the long run, people are only going to feel comfortable with these algorithmic systems if they have more visibility into how they work and then have the ability to exercise more informed control over them.

Companies like Facebook need to be frank about how the relationship between you and their major algorithms really works. And they need to give you more control.

Some critics seem to think social media is a temporary mistake in the evolution of technology — and that once we’ve come to our collective senses, Facebook and other platforms will collapse and we’ll all revert to previous modes of communication. This is a profound misreading of the situation — as inaccurate as the December 2000 Daily Mail headline declaring the internet “may just be a passing fad.” Even if Facebook ceased to exist, social media won’t be — can’t be — uninvented. The human impulse to use the internet for social connection is profound.

Data-driven personalized services like social media have empowered people with the means to express themselves and to communicate with others on an unprecedented scale. And they have put tools into the hands of millions of small businesses around the world which were previously available only to the largest corporations. Personalized digital advertising not only allows billions of people to use social media for free, it is also more useful to consumers than untargeted, low-relevance advertising. Turning the clock back to some false sepia-tinted yesteryear — before personalized advertising, before algorithmic content ranking, before the grassroots freedoms of the internet challenged the powers that be — would forfeit so many benefits to society.

But that does not mean the concerns about how humans and algorithmic systems interact should be dismissed. There are clearly issues to be resolved and questions to be answered. The internet needs new rules — designed and agreed by democratically elected institutions — and technology companies need to make sure their products and practices are designed in a responsible way that takes into account their potential impact on society. That starts — but by no means ends — with putting people, not machines, more firmly in charge.

Read the full article on Medium

