On its 28th birthday, how democratic is the web?

Or, why people need platform literacy

Piia Varis

On March 12, 1989, Tim Berners-Lee distributed a document titled 'Information Management: A Proposal' to his colleagues at CERN, the European Organisation for Nuclear Research in Geneva, where he worked as a software engineer. His proposal for information management at CERN has since grown into what we know today as the World Wide Web. 

Two days ago, the web celebrated its 28th birthday. To mark the occasion, its inventor, Berners-Lee, published an open letter focusing on three trends that he has become "increasingly worried about" over the past 12 months:

  1. We've lost control of our personal data.
  2. It's too easy for misinformation to spread on the web.
  3. Political advertising online needs transparency and understanding.

Berners-Lee himself "has stuck firmly to the principle of openness, inclusivity and democracy since he invented the web in 1989, choosing not to commercialise his model". However, the web has changed a lot over the years, and as he mentions in his letter, "The current business model for many websites offers content in exchange for personal data." In proprietary social media, for instance, people lose control of what is being done with their data, and also cannot reap the benefits of that data themselves. 


The fact that people happily give away their personal information in exchange for free services enables widespread data collection by these websites and, consequently, as Berners-Lee points out in his letter, large-scale surveillance, the targeting of political dissidents, and a 'chilling effect' on free speech. What exactly 'free' means in the context of such websites is of course debatable; some have called the use of profit-making social media 'playbour' (play + labour) or 'weisure' (work + leisure), referring to the fact that while users are having a good time on social media, they are simultaneously working for these media, generating profit for them by giving their data away.

The fact that misinformation spreads so easily can also be related to the way in which power is concentrated on the web in the hands of a few websites - in the words of Berners-Lee, "most people find news and information on the web through just a handful of social media sites and search engines". These sites make money from our clicks, and through algorithmic regulation create echo chambers and filter bubbles for us. It is no surprise that misinformation "which is surprising, shocking, or designed to appeal to our biases" spreads easily in these kinds of environments. Similarly, Berners-Lee notes, political advertising benefits from the fact that so much of people's online activity is concentrated on a handful of websites: the vast amounts of data these sites collect about their users, and their algorithmic feeding of content they assume users will be interested in based on that data, allow political advertising to be increasingly individually tailored and targeted. This might be good for the advertisers, but as Berners-Lee asks: "Is that democratic?"

The trends that Berners-Lee laments have to do with the shape of the web, and the kinds of features and qualities built into the websites profiting from appropriating user data. Langdon Winner's decades' worth of eloquent writing on the social and political issues related to technological change, and on the social consequences of technologies, is enlightening reading in this respect. "To our accustomed way of thinking," he writes, "technologies are seen as neutral tools that can be used well or poorly, for good, for evil, or something in between. But we usually do not stop to inquire whether a given device might have been designed and built in such a way that it produces a set of consequences" (Winner, 1980, p. 125). Winner questions "our moral and political language for evaluating technology", and what is being overlooked if that language "includes only categories having to do with tools and uses" (ibid.).

When it comes to, for instance, making sense of and countering the challenge of misinformation that Berners-Lee mentions, a popular solution offered is more and better (social) media literacy. If the recent study by researchers at the Stanford Graduate School of Education is any indication, there is indeed a serious problem with literacy: they found that students had a hard time distinguishing ads from news articles and sponsored from non-sponsored content, and struggled to identify the source of the information they saw. However, the context in which fake news and misinformation appear - the web and the social media platforms themselves - remains largely invisible in the fake news discussion. The message seems to be: "Don't blame the tool, blame the user." Tools, in this understanding, are 'neutral'; it's supposedly all about how we use them. 


If Facebook, for instance, is just a 'tool', then the only thing we need is indeed better literacy in the sense of being able to distinguish fact from fiction and information from misinformation. That kind of literacy is obviously essential, but it is not the whole story. Users should also understand the principles according to which social media operate: how they reward content based on the engagement it attracts, amplifying certain types of content and suppressing or censoring others. The values built into these systems play a role in how they work, what they offer us, and how (dis)empowered we are as users.

"It has taken all of us to build the web we have, and now it is up to all of us to build the web we want - for everyone", says Berners-Lee in his letter. He also calls, for instance, for "a fair level of data control back in the hands of people", and "more algorithmic transparency to understand how important decisions that affect our lives are being made". At the same time, while many users claim that they continue using certain social media because 'there is no alternative', there are of course alternatives for websites that are not transparent or privacy-friendly. Many people indeed choose otherwise, precisely because of the politics and principles according to which the most popular social media companies work - and everybody can make such choices.

People can contribute to the development of the web and become active in debates on the uses of their personal data and its ownership, privacy, and information diets. For people to be able to make informed decisions, we need platform literacy: an understanding of how the shape, politics, and built-in features of the websites people use influence, for instance, what happens to their data and what they happen to see online. Without it, we cannot really say that we have an informed public - it is only with appropriate platform literacy that people can make informed choices regarding the ways in which they access information and which platforms they want to support with their participation, and really contribute to the creation of a web that is democratic.


Some resources to get you started: World Wide Web Foundation, Electronic Frontier Foundation, OpenMedia, Privacy International 'Explainers', Internet Society, Access Now, Tactical Technology Collective 



Winner, Langdon (1980). Do artifacts have politics? Daedalus 109(1), 121-136.