Last year, Piia Varis published an article on Diggit in which she asked herself: on its 28th birthday, how democratic is the web? Now, one year later and with the web approaching its 29th birthday, I caught up with her to discuss the current state of the web, the influence that social media companies have on our public sphere and the impact that Facebook’s new guidelines will have.
The conclusion of your article from last year was rather pessimistic. With the web approaching its 29th birthday, do you see reason to change your judgment?
"I see some signs which make me a bit more optimistic. For example, as a teacher I notice how my students have more and more knowledge of how social media works; they know much more than students did five years ago. Privacy has also become a more prominent issue. I hear students explain how they have switched from WhatsApp to Telegram because of privacy issues, or that they have quit Instagram because it’s owned by Facebook. There seems to be more awareness of these issues, and not just in my classroom. However, apart from that, nothing has really changed in the last year. I mean, for a lot of people ‘the internet’ is largely two companies: Google and Facebook."
Facebook's model is built around collecting data and selling it to third parties. As a foundation for a public sphere, you could say that seems a little odd.
What are the pervasive influences of these social media companies? Do you think that Facebook and other social media platforms empower us democratically?
"It appears that every time you say something critical about social media you also need to acknowledge their good sides: you can stay in touch with your friends, stay informed about things very easily, etc. And that is of course true; social media enable a great many good things. However, when you look at social media such as Facebook from a democratic point of view, the difficulty starts at a more fundamental level. That is because so much of what constitutes the public sphere is controlled by companies that are only interested in us clicking on ads. Their model is built around collecting data and selling it to third parties, and information circulates according to the popularity principle – favouring quantity over quality. As a foundation for a public sphere, you could say that seems a little odd."
Facebook has acknowledged its dubious role in the spread of (mis)information in the public sphere, and in the company's new guidelines Mark Zuckerberg announced some fundamental changes to Facebook's algorithm. As he says: "we are not in the business of picking which issues the world should read about, but we are in the business of connecting people with the stories they find most meaningful."
"That sounds very nice, but can anybody explain how they are not in the business of picking what people should read about? Is that not exactly what they are doing? That is the whole idea behind the company: you give them information about yourself, and they use that information to provide you with the content they consider most likely to be relevant for you, so that you stay on the medium. Regardless of whether humans or algorithms perform that task, Facebook still actively makes decisions for you: they decide what is relevant for you to see."
But they say they are going to up-rank stories that circulate among your friends and down-rank stories that come from large corporate players and media publishers in order to minimize the spread of misinformation?
"Sure, but what is the context for this? The context is that Facebook has realized that there is a lot less original posting. Instead of writing their own posts, people share links with each other and so do companies. This simply means that there is less engagement: people spend less and less time on Facebook. For them as a company, that is of course highly problematic.
The principle has not changed: what you engage with will return on your feed, what you do not engage with will gradually disappear.
If they were really concerned about misinformation and the public sphere, then they would do something about, for example, dark posts: advertising posts that are shown only to a particular target group and are invisible to everyone else. The Trump campaign, for instance, was very proud of the number of specific audiences it could target. It's worth asking how that squares with the idea of a public sphere."
So you do not believe that these new guidelines will reduce the chance of people getting stuck in echo chambers?
"I am very curious to see whether these new guidelines will change anything, because the principle has not changed: what you engage with will return on your feed, and what you do not engage with will gradually disappear. Even under these new guidelines, you will only encounter information that you will most probably like. This is why I am not too optimistic about them."
In your article from last year you argued that in order for people to make informed digital decisions, we need platform literacy. What do you mean by this?
"There is of course the traditional type of media literacy we are very familiar with: for example, being able to interpret the title of a newspaper article to determine how it frames the news story in a particular way. The type of platform literacy I am talking about has to do with understanding how these platforms actually work and what kind of a public sphere they contribute to.
I hope that people will start asking themselves: "hang on, am I actually the product here?" Once people start to think about these kinds of questions, they can decide what kind of media platforms they want to use. Having said that, the problem is of course that we can only become literate to a certain extent: how these platforms and their algorithms actually work remains a black box. But just knowing the principles by which these platforms operate is already a form of literacy that empowers you to make well-informed decisions.
We should not idealize social media and talk about it as if it’s a neutral tool: it just isn’t.
I hear a lot of people say that Facebook is just a tool, and everything depends on how you use the tool. If it all depends on how you use the tool, then you should try hammering a nail with cooked spaghetti. Tools have specific features and these features limit what you can do with them. The same holds for Facebook. We should not idealize social media and talk about it as if it’s a neutral tool: it just isn’t."
You also make reference to the 'inventor of the web', Sir Tim Berners-Lee. He argues that we as citizens should "push back" against powerful digital players such as Facebook and that we should "build the web we want". Yet how do we do so if the infrastructure through which we must act is controlled by the very players we are trying to push back against? What leverage do we have?
"Well, you could say that we have all the leverage, because we are the product. If we decide to stop using their platforms, then these companies no longer have their product. We have already seen, for instance, that when enough people oppose new privacy policies on Facebook, the company reverses its decisions. It is just another form of consumer activism: you boycott a company and say: "if you don't change your ways, I will go elsewhere"."
Facebook is not going to last forever.
Where do you think we will be in five to ten years from now? Will the power of social media platforms such as Facebook have waned or will they have consolidated their position?
"Maybe this is wishful thinking, but I do somehow think that Facebook's power will decrease. It is definitely going to lose users, especially younger ones, as we can already see. Younger generations are moving to other media such as Snapchat, and I think that will really have an impact on Facebook. New media are constantly appearing, and eventually one of them will replace Facebook.
Facebook is not going to last forever."