Recently, there has been renewed interest in the mechanisms behind online media and their impact on the public debate. This paper aims to summarize some of the main contributions in the field of algorithms and their power effects.
What makes you click?
Despite the abundance of information available online about this topic, little has been written about algorithms leading to political echo chambers. The documentary ‘What makes you click’ (2016, September 25), which examines the ethics of big-data collection by online media as well as the use and misuse of personal data by the industry, provoked me to explore this field of work. The documentary opens with: “You might not know, but at this very moment the biggest psychological experiment of this era is taking place. What might surprise you, is that you are involved. Yes, you too. You never signed up for it, nor gave your permission, and its results are not available. However, there is something done with it.”
Within this paper, I explore the basic principles of algorithmic systems behind online platforms—which deliver metadata—and their influences on public opinion. I specifically investigate how these algorithmic systems could influence the electorate.
Data, knowledge and governance
The online dimension has become a dominant factor in our daily habits; platforms such as Facebook and Google have a tremendous influence on how we see the world (Kaplan & Haenlein, 2010). With approximately 1.71 billion monthly users and over 100 billion searches per month respectively, these companies may have a better picture of your online behavior than you do yourself (Statista, 2016; Smith, 2016).
This ‘knowledge’, better known as data, is information collected (e.g., photos shared, people’s networks, pages liked) by the companies when someone uses their services (Mullin, 2012). Since Snowden, we know that data collection is a huge enterprise; it is perhaps “the first time the world has seen this scale and quality of data about human communication” (Simonite, 2012).
Interestingly, the algorithms behind Facebook and Google will, based on your clicks, provide you with even more information they think you agree with (Ravenscraft, 2016). Undoubtedly, therefore, Google and Facebook offer a wealth of options for any type of organization to create awareness about themselves among users (Helmrich, 2016). Nowadays, social media are also indispensable in political elections, especially on the road towards them (Bhikhie, 2016). As a consequence, political parties attach great value to being visible online to the electorate (Bennett, 2012). As Highfield (2016) states, ‘everyday politics’ is a common feature of our daily online affairs, as people share and discuss political topics on a variety of online platforms.
Everybody has the opportunity “to post their thoughts and media content, without extensive technical literacy or qualifications; with blogs and citizen journalism, much conjecture surrounded whether the ability for amateur voices to broadcast themselves to a large audience online would be a threat to–or even actively replace–traditional journalism” (Highfield, 2016). At the same time, letting organizations and individuals post and search for information gives online platforms a simple formula for keeping control (Pariser, 2011).
The invisibility of social media intelligence is tenacious. As Bucher (2012) notes, there is no centralized inspector behind Facebook monitoring everybody’s behaviour; there is, however, an invisible governance of its users’ movements. For example, Facebook determines which items, such as status updates, show up on your news feed. As Kincaid (2010) states, “the algorithmic editorial voice of Facebook, determines what is shown on users’ Top News by drawing on different factors (i.e., affinity and time decay).” This captures the idea of invisible social media mechanisms that influence what users see. Moreover, by showing users ever more information from the same perspective and the same sources, these platforms bolster your existing ‘knowledge’ without ever challenging it. People thus exclude themselves from debate over their own ideologies by being confronted, over and over again, only with information they ‘like’ (Ravenscraft, 2016).
Against this background, it is interesting to address the question: What are political echo chambers? The focus in the discussion here will be on two online companies, namely Google and Facebook.
Algorithms and echo chambers
To start off, it is important to know that knowledge is power (Pasquale, 2015). By actively using social media—and thus providing companies with knowledge—people encounter ‘power regimes’. In a Foucauldian perspective, this phenomenon can be described as “mechanisms which produce discourses which function as true in particular times and places” (O’Farrell, 2005). These power regimes behind digital platforms such as Facebook and Google, by controlling populations through online procedures, lead to inequality among their users (Gozdecka, Ercan, & Kmak, 2014).
By collecting data about us as their users, Facebook and Google turn us into ‘quantified selves’ and influence the choices we make. A useful term to describe this problem is the ‘black box’ metaphor, referring to a system whose workings are mysterious. Moreover, while using online applications, people cannot know where their information is transferred to, how it will be used, or what the consequences of such use are (Pasquale, 2015).
Silicon Valley cherishes the knowledge about us. By coding human connections on the basis of what we ‘like’ and ‘dislike’ or do and do not search for, companies can estimate what information to provide us with in the future (Van Dijck, 2013). Simultaneously, this empowers the companies because they know how best to respond to you (Metz, 2012). This makes users feel comfortable using these services, which in turn leads them to provide Google and Facebook with even more details about themselves. This cyclical operation also produces data that is sold to third parties (Pasquale, 2015).
The cyclical operation mentioned above is driven by algorithms. As Cormen, Leiserson, Rivest, and Stein (2009) state, “Informally, an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. An algorithm is thus a sequence of computational steps that transform the input into the output.” In simple terms, algorithms are a sequence of steps in computer systems to accomplish a certain task (Otero, 2014). Decisions on what we like to read, or at least what Facebook and Google think we like to read, are not based on the data itself, but on the data being analyzed algorithmically (Pasquale, 2015). Algorithms thus do not literally compare news articles by their content; rather, the analysis is based on all the clicks one has made before (Metz, 2012).
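This input-to-output structure can be made concrete with a deliberately simplified sketch. The function name and scoring rule below are invented for illustration; this is not the actual ranking system of Facebook or Google, only a toy example of how a feed could be ordered by click history rather than by content.

```python
# Toy sketch of click-based ranking: the input (candidate articles plus a
# user's click history) is transformed by well-defined steps into an output
# (a ranked feed). Invented illustration, not any platform's real algorithm.

def rank_by_click_history(articles, clicked_keywords):
    """Order articles by how many of their keywords the user clicked before."""
    def score(article):
        # The algorithm never reads the article's content; it only counts
        # overlap between the article's keywords and past clicks.
        return sum(1 for kw in article["keywords"] if kw in clicked_keywords)
    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "Budget deal reached", "keywords": {"politics", "economy"}},
    {"title": "New phone released", "keywords": {"tech", "gadgets"}},
]
feed = rank_by_click_history(articles, {"politics", "economy", "tech"})
print([a["title"] for a in feed])  # the political story ranks first
```

Even this tiny sketch shows the point made above: the ranking depends entirely on the accumulated clicks, not on what the articles actually say.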
Within an algorithm, each item is equal and manipulable. Because its mechanisms do not differentiate between the contexts of the items clicked on, any item can potentially be made relevant. If someone repeatedly and purposely clicks on comparable items (related to a single topic), he or she will be confronted with the same type of information next time. This ‘manipulability’ thus also means that users can steer their news feed if they are willing to control what they click on. For example, even if you have no interest in issues related to Donald Trump, once you search for or click on information about him, algorithms can automatically provide you with more information related to the keywords behind that item (Newitz, 2016).
By staying in your own network, you will barely see information different from what you have already seen. This phenomenon is better known as the ‘echo chamber’ (Garrett, 2009). A person does not consciously choose to wall off the ‘outside’ (i.e., information of less interest to them); instead, Facebook and Google control your flow of information, building an imaginary wall around you (Ravenscraft, 2016). Ending up in such echo chambers is a result of ‘clicking’, which includes actions such as ‘liking’, ‘sharing’ and ‘searching’. All the clicks you have made create a ‘data double’: a profile of the online you.
Political echo chambers
The political echo chamber is a metaphor for the ‘clicks’ online resulting in a political ‘bubble’ people can get themselves into while using online services. It has been claimed that, thanks to algorithms, people fail to recognize their own cognitive biases (Allan, 2015). Ravenscraft (2016) gives an example of how algorithmic feeds encourage your bias: “If you read liberal news sources—or even just have predominantly liberal friends—Facebook will show you more liberal-leaning news. The same thing happens for conservatives and even the most fringe members of the political spectrum. In short, this algorithmically-enforced confirmation bias means the more you read information you agree with, the more Facebook will show you even more information you agree with.” Someone who reads a liberally oriented article does not necessarily identify as liberal; nevertheless, Facebook will provide them with more related information. This can contribute to the creation of political echo chambers.
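The feedback loop Ravenscraft describes can be made tangible with a small, hypothetical simulation. The update rule below is an assumption chosen purely for illustration (each click on agreeable content nudges the feed further in that direction, at an invented rate of 0.1); it is not any platform’s real code.

```python
# Hypothetical model of an algorithmically-enforced confirmation-bias loop.
# 'lean' is the share of, say, liberal-leaning items in the feed (0.0-1.0).
# Each round, the feed shows what it believes the user prefers, the user
# clicks on the agreeable item, and the belief is reinforced.

def simulate_feed(initial_lean, clicks, rate=0.1):
    lean = initial_lean
    for _ in range(clicks):
        if lean >= 0.5:
            lean += rate * (1 - lean)  # agreeable liberal click reinforces
        else:
            lean -= rate * lean        # agreeable conservative click reinforces
    return lean

# A feed that starts only slightly tilted ends up almost entirely one-sided.
print(simulate_feed(0.55, 30))  # approaches 1.0
print(simulate_feed(0.45, 30))  # approaches 0.0
```

Under these assumed dynamics, a barely noticeable initial tilt is amplified into a nearly homogeneous feed after a few dozen clicks, which is the ‘bubble’ the metaphor refers to.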
Cathy O’Neil, author of Weapons of Math Destruction, states: “Like so many algorithms, political polls have a feedback loop. The more we hear a certain candidate is ahead in the polls, the more we recognize their name and the more we see them as electorally viable.”
Considering that 6 out of 10 millennials (the generation born roughly between 1982 and 2000) use Facebook as their primary news source, its role in shaping electoral opinion is significant (Pew Research Center, 2015). Bixby (2016) suggests that during a political election (referring to the 2016 U.S. elections), the younger generation finds itself in “the biggest political echo chamber ever.” Nash (2016) likewise points out that “the most shareable, clickable and likable content on the site aligns strongly with its readership’s pre-existing biases, assumptions and political affiliation.”
Garrett (2009), however, argues that people are not averse to opinion challenges, so there is little reason to equate online platforms with the construction of homogeneous (personalized) political information environments. What is arguable, as Yun and Park (2011) warn us, is that people’s willingness to speak out on online platforms is inhibited by the fear of being psychologically or socially sanctioned for expressing unpopular ideas in their echo chambers.
Some experts state that our offline lives are in fact the real echo chambers (Ellis, 2016). In the same article, Jeff Jarvis, professor at the City University of New York, is quoted: “That is a presumption about the platforms, because we in media think we do this better. Newspapers came from the perspective of very few people: one editor, really. Facebook comes with many perspectives and gives many; as Zuckerberg points out, no two people on Earth see the same Facebook” (Ellis, 2016).
All in all, however, we can still wonder whether algorithms function as a ‘determination of reality’ through Facebook (used by 86% of internet users aged 16 to 64) and Google (3.5 billion searches per day), as well as through the information they provide to third parties (Ressa, 2016; Internet Live Stats, 2016). For instance, as Donoghue (2016) asks, how is it possible that so many people assumed Brexit would not happen? Algorithms also cannot distinguish fact from fiction. Therefore, in arguing about the impact of Facebook’s algorithms on our democracy, Ressa (2016) states that these black boxes create negative echo chambers.
Echo chambers and power
The ‘(political) echo chamber’ is a metaphor for the ‘clicks’ online, resulting in algorithms providing us with similar information in our news feeds. This means that people could end up with a lack of diverse knowledge about many topics, for example, politics. In addition, the knowledge that Facebook and Google have about their users gives them power. This power lies in the profiles that Facebook and Google have created, which are of great interest to political parties; the data can also be used by political parties for their own profiling and targeted messaging. However, this does not necessarily mean that the information within a ‘political echo chamber’ is always information you agree with.
Furthermore, research has shown that people are not necessarily averse to opinion challenges, and algorithms are manipulable; someone’s political conviction is therefore not always shaped by algorithms. At the same time, people’s willingness to debate on these platforms can be inhibited by the fear of psychological and social sanctions for expressing unpopular ideas in their echo chambers. More information is needed on where data from digital platforms is transferred to, how it is used, and what the consequences of such use are. We also need more insight into potential echo chambers: how they work and what their implications are. The issue may be more complex than it seems, and further humanities research will help establish a greater degree of accuracy on this matter.
Allan, P. (2015, September 16). This Graphic Explains 20 Cognitive Biases That Affect Your Decision-Making.
Bennett, W. L. (2012). The personalization of politics: Political identity, social media, and changing patterns of participation. The Annals of the American Academy of Political and Social Science, 644(1), 20-39.
Bhikhie, A. (2016, September 30). Waarom sociale media onmisbaar worden bij de komende verkiezingen [Why social media are becoming indispensable in the coming elections].
Bixby, S. (2016, October 1). ‘The end of Trump’: How Facebook deepens millennials’ confirmation bias.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164-1180.
Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2009). Introduction to Algorithms (3rd ed.). Cambridge, MA: MIT Press.
Donoghue, C. (2016). Does it matter that Silicon Valley CEOs don’t follow women on Twitter?
Ellis, P. (2016, October 25). How Do We Open Up The Social Media Echo Chamber Without Letting The Trolls In?
Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 14(2), 265-285.
Gozdecka, D. A., Ercan, S. A., & Kmak, M. (2014). From multiculturalism to post-multiculturalism: Trends and paradoxes. Journal of Sociology, 50(1), 51-64.
Helmrich, B. (2016, January 29). Social Media for Business: 2016 Marketer’s Guide.
Highfield, T. (2016). Social Media and Everyday Politics. Cambridge, England: Polity Press.
Internet Live Stats (2016). Google Search Statistics. Retrieved October 12, 2016.
Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of Social Media. Business Horizons, 53(1), 59-68.
Kincaid, J. (2010, April 22). EdgeRank: The Secret Sauce That Makes Facebook’s News Feed Tick. Retrieved from https://techcrunch.com/2010/04/22/facebook-edgerank/
Metz, C. (2012, May 24). How Facebook Knows What You Really Like. Retrieved from https://www.wired.com/2012/05/facebook-open-graph/
Mullin, J. (2012, September 10). How much do Google and Facebook profit from your data? Retrieved from http://arstechnica.com/tech-policy/2012/10/how-much-do-google-and-facebook-profit-from-your-data/
Nash, C. (2016, October 3). The Guardian: Facebook a Giant Political Echo Chamber for Millennials as Election Nears.
Newitz, A. (2016, November 19). It’s time to get rid of the Facebook “news feed,” because it’s not news: Fake news didn’t throw the election; it was a symptom, not a cause.
O’Farrell, C. (2005). Michel Foucault. London: Sage.
Otero, M. (2014, May 26). The real 10 algorithms that dominate our world.
Pariser, E. (2011). The filter bubble: What the internet is hiding from you. London, England: Penguin Group.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
Pew Research Center (2015, June 1). Facebook Top Source for Political News Among Millennials.
Ravenscraft, E. (2016, October 11). How sites like Google and Facebook put you in political echo chambers.
Ressa, M. A. (2016, October 8). How Facebook algorithms impact democracy.
Simonite, T. (2012, June 13). What Facebook Knows.
Smith, C. (2016, October 19). 100 Google search statistics and fun facts.
Van Dijck, J. (2013). Engineering sociality in a culture of connectivity. In The culture of connectivity (pp. 3-18). New York: Oxford University Press.
Yun, G. W., & Park, S. (2011). Selective posting: Willingness to post a message online. Journal of Computer-Mediated Communication, 16(2), 201-227.