This paper examines what Shoshana Zuboff calls the "Big Other" (Zuboff, 2015). It asks how we can understand Big Other more thoroughly and what exactly its implications are for human agency in an age of surveillance.
The age of surveillance and Big Other
Tech companies like Google and Facebook collect personal data in order to sell advertisements to third parties, which are thereby enabled to target their audiences with great accuracy and to influence their decisions on the basis of behavioral prediction. This is what Zuboff calls ‘surveillance capitalism’: an economic logic that centers on the collection and commodification of data in order to modify user behavior (Zuboff, 2015). Most users of the free and convenient services provided by surveillance capitalists were largely unaware of the data tracking operations they were involved in and, due to social dependency, have given up privacy and rights in order to keep using these services (Zuboff, 2015, pp. 83–84).
Zuboff claims that this computer-mediated age of surveillance allowed for the emergence of "Big Other", which is a new form of power that commodifies, controls, and affects behavior and daily experience and which will slowly transform human agency into “a new kind of automaticity” (Zuboff, 2015, p. 82).
Big Other as a metaphor
By drawing on insights from Media Ecology and the related Actor-Network Theory (ANT), I will argue that Big Other is not a body of power external to the individual or the public, but can instead be conceived of as an ecology of surveillance and influence in which commercial, civic, political and non-human actors all participate actively. From there I argue that although human agency faces a threat, a minimal form of agency remains, as do ways of enhancing it. My conclusion is that Big Other is a reductive metaphor that highlights and denounces the power of surveillance capitalists while concealing the participating role of the public among other actors within the ecology of surveillance and influence. Based on these insights, I will argue that if we want to address the threat of Big Other, we should use discourse that moves beyond dystopian thinking and allows for self-recognition and self-responsibility. I will therefore suggest replacing the Big Other metaphor and propose ‘Web of Eyes’ as a suitable alternative.
I will proceed as follows. First, I will discuss the definitions of Big Other provided by Zuboff. From there I will examine Big Other by giving an in-depth analysis of the socio-technical world and its actors that evolved after the introduction of ‘smart tech’. I will use this term in this paper to refer to all kinds of surveillance mechanisms (algorithms, behavioral data analysis, and prediction), devices (e.g. phones, watches), and applications (search engines, social media, apps). Subsequently, I will discuss implications for human agency. Lastly, I will contrast the value of using the Big Other and the Web of Eyes metaphors.
Zuboff on Big Other
Zuboff argues that smart tech laid the foundation for the emergence of “a distributed and largely uncontested new expression of power” which she calls ‘Big Other’ (Zuboff, 2015, p. 75). In her paper, she defines Big Other as a “ubiquitous networked institutional regime that records, commodifies and modifies everyday experience”, a “sovereign power…that annihilates the freedom achieved by the rule of law…creating new markets of behavioral prediction and modification” and as “a global architecture existing somewhere between nature and God.” (Zuboff, 2015, pp. 75–82)
Big Other is thus omnipresent, invades all aspects of life, and steers daily experience, but its power is decentralized and different from the centralized and totalitarian power of Big Brother (Zuboff, 2015, p. 82). Still, Zuboff worries that it will slowly transform human agency into “a new form of automaticity – a lived experience of pure stimulus-response” (Zuboff, 2015, p. 82). Hence, she depicts a dystopian future in which humans will be slowly transformed into automatons by the power of Big Other. This frightening scenario demands that we demystify and clarify the workings of Big Other.
How to understand Big Other?
In order to give an in-depth analysis of Big Other, I will first analyze the socio-technical order that evolved after the introduction of smart tech, drawing on insights from Media Ecology and the closely related ANT.
Media theorists such as McLuhan and Postman have come to view media as ecological environments (Scolari, 2012; Strate, 2008). The ecology metaphor emphasizes that media, technology, and humans evolve in a way similar to biotic organisms. For example, the printing press was not simply an addition to society when it was introduced: it changed the entire structure of societal relations and associations. Media understood as environments do not determine our actions, but rather define a range of possible actions we can take, assigning roles and stimulating certain actions while discouraging others (Strate, 2008, p. 135). Hence media do not cause certain effects in a linear manner, but rather create conditions for the evolution of particular forms of communication and culture (Strate, 2008, p. 135). For this reason, media ecology rejects technological determinism and considers the interactions and exchanges between actors and technology, and their evolution, as dialectical (Strate, 2008, p. 137).
The idea that actors, objects, and technologies are always engaged in an ecological process of network formation is also the core idea of the very similar Actor-Network Theory (ANT) (Latour, 2005). From this perspective, actors or actants (which also include nonhuman actors) within each network acquire power through their relatedness and are mutually dependent on each other (Couldry, 2008, p. 93). ANT does not claim that an actor’s existence is caused or determined by its relationship with others, but stresses that actors are enacted by the actors and objects around them and are afforded their ability to act by these relations (Mol, 2010, p. 258). These relations can be stable or more fluid, and their formations are contingent and historically formed (Couldry, 2008, p. 93). ANT thus views humans as sociotechnical constructs embedded in organic networks among other actors and actants, who all exert power on each other. Hence, there is no opposition between the social order and the technological order. The technological simply is the social.
This raises the question of how smart tech has restructured our socio-technical order and redefined the range of possible actions available to actors and the relations between them. For the scope of this paper, I will narrow the answer down by focusing on surveillance capitalists and on commercial, civic (e.g. NGOs and citizens), political and other actors.
As discussed, the digital era of smart tech allowed advertising companies such as Google and Facebook to become surveillance capitalists and sell advertisements based on behavioral data to third parties, which concern commercial, civic, political, or individual actors. This surveillance economy led to more consumer manipulation (e.g. ‘impulse buying’), discriminatory techniques (e.g. blacklisting) and data of questionable reliability (Clarke, 2019, p. 67). Moreover, political parties and individuals try to influence their audiences online, while states draw on social media and internet-based data to control public health (Velasco, Agheneza, Denecke, Kirchner, & Eckmanns, 2014).
Although users in this digital era can be viewed as rationalized, aggregated commodities for advertisers, at the same time they can also be recognized as active participants in cultural production that have achieved more autonomy and creative potential (Lewis & Westlund, 2015, p. 25). Think for example of artists, influencers and bloggers, and other professionals. Individuals also gained functional benefits (e.g. information access, educational possibilities) and social benefits (communication, relationships) from the digital surveillance economy (Clarke, 2019, p. 66).
Social media has led to new divisions of labour and participatory culture that gave rise to all kinds of grassroots movements, protest groups, and NGOs and allowed for more civic engagement, collective action, and direct customer resistance, influencing other commercial, civic and political actors (Obar, Zube, & Lampe, 2012; Uldam & Vestergaard, 2015). All kinds of public surveillance practices led to a world of participatory surveillance that opposes the conventional idea of surveillance as merely a way to control the public in a top-down manner (Albrechtslund, 2008).
However, even though users formally have the ability to analyze data provided by free digital tools (e.g. Google Trends & Analytics) and can run their own data mining operations (e.g. website statistics, followers, groups, and so forth), it is arguable that resource-rich actors such as large organizations and governments tend to have the best access to relevant data and analytic tools (Kennedy & Moss, 2015). Hence, we can assume that the benefits of our digital surveillance world are distributed unequally.
Still, all actors relate, interact and influence each other in a triangle or complex web of relations, all providing input to Big Data (e.g. user content, advertisements, public information) while participating actively in this surveillance economy. The social roles within this web are not fixed but rather fluid, echoing the idea of a "liquid society" (Bauman, 2000), as users can become both advertisers and data miners, while advertisers and surveillance capitalists are likely to be engaging as users themselves as well. Various commercial and non-commercial actors try to influence and persuade each other (e.g. vote party A, buy product B, attend event C), resulting in conflicting and oppositional ideological and commercial, governmental, and public forces within this complex web of relations. Moreover, this ecology of actors is always in flux, due to the continuous presence, interactivity, and connection within this computer-mediated marketplace.
Indeed, Big Other is omnipresent and decentralized, as Zuboff pointed out. However, it is not a body of power that is external to the individual self or that lies strictly in the hands of surveillance capitalists or corporations. Big Other, I argue, should be conceived as an omnipresent surveillance ecology in which the individual self participates among corporate, political, and civic actors who all influence each other. It reveals the complex evolution of the socio-technical order after the introduction of smart tech, which led to a redispersion and redistribution of power among all these actors. Hence, Big Other is paradoxically partly constituted by the individual self. This raises the question of what consequences this view has for the relationship between Big Other and human agency.
Big Other's consequences for human agency
ANT’s insight that actors are socio-technical constructs which are mutually dependent on each other in networks, has consequences for our conception of agency. We do not have agency understood as the complete freedom to act and determine our actions free from external forces. Rather, our agency has always been distributive and reflexive, relating to a context of dependency on other human actors and technological artifacts (de Mul & van den Berg, 2011, p. 55). From this socio-technical perspective, we can adopt the definition that Couldry provides of agency: “the longer processes of action based on reflection, giving an account of what one has done, even more basically, making sense of the world so as to act within it.” (Couldry, 2014, p. 891)
As discussed, Zuboff claims that “in a world of Big Other, without avenues of escape, the agency implied in the work of anticipation is gradually submerged into a new kind of automaticity – a lived experience of pure stimulus-response.” (Zuboff, 2015, p. 82) Hence, Zuboff worries that human agency will be reduced to the bare minimum. However, I think that Zuboff has slightly overstated this issue. Her claim suggests that humans will become automatons, obliged to follow a predetermined path shaped by the oppressive force of Big Other. This seems to be a deterministic view, which I contest. Although scholars have debated whether structure or agency has primacy, it is possible to accept that both positions within that debate hold truth, because of the dialectic tension between structure and agency (Kennedy & Moss, 2015, p. 9) – a view that is supported by both Media Ecology and ANT and which I share. Indeed, as already became clear from the analysis in the previous section, social media affect user agency in complex ways, as they empower both users and those who steer and exploit users’ activities (van Dijck & Poell, 2013, p. 93). Thus, data mining practices can be used by individuals to become more active and reflective agents, while at the same time individuals can be constrained by the structure of their socio-technical environment.
Moreover, ANT’s insight regarding the mutual dependency of associated actors entails that the existence of surveillance capitalists is dependent on the agency they provide to users. This can be explained as follows. Surveillance capitalist actors can only exist when they have a product and clients to sell it to. As discussed, the product concerns advertisements based on behavioral data of users of their public services (e.g. a search engine, the Facebook platform). Their clients are advertisers (commercial, public and individual actors). The existence of the surveillance capitalist and its product is thus dependent on the surplus of data provided by their users. This is why surveillance capitalists run continuous experiments: it allows them to observe and track new data (Zuboff, 2015, p. 84). In order to generate this surplus of behavioral data, users need to keep acting. This entails that users need a range of options and the ability to decide. If a user runs out of options, they cease to exist as a user, which threatens the existence of surveillance capitalists. This entails that surveillance capitalists have an interest in providing the user with at least some agency.
This raises an objection from Zuboff’s perspective for us to consider. Even if we accept that human agency may be affected in complex ways, and even though surveillance capitalists need to provide their users with at least some agency, their relationship is not much different from that between a master and an enslaved person: this agency is still dependent on the surveillance capitalists and still primarily serves their commercial interests. This points towards a highly minimal and undesirable conception of agency.
However, contrary to an enslaved person, who is obliged to execute certain tasks and can be sanctioned for not following orders, a user is free from such domination. Surveillance capitalists do not determine the actions of their users, but rather define a certain architecture of choice which nudges users to take certain actions. Moreover, the range of possible user actions is not restricted to strictly commercial actions, for the algorithms creating these architectures of choice (e.g. a social feed, or search page) also take user-generated content (e.g. websites, profiles, videos, etc.) into account. As such, it is possible that users may never buy anything or close a deal with any party. They are not confined to a usership that obliges them to make a certain decision, nor is the path a user takes predetermined, even though their behavior may be accurately predicted. As such, they retain an agency different from that of those who are enslaved.
Still, Zuboff could raise the objection that this agency is rather elusive. If we accept agency as “making sense of the world so as to act within it” (Couldry, 2014, p. 891), then the growing complexity of smart tech and the incredible amounts of information that a user has to read and understand in order to consciously interact within the ecology of surveillance pose severe problems for human agency. As people are overwhelmed by attractive ideological and commercial ads that continuously draw their attention, they will most likely fail to make reflective decisions in their own interest. It would take collective action and an incredibly well-educated public to deal with this problem adequately, which is too demanding and seems unfeasible.
I can only recognize this problem as such. An uneducated public interacting in a world that seems to grow more complex every second seems doomed to lose its autonomy almost completely. Still, we should refrain from a deterministic perspective. New socio-technical conditions have always been demanding for humanity. Whether it was the introduction of the printing press, which demanded that people read, or the invention of TV commercials, which made people more suspicious of advertisements: we have never lost our agency completely and we have always adapted to new circumstances, whether technological, commercial or social. Even though this challenge seems much bigger, we should reject the view that humans will be reduced to automatons. The public will retain at least some agency and in fact has considerable potential for enhancing it by using smart tech. Instead of denouncing these technologies and devices and accepting the role of a victim, we can try to comprehend their workings and use them for our own benefit. In this respect, we should not forget that there exist other powerful actors within our web of relations, including state and civic actors that can effectively take collective action in order to preserve and enhance human agency. These actors can serve as allies. Currently, many states work on legislation and educational initiatives to protect citizens and individuals. I thus think that Zuboff’s dystopian claim regarding agency is somewhat overstated.
Web of Eyes: changing the metaphor
Big Other presents the image of a strange, dystopian, and mystic body of power that grows to almost limitless proportions vis-à-vis a small dot of power of the individual self that slowly loses its agency as it has to succumb to Big Other. Although this image holds truth, it does not capture reality adequately. Reductive metaphors tend to highlight certain aspects of reality while concealing others. Although Big Other refers to real and substantial worries concerning privacy, rights, and manipulation that should be taken seriously, its rhetoric seems to highlight the growing power of surveillance capitalists but conceals the network of related and interdependent civic, political, and commercial actors that all participate and influence each other in this surveillance ecology. Due to its participatory surveillance, the individual self partly constitutes Big Other. This entails that Big Other is not a metaphor that adequately captures reality.
Metaphors shape the way we think and act and thus serve as important cognitive devices within human culture (Lakoff & Johnson, 1980). When they gain momentum within academic and popular discourse, they can effectively change our perceptions and behavior.
It is for this reason that I would like to change the Big Other metaphor. Its mystified and dystopian associations are undesirable. It suggests a clear dichotomy between those who surveil and those who are being surveilled. It depicts an incomprehensible enemy that can neither be seen nor stopped and that will lead to an inevitable dystopian future for humanity. It strictly denounces the surveillance capitalist and justifies individuals in playing the victim, while they should actually recognize their own position in the web of relations and take self-responsibility, which could in turn enhance their agency. Hence, I would like to change the metaphor and move beyond both tech-utopian and tech-dystopian thinking. Instead, I propose that we use the ecological metaphor ‘Web of Eyes’ to capture the complexity of power relations existing in our contemporary surveillance ecology. Moreover, this metaphor allows us to recognize both our own position and those of other, more influential and powerful ‘nodes’ within this web of related actors. As such, it enables us to take self-responsibility and actively change the web’s evolution.
One example of how our new ecological metaphor accommodates thinking in solutions for problems within our age of surveillance is the following. If we are worried about the growing influence and power of surveillance capitalists like Google, we can think about how to alter our relationship and interdependency with them. As discussed, Google is dependent on users. Some activists and scholars have proposed that users should demand payment for the behavioral data they supply to surveillance capitalists, because they consider their data sacrosanct or consider their user interactions as work (Malmgren, 2019, p. 47). Even though that may sound fair, it would consolidate the relationship between users, understood as suppliers, and surveillance capitalists, understood as advertising companies. Instead, we could insist on breaking this relationship and creating a new one. For instance, instead of being suppliers, users could become clients of companies like Google and start paying for their services. This would make advertisers obsolete and give more power to users, who, as clients, could now demand a certain quality of service.
Now one could object to my proposal to change the Big Other metaphor by claiming that its use is justified and necessary because it serves to alert an unaware public to the challenge that lies ahead of them: dealing with massive and influential powers of surveillance. However, I believe that dealing with a problem requires having knowledge about it, which entails using adequate metaphors that contribute to a better understanding. In order to deal with our ignorance and the growing complexity of our socio-technical order, we need to refrain from using reductive metaphors. Moreover, Big Other is not only alarming; it can be paralyzing as well. People may think that it is best to go offline and detach themselves from smart tech completely. Although protecting ourselves from attention-seeking social media surely serves our reflective agency, we should not bury our heads in the sand once and for all. Instead, we should look for ways to protect and enhance our agency, which requires interaction.
References
Albrechtslund, A. (2008). Online social networking as participatory surveillance. First Monday, 13(3). https://doi.org/10.5210/fm.v13i3.2142
Bauman, Z. (2000). Liquid Modernity. Cambridge: Polity Press.
Clarke, R. (2019). Risks inherent in the digital surveillance economy: A research agenda. Journal of Information Technology, 34(1), 59–80. https://doi.org/10.1177/0268396218815559
Couldry, N. (2008). Actor Network Theory and Media: Do They Connect and On What Terms? In A. Hepp, F. Krotz, S. Moores, & C. Winter (Eds.), Connectivity, Networks and Flows: Conceptualizing Contemporary Communications (pp. 93–110). Cresskill, NJ, USA: Hampton Press.
Couldry, N. (2014). Inaugural: A necessary disenchantment: Myth, agency and injustice in a digital world. The Sociological Review, 62(4), 880–897.
de Mul, J., & van den Berg, B. (2011). Remote control: Human autonomy in the age of computer-mediated agency. In A. Rouvroy & M. Hildebrandt (Eds.), Law, Human Agency and Autonomic Computing: The Philosophy of Law Meets the Philosophy of Technology. https://doi.org/10.4324/9780203828342
Kennedy, H., & Moss, G. (2015). Known or knowing publics? Social media data mining and the question of public agency. Big Data & Society, 2(2). https://doi.org/10.1177/2053951715611145
Lakoff, G., & Johnson, M. (1980). Metaphors We Live By. Chicago: University of Chicago Press.
Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford University Press.
Lewis, S. C., & Westlund, O. (2015). Actors, actants, audiences, and activities in cross-media news work. Digital Journalism, 3(1), 19–37. https://doi.org/10.1080/21670811.2014.927986
Malmgren, E. (2019). Resisting “Big Other”: What will it take to defeat surveillance capitalism? New Labor Forum, 28(3), 42–50. https://doi.org/10.1177/1095796019864097
Mol, A. (2010). Actor-Network Theory: sensitive terms and enduring tensions. Kölner Zeitschrift Für Soziologie Und Sozialpsychologie. Sonderheft, 50(1), 253–269.
Obar, J. A., Zube, P., & Lampe, C. (2012). Advocacy 2.0: An analysis of how advocacy groups in the United States perceive and use social media as tools for facilitating civic engagement and collective action. Journal of Information Policy, 2, 1–25.
Scolari, C. A. (2012). Media Ecology: Exploring the Metaphor to Expand the Theory. Communication Theory, 22(2), 204–225. https://doi.org/10.1111/j.1468-2885.2012.01404.x
Strate, L. (2008). Studying media as media: McLuhan and the media ecology approach. MediaTropes, 1(1), 127–142.
Uldam, J., & Vestergaard, A. (2015). Introduction: Social Media and Civic Engagement. In Civic Engagement and Social Media. https://doi.org/10.1057/9781137434166
van Dijck, J., & Poell, T. (2013). Understanding social media logic. Media and Communication, 1(1), 2–14. https://doi.org/10.12924/mac2013.01010002
Velasco, E., Agheneza, T., Denecke, K., Kirchner, G., & Eckmanns, T. (2014). Social media and internet-based data in global systems for public health surveillance: A systematic review. Milbank Quarterly, 92(1), 7–33. https://doi.org/10.1111/1468-0009.12038
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89. https://doi.org/10.1057/jit.2015.5