
Apple’s 1984 ad: from digital utopianism to hyper-opticon

Ico Maly

It is safe to say that Apple's 1984 ad is part of the world's collective memory. Even youngsters who weren't born in the Eighties have seen it, or at least a parody of it. It has been hailed as a watershed moment and praised as a brilliant advertisement. But it is much more than that. It was never just a commercial, or a smart way to implicitly frame IBM as Big Brother; it was a cultural product. It represented the rise of a new class, the class of digital entrepreneurs, and its ideology. Central to this ideology is the idea that digital technologies aren't just tools, but can empower individuals and even start a societal revolution. Computers are not just for accountants; they can create a better, freer society devoid of government surveillance and control. Forty years after the Macintosh computer was released, reality looks far less utopian.

Buy Apple and be different

We tend to see products as nothing more than stuff that is practical, that makes our lives easy, fun or even joyful. At the same time, we see all around us that commercial products, and especially technological products, can stir up very strong emotions. The iPhone is such a product: it generates desire and, as a consequence, creates global hypes. The launch of the iPhone is a cultural format. Each time a new version of Apple's flagship product is launched, the news shows people queuing in front of retail stores. The iPhone is more than just a practical tool; it is an object of desire.

One central element in selling iPhones, iPods and Apple products in general is design. Apple's colorful logo in the Eighties, its colorful laptops and personal computers in the Nineties, and its slick white iPod and iPhone design all signify the Apple myth. The white earbuds were more than just technology; they indexed the wearer as somebody in touch with tech culture. It was, and is, an identity emblem. Apple products are drenched in a Barthesian myth (Barthes, 2000 [1972]). An Apple phone isn't just a phone; it is adapted to a specific type of consumption. It is all about experience and identity. An Apple product allows one to signify oneself as belonging to a specific group of individuals. For a very long time, Apple equaled a consumer narrative that presented it as the choice of programmers and designers. It was a product for people who saw the world differently, be it in terms of bits and bytes or in terms of aesthetics. This image is linked to Apple's marketing discourse. It is no coincidence that 'Think Different' was one of Apple's long-time slogans. It was grounded in the idea that technology, and especially Apple technology, is not only for those who think differently, but also that Apple can help you think differently.


To understand this myth of individual freedom, of the freedom to become who you are, that is associated with Apple, we have to go back to the Eighties. The Eighties were many things. They were the Reagan and Thatcher years, the decade in which capitalism was 'liberated' from the welfare state and Wall Street was unleashed. Greed was framed as the essence of human nature, and the yuppies - young urban professionals who liked to show off their newly acquired wealth - became emblematic of the era. Even though the Vietnam failure had affected the power of the US, the cultural hegemony of the world's superpower was still untouched, at least in the West. The American dream still resonated on many Western screens. Tom Cruise played the hero in Top Gun and James Bond was still fighting Russian spies. It was still very much a time shaped by the Cold War. Mikhail Gorbachev, the leader of the Soviet Union, was as renowned as Reagan and Thatcher. In Western media, he exemplified the opposite of Western consumer culture and its accompanying rhetoric of 'freedom' and 'progress'. The colorfulness of market capitalism was contrasted with the greyness of the planned economy of the Soviet Union. The Eighties were also the time when 'the computer' became a consumer product and many companies tried to capture a share of the market. Advertisements were their weapons of choice.

Apple’s 1984 ad and digital utopianism

The Apple 1984 commercial has proven to be one of the most memorable ads of that decade, and it is not difficult to see why it resonated so much. The image of a young, colorful and powerful blonde female athlete - representing consumerist techno-capitalism - destroying a grey 'big brother' - indexing the grey reality of Communism - repackaged the dominant anti-Communist narrative that was still structuring the world. It captured the fear of Communism and dictatorship, while also portraying the hope that 'freedom' would prevail. Of course, the young athlete with a drawing of the new Macintosh on her white top not only represented capitalism in general, but also 'the revolutionary force' of the new Apple computer. It is this computer that destroys the power of Big Brother. This subtle change in the dominant narrative predicted the rising influence of what would be called 'the new economy' in the Nineties.

In the ad, Big Brother is portrayed as a hegemonic ruler, a ruler who succeeded in the 'Unification of Thoughts' by creating 'for the first time in all history, a garden of pure ideology', as the leader on the screen claims in the clip. In this garden there is no doubt anymore; everyone is homogenized and secured 'from the pests of contradictory and confusing truths.' As the voice on the screen claims, as a result of ideological purification '[w]e are one people, with one will, one resolve, one cause. Our enemies shall talk themselves to death, and we will bury them with their own confusion. We shall prevail!' It is not hard to see this image of Big Brother as an echo of how Communism, but also Nazism, was described in mainstream discourse at the time. In other words, it reproduced the discourse on totalitarianism that was common at the time, a discourse that presented Communism and Nazism as two sides of the same totalitarian coin. And it is of course obvious how it reflects the grim reality Orwell described in his book. The grey, bald-headed people staring at Big Brother on a big screen and the omnipresence of small screens in the hallways all echo the dystopian surveillance and control society Orwell described in his famous novel.

The Macintosh suggested that all that industrial computing power could now be bought and used by the individual.

But that is not all there is to it: the commercial also engaged a vision of the computer that was widespread after World War II. In the Sixties and Seventies, computers were largely understood as technologies of power, used by bureaucratic governments and companies to control people. IBM's legacy of collaboration with the Nazis in the administration of the concentration camps is emblematic of that idea (Black, 2001). Against all of this, Apple's Macintosh was clearly portrayed as something entirely different. During the official launch event, the Macintosh was not just presented as a tool that humans could use to edit, draw or fill in spreadsheets. It was presented by Jobs as a very different type of computer: a computer that you could trust, a computer with an identity. The Macintosh was framed in the ad and during the launch as 'a revolution', a personal revolution even. This myth of revolution has become part of the Apple brand. When Jobs launched the first iPhone, he also, quite literally, framed it as a 'revolutionary' product. He applauded Apple for launching several 'revolutionary products' throughout the company's existence. And he stressed that it all started with the launch of the Macintosh back in 1984, a computer 'that didn't only change Apple, but changed the whole computer industry' (Jobs, 2007).

The Macintosh was presented as an actual 'personal computer' that could talk and present itself, that seemingly had an identity of its own. Interestingly, the new Apple computer presented itself during the 1984 launch in contrast with the computers of the past: 'Never trust a computer you can't lift', the Macintosh said in a digital voice reminiscent of Stephen Hawking's. The design and form of the Macintosh were presented as essential to the revolution it claimed to be. It was its form that indexed the myth of 'breaking through control'. The Macintosh was implicitly contrasted with the large industrial computers, and suggested that all that industrial computing power could now be bought and used by the individual. The portability of the computer stood for the emancipation of the individual, and thus also for the redistribution of power. IBM was the dominant player in the industry and was emblematic of the dominance of large computers used for surveilling the people. Apple now 'gave' that power to the individual, who could use it to destroy the power of government. That is why, at the end of the commercial, the voice-over said: 'On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like "1984".' The Macintosh came with the promise of a world free of dictatorship, surveillance, control and ideological tyranny.

The blending of Orwell's Big Brother with the Cold War perspective on the Soviet Union and Nazism, in relation to the role of computing, created a very powerful fear-producing image. It produced the idea of an all-powerful danger looming in the future as a result of massive corporate computing. At the same time, the ad gave reasons not to fear, because now, thanks to the Apple Macintosh, a single technologically empowered individual could take such large powers down. This motif of computational empowerment was not new, and wasn't invented by Apple. By the 1970s it was a very well-known narrative within the rising class of computer hobbyists that would later make up Silicon Valley. They saw in the personal computer a means to redistribute power and to create an ideal society. This discourse, in which technology and the countercultural struggle for freedom were fused, has a long history. Turner (2006) argued that it was emblematic of what he coined digital utopianism. He traced the origins of this tech-ideology back to the countercultural entrepreneur Stewart Brand and his Whole Earth Catalog and network. In this catalog, and in later products he launched, Brand managed to popularize this ideology among an emerging class of hackers and digital entrepreneurs. Through ideological entrepreneurship he taught them to understand technologies as tools that could give power to the people and could even be imagined as the start of a social revolution. The cultural importance of the 1984 ad is not only that it gave the Macintosh a countercultural aura, but also that the idea of the personal computer's empowerment potential now entered the mainstream. All of a sudden computers were framed not just as tools for accountants, but as tools for personal and societal change (Turner, 2006: 105-107). The 1984 ad showed, in prime time, that technology could also produce 'personal freedom'.
The Macintosh normalized the idea that a technical revolution could introduce a societal revolution and shape a true utopia where we could all be free, where we could all be ourselves. 

Forty years later, the utopia that the personal computer promised looks far less colorful and appealing. The idea that the personal computer, and later the smartphone, has empowered individuals is increasingly replaced by the idea that they enabled a new world of unprecedented surveillance and datafication. Or, to rephrase this in terms of Orwell's novel, the rhetoric of digital empowerment and revolution has acquired the status of Oceania's newspeak. Orwell's newspeak argument is well known: Oceania, Big Brother's totalitarian superstate in his 1984 novel, introduced newspeak as an ideological instrument designed to prevent critical thinking. This is exactly the contemporary effect of the digital utopianist dreams that are evoked by big platforms. Even today, contemporary marketing talk sells digital platforms to us as tools for 'digital empowerment'. Facebook, for instance, is still articulating the digital utopianist discourse when it claims that it is its mission to 'empower people' through the technology it designs (Maly, 2023). And we hear echoes of digital utopianism when Google claims that it is its mission not to be evil. Just like newspeak, these narratives of digital empowerment obfuscate the bleak reality of datafication, surveillance, intrusion into our privacy, and the steering or programming of social relations (van Dijck, 2013).

1984 for real? From the panopticon to the post-panopticon

The omnipresence of digital screens and their role in surveilling us may remind us of 1984. In Orwell's 1984, the government, in the form of Big Brother, permeates every small detail of social life. Not just speech, but also thoughts and facial expressions were monitored, and as such controlled. Visibility was central in the construction of ideological homogenization. This relation between power and visibility of course recalls what Michel Foucault called panopticism (Foucault, 1989). Foucault used Bentham's panoptically designed prison as a metaphor to understand power in modern society. Panopticism, according to Foucault, was the dominant organization of power in modernity. Orwell's 1984 could be described as a panopticon on steroids, as surveillance wasn't limited to what people said, but extended to what they thought or felt. The control was total, and it created the grey world we saw in the Apple 1984 ad. In Orwell's dystopian novel, this 'garden of pure ideology' was realized through technologically enabled surveillance in the form of two-way monitors and microphones in all corners of people's homes and on the streets. This technological surveillance was complemented by the 'thought police' and by citizens ratting out potential dissidents, and it was enforced by the fear of death, torture and other severe punishment. Surveillance was total, and all 'non-approved' discourse could be detected and punished.

If we snap out of the fictional world of 1984 and look at our Google Home devices, or our phones, laptops and smart TVs, we realize that (1) they too are two-way screens, and (2) they don't look anything like the grey world depicted in the 1984 ad. All these digital tools look slick, well designed and cool. The world we see on those screens is even more colorful. There is no big dictator controlling those screens. On the contrary, what we see are influencers trying to grab eyeballs, individuals producing content in the hope of monetizing it, and politicians paying increasingly large sums of money to get their message to us. On this superficial level, Apple has indeed delivered the colorful world it promised, full of ideological battles and full of 'the pests of contradictory and confusing truths'.

Contrary to what the 1984 ad promised, the Macintosh didn't smash a surveillance society, it enabled one.

If we scratch the surface, we see a far grimmer reality, one far more reminiscent of the world of surveillance pictured in the ad. Digital technologies not only enable communication, they also enable surveillance (Zuboff, 2019). At the same time, it is obvious that this surveillance doesn't operate in the same way as in Orwell's writings or in the 1984 Apple ad. Digital surveillance is omnipresent, but it is seemingly focused on entertaining us. In other words, the dystopia of 1984 is fused with Huxley's sketch of a future dystopia in Brave New World. In Huxley's world, people are not disciplined through surveillance in order to maintain a peaceful order; they are kept happy by consuming 'soma', a drug that artificially produces that happiness. In contemporary societies, we see that surveillance is used to show us the content that makes us happy, to produce soma. This surveillance doesn't necessarily succeed in changing our (discursive) behavior or controlling our interaction out of fear. It does, of course, keep us hooked to our screens, scrolling away, unleashing dopamine. Most people still have the idea that they are free, and that digital surveillance only results in more advertisements and good content following them around the web.

Especially during the rise of web 2.0, most users were hardly aware of digital surveillance, let alone that they adapted their digital behavior to it. Scholars therefore described the digital surveillance of commercial platforms as a 'non-opticon' or crypto-opticon, referring to a state of being watched without knowing it (Vaidhyanathan, 2008). This 'invisibility' of digital surveillance was intentional, as digital platforms want us to be completely at ease so that we show our true interests. Our true interests are valuable; they are the raw material of surveillance capitalism (Zuboff, 2019). In light of this, Cheney-Lippold (2011) stresses that the digital surveillance of platforms can be seen as a form of 'soft biopolitics' that subtly guides us as online consumers and as online and offline citizens. This type of surveillance, Bauman and Lyon (2013) argued, is focused on the economic activity of the consuming middle class. For those classes, control is now complemented with seduction through surveillance.

As a result of this reality, scholars have stressed that we now live in 'post-panoptic' times (Arnaut, 2012). The post-panopticon, argues Arnaut (2012), is interactive and decentralized. Whereas the Benthamian panopticon stressed the importance of architecture to organize surveillance in a specific place, post-panoptic surveillance 'can process without space. That means, surveillance power does no longer need a certain grounding to settle on, rather it can operate like flows' (Basturk, 2017). Digital media are central to the idea of post-panopticism, as it is social media that afford 'the many to watch the few, as much as the few are watching the many' (Boyne, 2000). CCTV, GPS-traceable mobile phones and digital technologies allow people to surveil and be surveilled. The notion of the post-panopticon, at least in theory, is associated with a redistribution of power. Boyne (2000, p. 301), for instance, argued that the 'machinery of surveillance is now always potentially as much in the service of the crowd as it is of the executive'. While this may have been true for the middle classes in society, it is obvious that certain groups still feel the disciplining power of surveillance.

The ban-opticon and the crumbling of digital utopianism 

Even though surveillance has been omnipresent for quite some time, for a long time people didn't seem to see, feel or experience this world of massive surveillance as anything near the dystopian reality we saw in the 1984 ad. Many imagined their screens to bring that colorful world where we are all free and (can) think differently. This was at least partially the result of the digital utopianist framing of digital media. People learned to understand digital technology and online content production as empowerment and freedom. Hardly ever was the question asked how all of this was paid for and how it affected private lives and society at large.

This digital utopianist normality has been crumbling over the last five years. Academic research found that people were becoming increasingly conscious of algorithmic surveillance and of the surveillance of others on digital platforms. They acquired algorithmic imaginaries (Bucher, 2018), actively 'imagine surveillance' (Duffy), and adopt discursive strategies to avoid algorithmic interventions (be it being targeted by companies or politicians, being shadow banned, or even being de-platformed). Especially in relation to controversial or polarizing political topics, we see that users adopt algospeak – that is, discourse that is carefully designed to circumvent algorithmic control – in order to stay integrated on digital platforms. The impact of algorithmic surveillance is not just superficial, nor does it only operate as a 'soft biopolitics'; it also affects how people talk, interact and behave. In other words, it disciplines people, and it impacts culture (Maly, 2023) and power relations in society.

Interestingly, surveillance works in complex ways. Sometimes we want to protect our privacy and thus want to become invisible. In other contexts, surveillance makes one invisible precisely when one wants to be visible. Palestinian activists, for instance, want to get their voice out, but as a result of algorithmic surveillance and moderation, their voices are made invisible. They thus feel the need to adjust their discourse to avoid invisibility. In such contexts, it becomes clear that surveillance power is not only in the hands of a specific digital elite; platforms also enforce rules that governments install. In other words, they reproduce existing power relations. Groups that are targets of surveillance, moderation and control don't have the luxury of staying ignorant of digital surveillance. Breaking platform rules can have consequences outside the platform: you can get arrested and prosecuted. Bigo (2008) introduced the notion of the ban-opticon – consisting of 'ban' and 'opticon' – to stress that surveillance is not necessarily focused on 'all people' but on people on whom a 'ban' rests. This ban-opticon, just like the post-panopticon, stresses the translocal nature of surveillance: it is embedded in large transnational security networks. In line with Foucault, Bigo describes the ban-opticon as a dispositif, by which he highlights that surveillance is not limited to the actual practice of surveilling, but is connected to, informed and realized through discourses (which define who needs to be surveilled), institutions (from governments to platforms), architecture and so on. Surveillance, in Bigo's work, is about the focus on specific groups of people that are defined as security threats. Think about how Muslims or asylum-seekers became the subject of intense surveillance over the last three decades.
While the notions of the non-opticon and the post-panopticon are productive for understanding commercial surveillance in the context of surveillance capitalism, they also obfuscate that specific groups are still very much subject to control.

The hyper-opticon

Contrary to what the 1984 ad promised, the Macintosh didn't smash a surveillance society, it enabled one. Surveillance is omnipresent around the globe. Digital technologies have created unprecedented surveillance capabilities in the hands of a limited group of companies. It is obvious now that computers and social media do not create an equal relationship, as companies control the data, have huge processing power, and have the power not only to store but also to link and search different corporate databases. The ban-opticon shows that on top of the 'post-panoptical logic' of surveillance, a different type of surveillance operates that still disciplines people.

The Snowden revelations showed an even more worrying reality. Former NSA contractor Edward Snowden collected and shared proof of the colossal scale of American and British mass surveillance activity. Among other things, he showed that those commercial data are not only used to steer consumer behavior or to connect users to advertisers; they are also at the disposal of the US government and its security partners. Interestingly, Snowden proved that the NSA also has access to the servers of Apple (Greenwald, 2014: pp. 20, 49, 50, 68 and 94). We are thus in a situation where social media and digital devices enable surveillance that permeates every aspect of our lives. And even though it is obvious that all those types of surveillance do not have the same disciplinary impact, it is hard to ignore that this surveillance steers, modulates and sometimes even disciplines individuals and specific groups.

Contemporary societies are maybe best described through the notion of a hyper-opticon. 'Hyper' refers to the rising complexity of surveillance: surveillance now operates in complex networks on different scales, with many centers of surveillance that all enforce specific norms. In a hyper-opticon, people are seen in many ways depending on who they are, where they are and how they communicate. Through digital interaction they become visible to others and can watch others, but they are also datafied by large commercial actors that try to modulate and steer their behavior. These databases, in turn, can be used at scale by governments in order to control 'insecurity' (Bigo, 2008). In other words, the panopticon, non-opticon, post-panopticon and ban-opticon potentially operate simultaneously and are connected in large polycentric, transnational, layered and stratified networks. The non-opticon can thus quite quickly turn into a ban-optical surveillance tool, and guiding and modulating can turn into direct control with a large disciplining capacity.

In the hyper-opticon there are many centers of surveillance, ranging from national surveillance by state security services to transnational security networks like the NSA's. The base of this hyper-opticon is that, from the moment we use digital platforms or even digital tools, we all become the object of commercial surveillance by digital powerhouses like Google, Facebook, Amazon or TikTok, and by thousands of small actors like your local supermarket, webshops and online advertisers. In that same base layer of surveillance, we are also the object of the many-to-many surveillance that digital platforms enable. On top of that foundation of omnipresent surveillance, ban-optical, or even panoptical, forms of surveillance are built. It is thus clear that surveillance power is not 'decentralized' or 'centralized', but layered and stratified, operating at different scales and in different centers. One such center – take Facebook or Instagram as an example – is clearly centralized, but it cannot be detached from other actors who can temporarily access those data.

This hyper-opticon is a dispositif in the Foucauldian sense. It has clear material dimensions: surveillance is made possible through a huge global digital architecture. Cloud servers, internet cables, modems, platform interfaces and algorithms, together with our personal digital devices, form global surveillance networks. This hyper-opticon also has, as I have shown, ideological dimensions. Digital utopianism, with its promise of a free and joyful individual life empowered by digital tools - as illustrated in much of the digital marketing talk since Apple's 1984 ad - has not only normalized the use of digital technology but has also given rise to a naïve trust in the benevolent nature of those technologies. This trust was necessary for large institutions like Facebook, Instagram, TikTok and Twitter to use surveillance to keep us hooked and to scale up. They are of course not the only institutions that co-construct this surveillance network. A plethora of other actors, including private moderation companies and national and international security agencies, have direct or indirect access to data or an impact on how data is collected. This access is authorized by laws that were put in effect after all kinds of discourses on 'security' and danger were normalized.

2024, 1984 on steroids? 

Since 1984, a lot has changed. Digital capitalism is now a dominant global force, economically but also culturally. The rising economic power of digital platforms is grounded in surveillance-enabling infrastructure: cloud infrastructure, code, and of course our smartphones, computers and laptops allow commercial actors not only to surveil and datafy our behavior, but also to steer, direct and eventually discipline it. The promise of digital empowerment has been replaced by a more dystopian reality. The omnipresence of digital tools does not exactly set us free. We are struggling to keep up with our mail. Laborers are monitored and surveilled through their electronic devices. People produce aspirational labor in the hope of monetizing themselves. Social platforms have disastrous effects on retail, mobility services, the tourism sector and politics. And even the ordinary consumer of digital content is unknowingly producing attention labor that is monetized by platforms. The digital economy has profoundly affected all domains of life, and it is difficult to see how this empowers us. Yes, some influencers have made it big, and we can find more information than ever online, but the price we pay for our digital integration is much larger than the 'free use of platforms' suggests.

The digital economy is even being described in terms of a new feudalism: techno-feudalism, as Varoufakis (2023) calls it. Scholars in this vein stress how digital platforms enslave our minds and exploit us through surveillance, and how the digital economy is based on the surveillance and datafication of all digital interaction in order to extract value. The computer that was hailed as our liberator in the 1984 ad, and in digital utopianism in general, turns out to have become a tool of surveillance and the monetization of free labor, installing a new feudalist regime in which a limited group of fiefdoms 'controls' our minds and steers our online behavior. This massive commercial, and thus privately owned, surveillance is the foundation on which contemporary US state surveillance is built.

For many decades, the computer had a bad reputation. Apple's 1984 ad can be read as a fundamental change in how people perceived the computer. Forty years after that ad, we see the pendulum swinging back in the other direction. It is clear now that digital utopianism has normalized unprecedented surveillance and that the personal computer didn't create a free world. The reality of digital surveillance following a hyper-opticon logic shows how much control states and commercial actors have over our lives. Digital surveillance is omnipresent and uses different modes of power: from modulating and steering behavior, to (soft) biopolitics and repression to control individual and group behavior. That is not exactly the dream Apple sold customers in 1984. On the contrary: digital utopianism has normalized a very different reality.


References

Arnaut, K. (2012). Super-diversity: Elements of an emerging perspective. Diversities, Vol. 14, No. 2.

Barthes, R. (2000 (1972)). Mythologies. New York: Hill and Wang.

Basturk, E. (2017). A brief analyse on post panoptic surveillance: Deleuze & Guattarian approach. International Journal of Social Sciences, Vol. VI, No. 2.

Bauman, Z., & Lyon, D. (2013). Liquid surveillance: A conversation. Cambridge: Polity Press.

Bigo, D. (2008). Globalized (in)security: The field and the ban-opticon. In Terror, insecurity and liberty: Illiberal practices of liberal regimes after 9/11 (pp. 10-48). London & New York: Routledge.

Black, E. (2001). IBM and the Holocaust: The strategic alliance between Nazi Germany and America's most powerful corporation. London: Little, Brown and Company.

Boyne, R. (2000). Post-panopticism. Economy and Society, 29(2), 285-307. DOI: 10.1080/030851400360505.

Foucault, M. (1989). Discipline, toezicht en straf. De geboorte van de gevangenis. Groningen: Historische uitgeverij.

Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA and the surveillance state. London: Penguin Books.

Jobs, S. (2007). Steve Jobs announcing the first iPhone in 2007. YouTube.

Leskin, P. (n.d.). Apple cofounder Steve Wozniak was so excited about the iconic '1984' commercial, he offered to pay $400,000 out of pocket to run it during the Super Bowl. Business Insider.

Maly, I. (2023). Digital economy and platform ideologies. Diggit Magazine.

Maly, I. (2023). Metapolitics, algorithms and violence. Far-right activism and terrorism in the attention economy. London & New York: Routledge.

Turner, F. (2006). From counterculture to cyberculture. Stewart Brand, the Whole Earth Network and the rise of Digital Utopianism. Chicago: University of Chicago Press.

Vaidhyanathan, S. (2008). Naked in the 'Nonopticon': Surveillance and marketing combine to strip away our privacy. The Chronicle of Higher Education.

Varoufakis, Y. (2023). Technofeudalism: What killed capitalism. London: The Bodley Head.

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: Profile Books.