Conspiring Algorithms: Tracing the Anti-Vaccination and COVID-19 Conspiracy Movement on YouTube

by Didi Spaans

Image: Daniel Zender


“We’re not just fighting a pandemic; we’re fighting an infodemic,” World Health Organization (WHO) director-general Tedros Adhanom Ghebreyesus claimed in a 2020 speech on the COVID-19 virus.[1] While “infodemic” is not an established concept in academic research,[2] the term has been used to describe “information epidemics” in which statements mixing fear, speculation, and rumor are amplified and relayed worldwide by modern information technologies.[3] A report by the Center for Countering Digital Hate (CCDH) notes that, since the start of the international coronavirus vaccination program against SARS-CoV-2 (COVID-19, or ‘the coronavirus’), social media accounts that downplay the severity of COVID-19 and propagate anti-vaccination claims have increased their followings by at least 17 million people worldwide.[4] On the popular video-sharing platform YouTube, the CCDH calculates, 7.8 million people have subscribed to such accounts over the course of the pandemic.[5] YouTube, a company owned by Google, has been the most influential social media platform in propagating anti-vaccine and COVID-19 denial movements, with Facebook in second place. How can we understand the role that YouTube’s technological make-up plays in influencing the discussion on COVID-19 and the related vaccines, and what are the political implications of YouTube’s video-recommendation algorithms?

The current state of popular uncertainty regarding the nature of the virus and the safety of vaccination appears to be shaped by the vast amount of information that individuals find online. On the web, and specifically on social media, anti-vaccination movements have flourished. In addition, public reaction to government-imposed social restrictions aimed at COVID-19 containment often takes place on social media platforms. Since the very first reported cases of COVID-19, pharmaceutical companies have worked at unprecedented speed to develop an effective vaccine to help keep the virus under control. Nevertheless, the safety and efficacy of the vaccine have been, and continue to be, questioned by a large number of people.[6] This distrust seems to be partly founded on the fact that prior to COVID-19, no vaccine for an infectious disease had ever been developed in such a short amount of time, and moreover, that no vaccine for preventing a human coronavirus infection had ever existed.[7] Vaccine hesitancy has taken on many conspiratorial forms. Some of these theories claim that Microsoft co-founder and billionaire philanthropist Bill Gates invented COVID-19 in a secret laboratory.[8] Other theories claim that COVID-19 vaccination injects a chip into the arm that tracks one’s movements.[9] Others claim that the Great Reset, an economic plan offering proposals for recovery from the COVID-19 crisis, is orchestrated by a group of world leaders who fabricated the pandemic to take control of the global economy.[10]

Such notions are examples of “conspiracy theories.” Karen M. Douglas et al. define conspiracy theories as “attempts to explain the ultimate causes of significant social and political events and circumstances with claims of secret plots by two or more powerful actors.”[11] Conspiracy theories, Douglas et al. write elsewhere, are alternative worldviews attributing to perceived social, governmental, political, and even supernatural elites a power over “the people” exercised through the relations of production and ideological structures of domination.[12] Importantly, these theories can form monological, holistic belief systems: self-sustaining worldviews which make complex global systems intelligible. Those who do not believe in these theories often regard their adherents as “stupid” or “irrational.” These vocal attacks also happen the other way around: people who believe COVID-19 conspiracy theories regard those who adhere to the virus containment measures introduced by the government (masking, social distancing, self-isolation) as “sheep” who refuse to “think critically.”

Van Prooijen and Douglas have remarked that conspiracy theories flourish specifically during periods of crisis.[13] They state that “people who have a relatively strong external locus of control … are more likely to report high levels of interpersonal mistrust, paranoia, and belief in conspiracy theories.”[14] Feelings of uncertainty and powerlessness increase people’s tendencies to resort to narratives that go against claims made and measures taken by authoritative institutions.

COVID-19 conspiracy theories predominantly concern health-related topics: supposedly, governments and large corporations aim to either kill “the people” or to intentionally make them ill. Such beliefs are disseminated and made cohesive through online materials and the virtual communities that are organized around these materials. A pandemic has serious social, economic, cultural and political consequences, and its effects on society are profound and felt worldwide. When the first vaccine against COVID-19 was developed, many negative responses were posted on the internet through Western social media platforms.[15] As of now, 85% of people search for healthcare information online,[16] where they are susceptible to influence by misleading information. It is necessary to examine the operational structure of the web to understand the spread of COVID-19 conspiracy theories. It is the word “social” in social media, I argue, that strengthens existing coronavirus-related uncertainties and the conspiracy theories sprouting from them.

In recent years, social media has come to play an increasingly large role in facilitating consumption of news and information. This movement has led to an increase in ideological polarization as social media generates so-called echo chambers and filter bubbles,[17] where we are “hearing our own thoughts about what’s right and wrong bounced back to us by the television shows we watch, the newspapers and books we read, the blogs we visit online … and the neighbourhoods we live in.”[18] Distrust of authority and conspiracy theories thrive in these environments.[19]

YouTube, a platform on which anyone can post a video on the condition that it follows the content guidelines set by Google, is currently the most popular social media website for finding information about health issues.[20] At the same time, there has been a rise in cases of users being affected negatively by misleading information provided by YouTube’s recommendation algorithm.[21] With such a multitude of people searching for health information on YouTube, it is necessary to examine the ways in which the platform is structured. How does YouTube’s technical design influence and reinforce conspiracy theories related to the COVID-19 crisis and the global vaccination campaign?

Appeals on YouTube to emotion rather than reason have led to a widespread condition of distrust which, in turn, has resulted in a crisis of trust.[22] This crisis of trust corresponds to Lee McIntyre’s definition of “post-truth.”[23] In critical studies, the term “post-truth” refers to a historical shift in what Foucault calls the “regime of truth,” marked by the breakdown and reassembling of a particular apparatus of institutional truth production and maintenance.[24] However, I believe that it is not fruitful to solely study truth as a philosophical concept. Therefore, this article will leave aside the problem of defining absolute truth in virtual space. Rather than arguing that no truth is possible, which arguably is the same as stating that we will never understand each other, we should examine how truth claims manifest themselves in different spheres of life, be they virtual, cultural, or geographical. I follow N. Katherine Hayles in her argument that digital information is characterized by the “capitalist mode of flexible accumulation.”[25] Studying technology alone fails to take into account the ways in which information is implicated in the socio-political structures that make the information society of today possible. As digital culture accelerates the spread of conspiracy theories, it is necessary to study human-technology interaction to understand this spread.

Some conspiracy theories that circulate online are innocuous. Many COVID-19 conspiracy theories are not. Conspiracy-fueled distrust in science during the coronavirus pandemic has led to political polarization around the world and unnecessary loss of life.[26] Because I wish to understand how information circulates on YouTube and how this results in radical polarization, I build upon Jayson Harsin’s notion of emo-truth. Emo-truth, according to Harsin, is a manner in which truth is performed. Specifically, it is a performance of aggressive and often masculine trustworthiness that corresponds to a code of recognition.[27] We can observe the rhetoric of emo-truth in the most popular anti-vaccination videos on YouTube, which deny scientific expertise and perform displays of hate, violence and rage. The emotionally charged comments under these videos demonstrate that they operate on an affective level. Because these comments, which also contain emo-truth rhetoric, give us insight into the reception of anti-vaccination videos on YouTube, I will also examine them in this study. These videos, each of which appears on YouTube alongside a queue of recommended similar videos, attract a community of like-minded YouTube users who are repeatedly exposed to the same rhetoric. The popularity and similarity of anti-vaccination videos are a product of the YouTube recommendation system, which on the basis of keywords appearing in particular videos determines what content might be recommended to a user. This mediation, I will argue, gives anti-vaccination videos an inherent affective coherency which is knotted together by the viewer’s engagement with these videos through YouTube’s recommendation system.

Algorithms

Algorithms have been silently present throughout the course of this article. The background presence of algorithms is analogous to the manner in which they operate within the current COVID-19 crisis: algorithms control YouTube’s video-recommendation system, promoting to viewers content similar to what they have already seen. This reduces the diversity of content consumed by any particular viewer, a process which leads to ideological polarization. When such algorithmically determined processes have consequences impacting global health and public trust in institutions, as is the case in the ongoing COVID-19 crisis, it is important to understand and frame how polarization takes digital form in times of crisis.

When watching a video on YouTube, a viewer will not fail to notice the list of recommended videos displayed at the right side of the webpage. The composition of this list is determined by two elements: the user’s search history on YouTube itself and elsewhere, and the user’s digital profile, which YouTube determines according to specific algorithms programmed by the platform’s developers. Recommending videos to users on the basis of these datasets, YouTube seems to know exactly what each viewer likes, is interested in, and might want to watch next.[28] Selecting one video directs the viewer to others similar to the first. This process of data-informed content personalization tends to expose viewers repeatedly to the same or similar content and advertisements. This is the work of the algorithms.

What are algorithms and how can they be understood in relation to YouTube? In the book Introduction to Algorithms, Thomas H. Cormen et al. offer a timely description of the concept:

Now that there are computers, there are even more algorithms and algorithms lie at the heart of computing. … Informally, an algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output. An algorithm is thus a sequence of computational steps that transform the input into the output.[29]

According to this classic definition, an algorithm is a series of computational instructions put into a machine, which are to be followed step-by-step in order to solve a problem or achieve an optimal result. It is important to note, however, that this definition of an algorithm as a set of defined steps is somewhat of a simplification. What constitutes an algorithm has changed over time and the concept can be approached in a number of ways: technically, computationally, mathematically, politically, culturally, ethically, etc. Technical specialists, social scientists and the broader public consider and implement the term within different contexts.[30] At present, the term refers to more than simply a set of instructions. “Rather,” Mazzotti says,

the word now usually signifies a program running on a physical machine — as well as its effects on other systems. Algorithms have thus become agents, which is partly why they give rise to so many suggestive metaphors. Algorithms now do things. They determine important aspects of our social reality. They generate new forms of subjectivity and new social relationships.[31]

When placing algorithms within a discursive framework, we should not only be aware of what algorithms are, but also of what their application and impact is. What is the role of the programmer in relation to the algorithm? Does the algorithm have agency? If so, what kind of agency is this? What is the role of computer users in relation to algorithmic structures? As algorithms solve problems by organizing what is unorganized, structuring what seems unstructured, they construct order out of chaos. Or as Ulf Otto states, “[t]hey introduce order into the world of data.”[32] This observation reminds us that algorithms perform the task computers were initially built for: finding order within numbers, reducing thinking to a kind of mechanical repetition. Algorithms calculate new experimental outcomes in order to reach an optimal outcome.[33]
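To make the textbook definition concrete, consider a minimal sketch in Python. It is illustrative only and bears no relation to YouTube’s code: a handful of well-defined steps that take an unordered list of numbers as input and return an ordered list as output, precisely the input-to-output transformation Cormen et al. describe.

```python
def insertion_sort(values):
    """A classic algorithm: well-defined steps that transform an input into an output."""
    ordered = list(values)              # work on a copy of the input
    for i in range(1, len(ordered)):
        current = ordered[i]
        j = i - 1
        # shift larger items one slot to the right until `current` fits
        while j >= 0 and ordered[j] > current:
            ordered[j + 1] = ordered[j]
            j -= 1
        ordered[j + 1] = current
    return ordered

print(insertion_sort([5, 2, 9, 1]))     # [1, 2, 5, 9]
```

Each line is an unambiguous instruction; given the same input, the procedure always yields the same output. Recommendation algorithms are vastly more elaborate, but they share this basic character of ordering data according to explicit rules.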

When people speak of “YouTube’s algorithm,” they often refer to YouTube as a company and the choices it makes. The algorithm, and the way it is structured and implemented, have become one and the same with the company. As Tarleton Gillespie says, “the term [algorithm] offers the corporate owner a powerful talisman to ward off criticism, when companies must justify themselves and their services to their audiences, explain away errors and unwanted outcomes, and justify and defend the increasingly significant roles they play in public life”.[34] The algorithm, then, is held responsible for a platform’s undesirable effects rather than its designer or the corporate owner. Is YouTube or its algorithm to blame for the spread on the platform of coronavirus-related misinformation and conspiracy theories?

YouTube’s recommendation system

To their designers, YouTube’s algorithms arise from a certain model: a protocol. This protocol entails the formalization of a goal for the algorithm, which is articulated in code. Ultimately, the company YouTube is organized to make profit, and its development and utilization of algorithmic video-recommendation technology reflects this objective. It is in YouTube’s interest to keep the viewer engaged by presenting videos that match the user’s interests, because this increases watch time and generates more advertising clicks.[35] YouTube’s recommendation algorithm is designed to achieve this engagement by promoting the videos that users are most likely to watch to the end.[36] The recommendation system is optimized for watch-through, because a user who watches a video in its entirety is likely to watch the next recommended video as well. In this way, the user comes in contact with as many advertisements as possible. YouTube profits from selling advertisements, which are placed adjacent to or embedded within videos. For this reason, AutoPlay (whereby the next video plays automatically without the user having to select it) is YouTube’s default setting, ensuring that users spend as much time as possible watching recommended videos. According to tech reporter and data scientist Karen Hao, 70% of all watch-time on YouTube is a result of the platform’s algorithmic recommendation system.[37]

This algorithmic recommendation system follows a two-step process to select the particular videos that will be presented to any particular user: firstly, it classifies videos according to a score based on performance analytics. This score is based on several elements including the popularity of the video, the date of its publication, the upload frequency of its creator, the amount of time users spend watching the video, and how long users stay on the platform watching other videos after viewing the video.[38] The second step in the algorithmic recommendation process is matching videos to users. This process is determined by the user’s watch history, their subscriptions, and what they do not watch.[39]
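This two-step logic can be sketched in a few lines of Python. The sketch below is a loose illustration under stated assumptions, not YouTube’s actual system: the `Video` and `User` fields, the weights, and the scoring formulas are all invented for the example. It only shows how a performance score (step one) might be combined with a user-matching step (step two) to produce a ranked list of recommendations.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    views: int
    avg_watch_fraction: float   # share of the video a typical viewer finishes
    days_since_upload: int

@dataclass
class User:
    watched_titles: set = field(default_factory=set)
    interests: set = field(default_factory=set)      # e.g. keywords from watch history

def performance_score(video: Video) -> float:
    # Step 1: rank videos on engagement-style analytics (weights are invented).
    freshness = 1.0 / (1 + video.days_since_upload)
    return video.views * video.avg_watch_fraction * (0.5 + freshness)

def recommend(user: User, catalog: list, k: int = 3) -> list:
    # Step 2: match scored videos to the user's history and interests.
    def match(video: Video) -> float:
        unseen = video.title not in user.watched_titles
        topical = any(word.lower() in video.title.lower() for word in user.interests)
        return performance_score(video) * (1.5 if topical else 1.0) * (1.0 if unseen else 0.1)
    return sorted(catalog, key=match, reverse=True)[:k]

# A hypothetical catalog and user, invented for illustration.
catalog = [
    Video("Vaccine rollout explained", 100_000, 0.45, 30),
    Video("SHOCKING truth they won't tell you!", 80_000, 0.70, 3),
]
user = User(watched_titles={"Vaccine rollout explained"}, interests={"truth"})
print([video.title for video in recommend(user, catalog, k=2)])
```

In this toy setup, the recent, heavily watched and still-unseen video rises to the top of the list, which is, at scale, all the real system needs to do: estimate, for each user, which videos will keep them watching.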

Radicalism, echo chambers, and filter bubbles

The design of the above-described algorithmic system may seem innocuous, but it has several problematic consequences. Because the platform is optimized to maintain user engagement, the algorithm tends to offer recommendations that reinforce a viewer’s already-existing outlook on the world. Emphasizing this familiarity effect, Alfano et al. note that “people tend to develop positive associations with the things, people, and concepts to which they’ve been directly exposed.”[40] Furthermore, they state that “people tend to believe or think they know the things that they’ve encountered before.”[41] The effectiveness of YouTube’s algorithm comes from its having been designed with two psychological considerations in mind: a viewer desires enjoyment, and a viewer is inclined to believe information that conforms to their worldview. YouTube praises itself for the way in which, it says, the platform creates communities. YouTube content creators (so-called YouTubers) mirror this language. Popular YouTubers often use nicknames for their audiences which suggest that these audiences are communities united in common interest.[42] The danger lurking in this interplay between the algorithm and its psychological effects is that it has the ability to generate echo chambers and filter bubbles rather than heterogeneous communities.

Dubois and Blank define the concept of echo chamber as “a situation where only certain ideas, information and beliefs are shared.”[43] Echo chambers occur when people with similar interests and ideology interact primarily with like-minded people in a closed group. When certain beliefs are stimulated by communication inside an isolated system, internet users become entangled within a web of selective exposure. Hence, echo chambers are primarily formed by users. Filter bubbles, alternatively, are a form of “algorithmic filtering which personalizes content presented on social media.”[44] Through personalized search engine results and recommendation systems, filter bubbles aggravate a user’s inclination to search for and consume media content that reinforces the user’s existing ideas, swiftly entrapping people in closed circles of knowledge.[45] In that they can distort one’s reality in ways that cannot be altered by outside sources, echo chambers and filter bubbles have the potential to create significant barriers to critical discourse.

A second peril of YouTube’s recommendation algorithm is that it seems to favor divisive, extreme and sensational content.[46] This kind of extreme content thrives within the recommendation system because it is highly effective in capturing a user’s sustained attention, which is, as mentioned before, one of the key metrics YouTube uses to sell advertisements. As a result, Röchert et al. explain that “[t]he YouTube recommendation algorithm partially paves the way for staying on the politically extreme path, especially if the user has had the impulse to visit something politically extreme from the beginning.”[47] But this is not always the case, Marc Tuters observes: “[a]cademic researchers exploring this phenomenon have … found that YouTube’s ‘recommendation algorithm’ has a history of suggesting videos promoting bizarre conspiracy theories to channels with little or no political content.”[48] Working within a continuous feedback loop of metrics data, algorithms are optimized to recommend content that users are most likely to watch. Extreme and sensational content is successful not just because it is more interesting than the sobering reality,[49] but also because algorithms operate in formats that engage a large number of users and learn from their engagement.

As a result, algorithmic recommendation procedures serve users with content that confirms their existing worldview. The fact that the recommendation system favors extreme content plays a major role in spreading and reinforcing radical ideas.

The Great Ban

YouTube’s algorithms, I have argued, lead users down specific ideological paths. But during the process of writing this article, I stumbled upon a problem that simultaneously demonstrated and frustrated my research. It had been my intention to investigate the algorithmic paths that facilitate the spread of holistic conspiracy theories which, in turn, carry enormous political and social consequences. However, in March 2021 the direction of my research was interrupted when YouTube started a campaign to actively ban videos containing material contradicting healthcare information issued by the WHO.[50] In my argument above, I have considered the algorithm — a highly complex theoretical term within the humanities[51] — according to an assumption that for YouTube its sole purpose is to serve the commercial objective of maximizing the amount of time that users spend watching YouTube videos. While I was familiar with YouTube’s fight against conspiracy theories, I was not aware of its active removal of COVID-19 misinformation, and specifically of anti-vaccination content. The ban on videos countering information provided by health organizations such as the WHO involves an altered algorithm as well.[52] Users interested in anti-vaccination-related videos no longer see content reinforcing such ideas, as this content is either deleted altogether or removed from the list of recommended content. This is an extremely political choice on YouTube’s part: the platform is making a truth claim that denies alternative worldviews. For this reason, I have added another dimension to the question as to what role algorithms play in the spread of disinformation in times of COVID-19: what are the political implications of YouTube’s claim to truth?

Videos spreading coronavirus-related conspiracy theories and misinformation about vaccines are not the only kind of content that YouTube moderates. Videos containing pornography, flat earth theories, and neo-Nazi propaganda have, since before the pandemic, been routinely “cropped” from the algorithm. The algorithmic suppression of videos on these topics, as well as on coronavirus skepticism, has led to the advent of “borderline content” that tests the limits of YouTube’s content rules. This kind of content continues to be recommended by the algorithm because, despite its borderline-objectionable themes, it tends to generate views. COVID-19 anti-vaccination content is especially potent as “borderline content” because the COVID-19 anti-vaccination movement is relatively young, such that the border between acceptable and unacceptable vaccine-related content is not yet clearly defined. According to YouTube’s policy, inappropriate and divisive content should be banned. These videos, however, align with the recommendation algorithm’s overall goal, which is to generate views.

Emo(tional) truth on Sky News Australia

In order to detect the “borderline” quality in anti-vaccination videos, I employ Jayson Harsin’s idea of emo-truth, which he defines as follows:

Emo-truth then is truth that often appears as “losing control.” While the surrounding promotional culture demands bragging (and that people be inured to it), emo-truth refers to the implosion of emotion, knowledge, and trust, in truth-telling/trust-giving and truth recognition and trust-granting. Emo-truth is aggressive, and must mix boasting with insults, attacks, and outrage. It must perform authenticity or truthfulness as aggressive emotion, in order to garner “active trust”.[53]

In short, emo-truth is where emotion serves an inference. It interferes with a user’s affective response and has the potential to alter one’s entire belief system.[54] Harsin takes Donald Trump’s media persona as a prime example of emo-truth’s inherently aggressive rhetoric, pointing to the former President’s extreme expressions, boastfulness, and willingness to insult. Harsin names several specific aspects of Trump’s digital persona that utilize emo-truth rhetoric: the abundant use of capital letters and exclamations in his social media posts, his brash body language, and the sarcastic and mocking tone of his voice all contribute to the “outrageousness” of his political communication.[55] Very often, the information Trump attempts to convey through his words, be they spoken or written, is loaded with racism, sexism and xenophobia. In this, Trump’s digital persona is exemplary of emo-truth rhetoric.

If the algorithm were granted any kind of anthropomorphic qualities, it would ‘like’ the emo-truth rhetoric. Many of the videos that it recommends most frequently have sensationalizing all-caps titles containing exclamation points and other “clickbait” devices. The effect of Donald Trump’s emo-truth rhetoric is that not only his partisans are drawn to his social media posts. Even those who do not support Trump’s political agenda can be fascinated by the manner in which he expresses himself on social media. Guillaume Chaslot, a former Google employee who helped to program YouTube’s recommendation algorithm, writes that “[e]ven if a user notices the deceptive nature of the content and flags it, that often happens only after they’ve engaged with it … As soon as the AI learns how it engaged one person, it can reproduce the same mechanism on thousands of users.”[56] As a consequence, both the algorithm and the user contribute to the promotion of deceptive content. Algorithmic recommendation of content based on user engagement frequently entails the promotion of incendiary and controversial content. The more borderline a video is relative to YouTube’s content moderation policy, the more engagement it generates. As recommendation algorithms are informed by user activity, they prefer content that provokes the engagement of its viewers. A single click can be enough.
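A toy simulation can make this feedback loop visible. Everything in it (the titles, the starting counts, the click rates) is an invented assumption rather than a measurement of YouTube; the point is only that when the probability of being recommended is proportional to clicks accumulated so far, a video that engages viewers slightly more often ends up being shown, clicked, and therefore recommended far more than the rest.

```python
import random

random.seed(0)

# Two hypothetical videos; the "borderline" clip starts with a slight engagement edge.
clicks = {"measured explainer": 10, "borderline outrage clip": 12}
# Assumed click-through rates when a video is actually shown (invented for illustration).
click_rate = {"measured explainer": 0.50, "borderline outrage clip": 0.55}

for _ in range(10_000):
    # Recommendation probability is proportional to the clicks accumulated so far.
    shown = random.choices(list(clicks), weights=list(clicks.values()))[0]
    if random.random() < click_rate[shown]:
        clicks[shown] += 1          # engagement feeds straight back into the weights

print(clicks)   # the clip with the early edge ends up dominating the recommendations
```

The loop never evaluates whether a clip is accurate or misleading; it only registers that it was clicked, which is precisely the dynamic Chaslot describes.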

Emo-truth rhetoric is strongly present in COVID-19 conspiracy and anti-vaccination videos, as well as in the comment sections of these videos. Two examples of videos which use emo-truth rhetoric are “There is a ‘disturbing’ element to the vaccine rollout”[57] and “Australians must know the truth – this virus is not a pandemic.”[58] Both of these videos were uploaded to YouTube by Sky News Australia, a right-wing 24-hour news channel that, at the moment of writing, has 1.47 million subscribers on YouTube. Despite this channel being favorable to COVID-19 conspiracy theories and encouraging anti-vaccination sentiment, its videos remain online as of May 2021.[59] This could be because Sky News Australia has been in a business partnership with YouTube since mid-2019.[60] As a result of this partnership, both parties have mutual financial interests. From its launch in 1996 to its takeover in 2018, Sky News Australia was little more than an oddity, located at the periphery of the Australian broadcasting system and watched only by a few distracted channel-hoppers.[61] When it started to shift its focus to digital platforms — hiring former Daily Telegraph digital editor Jack Houghton, entering partnerships with social media platforms, and covering non-Australian cultural figures such as Donald Trump, Greta Thunberg and Meghan Markle — the channel experienced an explosive growth in popularity. With its provocative right-wing editorial stance, Sky News Australia produces highly partisan opinion content targeted at a global audience. The channel’s response to the COVID-19 outbreak and the ensuing crisis has been to double down on its digital strategy: it frequently posts videos that deny the existence of COVID-19, insinuate that the virus was man-made, demonize scientific institutions, and encourage anti-vaccination sentiment.

The videos “There is a ‘disturbing’ element to the vaccine rollout” and “Australians must know the truth – this virus is not a pandemic,” as of May 2021 viewed an astonishing 1.4 million and 4.1 million times, respectively, are two products of this strategy. Both show Australian commentators and interviewees discussing the coronavirus and international vaccine production and distribution. One of these commentators is Alan Jones, whom Business Insider describes as “among the most sensationalist out of all the hosts.”[62] Jones’ catchpenny statements fit within the emo-truth rhetoric Harsin proposes, for example when he says: “What I do find more disturbing is the fact — I’m sure you’re not aware of this — [that] healthcare providers and doctors have [been] banned from revealing which vaccine they’re offering.” This assertion is demonstrably untrue, as visitors to the website of the Australian Government Department of Health can find out which vaccine they will receive.[63] Jones’ use of emotional emphasis to make questionable statements sound convincing is a strategy of emo-truth rhetoric. Intonation allows what is proven false to be claimed as true.

By deploying words such as “disturbing,” “disgraceful,” and “terrifying,” these Sky News Australia videos illustrate Harsin’s point that “[e]mo-truth pertains to a style (regardless of content that might be false) that is highly aggressive; it often demonstrates outrage, disgust, and humiliation.”[64] People often perceive these emotions to be in themselves “indexical signs of truth and/or honesty, because they supposedly are harder to fake.”[65] We detect this emotional rhetoric when the commentators position themselves in their accounts of the “truth,” as in lines like “that is how I felt, when the government paid millions of dollars, of our money for a lump of land in Leppington”[66] and “that’s what I have been saying for months.”[67] Both of these lines are articulated in tones indicating feelings of stress, alarm and anger. Conveying these feelings, the commentators of Sky News Australia position themselves as “the voice of the people.” The commentators become whistleblowers of the truths that government and scientific authorities do not share with the public. The title “Australians must know the truth,” which implies that Australians are being lied to, also demonstrates this self-positioning by the commentators. Both videos downplay the risk of COVID-19 and undermine the efficacy of the vaccines.

Harsin states that “[e]mo-truth pertains first to the perception by citizen-audiences that someone is a truth-teller because they address supposedly hot button topics, too controversial for more cowardly communicators to touch.”[68] Sky News Australia continually claims to have inside information, implying that to ignore any of their claims or suggested actions will bring serious adverse consequences. By targeting an audience that is already vaccine-hesitant and winning the active trust of this audience, Sky News Australia’s emo-truth rhetoric only reinforces the mistrustful viewer’s idea that governments and health institutions are deceitful. The overall sentiment in the comments sections of these videos illustrates the perception of Sky News as the ultimate truth-teller. Comments such as “Thank you SkyNews Australia for telling society a TRUTH. Unfortunately it’s very rare these days to hear something like that from world media,”[69] “It’s bad when US citizens have to get their news from Australia,”[70] and “Thank you AUSTRALIA Sky News. From USA..OUR NATION IS CENSORED LIKE COMMUNIST CHINA,”[71] exemplify a belief amongst viewers that Sky News Australia is the only source of substantial and trustworthy news. Furthermore, the blusterous rhetoric of these comment sections mirrors the emo-truth performance of the videos themselves: displays of emo-truthful investment are, following Harsin, insulting and boasting modes, expressed textually through an abundance of exclamation points and capitalized words.[72] Such comments are, Rose-Stockwell contends, “strong indicators of engagement.”[73] When processed through YouTube’s recommendation algorithm, “[t]his kind of divisive content will be shown first, because it captures more attention than other types of content.”[74] When users are engaged, the algorithm is as well.

In closely examining the comments sections of the two videos, we find that another misinformation-related danger lurks, however subtle it may be. Although several videos published on Sky News Australia’s YouTube channel are sympathetic to conspiracy theories involving the Great Reset, Bill Gates, and vaccine-induced DNA alteration, the two videos that I discuss above are not among them. But when we delve further into the comments sections of these videos, it becomes clear that many viewers are very much concerned with coronavirus-related conspiracy theories. The trigrams most common in the comments section of the video “Australians must know the truth — this virus is not a pandemic”[75] are, ordered by frequency, “the great reset,” “build back better,” and “new world order.”[76] Even though the Great Reset and Build Back Better are slogans related to various economic, social, political, and environmental programs to tackle COVID-19 and other global crises, the conspiracy-minded have interpreted these slogans as code words signaling the advent of a new world order in which the virus functions as a “plandemic”: a pandemic staged by the global elite to impose new forms of social control. In its other videos, Sky News Australia encourages the idea that these terms have become “proof” of a (global) conspiratorial plot by a cabal of elites. The interplay between new forms of emo-truth rhetoric and already-existing political uncertainty facilitates the development of contemporary conspiracy theories.
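For readers curious how such trigram counts can be produced, the short Python sketch below shows one straightforward approach; it is my own illustration, distinct from the scripts cited in the notes. It tokenizes each comment, slides a three-word window over it, and tallies the results.

```python
from collections import Counter
import re

def top_trigrams(comments, n=3):
    """Count three-word sequences across a list of comment strings."""
    counts = Counter()
    for comment in comments:
        words = re.findall(r"[a-z']+", comment.lower())
        counts.update(zip(words, words[1:], words[2:]))   # sliding three-word window
    return counts.most_common(n)

# Hypothetical comments, invented for illustration.
sample = [
    "The Great Reset is coming",
    "Build back better means the great reset",
    "Wake up, this is the new world order",
]
print(top_trigrams(sample))   # [(('the', 'great', 'reset'), 2), ...]
```

Run over thousands of scraped comments, a tally like this makes visible at a glance which conspiratorial vocabularies dominate a video’s reception.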

Recommendation to recommendation…

Conspiracy theories existed long before the invention of the internet, having been documented since ancient times in multiple cultures around the world.[77] Documented conspiracy theories date as far back as AD 64, when the great fire of Rome transpired and Emperor Nero, who was out of town when the fire erupted, was accused of deliberately starting the fire in order to seize power and rebuild Rome according to his own political vision. Nero, displeased, reacted by spreading his own conspiracy theory, which held that the Christian community was to blame for the fire. Nero’s conspiracy theory caught on, leading many Christians to be crucified or burned alive.[78] In the many centuries since the time of Nero, conspiracy theories have continued to capture individuals’ imaginations. What the internet has changed is scale and speed: it provides billions of people with information they would otherwise not obtain as quickly or efficiently, or at all, and it makes it much easier to communicate and form groups with like-minded people. The internet has thus accelerated the spread of conspiracy theories. A quick search on the web can give almost any statement substance.[79] At the same time, the internet also poses a potential challenge to conspiracy theorists. As Steve Clarke observes, in the same way that the internet allows people to instantly disseminate conspiracy-informed explanations for certain events, it similarly allows for anti- and non-conspiracy-theorists to express criticisms just as quickly.[80] With all of these considerations in mind, the internet — and YouTube as an extension of it — fulfills an important role in the circulation of contemporary conspiracy theories.

In the coronavirus pandemic, YouTube videos and their comments sections are not the only places where conspiracy theories are espoused. Until recently, the recommended videos listed next to these videos did so as well. When a user watched the video dismissing the factuality of the COVID-19 pandemic in October 2020, a month after the video was originally published and before YouTube’s strict campaign against misinformation began, the list of recommendations looked very different from what it looks like under YouTube’s altered algorithm. In October 2020, videos with titles such as “63 Documents the Government Doesn’t Want You to Read,” “Are We Being Told the Truth About COVID-19,” and “Global Elite’s ‘Great Reset’ Agenda (Shocking Discoveries Revealed)” appeared in the playlist of recommendations. In these playlist recommendations, we can clearly see the formation of echo chambers and filter bubbles where users dive further into misinformation about COVID-19 and the COVID-19 vaccines.[81] Writing this article, I have had a front-row seat to watch YouTube tweak and refine its recommendation algorithm. In late March and early April 2021, the list of recommendations accompanying this specific video consisted of non-conspiracy-promoting videos telling me how I, a “suspicious” content-chaser, should be aware of the many conspiracy theories online and should further be informed about the virus and the myths surrounding it.[82] In mid-April, the same query pushed me toward videos titled “Why I HAVEN’T yet taken the COVID VACCINE,” “Vaccine Passports: THIS Is Where It Leads” and “Perspectives on the Pandemic.” The goal of the algorithm has been changed multiple times, leading to different outcomes.

This change has manifested itself in multiple ways. First, YouTube has altered the programming of its recommendation algorithms. The result of this alteration is that when a hesitant web user queries their coronavirus-related insecurities online, they will no longer be directed to filter bubbles assenting to the idea that COVID-19 is harmless. Second, YouTube has started to ban and delete videos and accounts that promote “fake news” about COVID-19 and encourage doubts about vaccine safety. Third, alterations have been made to the auto-complete function of YouTube’s video search engine, found on YouTube’s default page: the first page displayed when a user visits the platform. These changes mean that search queries that place COVID-19 and the vaccines in a negative light are no longer suggested to users. Fourth, YouTube now promotes videos that follow WHO guidelines and that encourage coronavirus vaccination.[83]

These changes lead to a contradiction. On the one hand, YouTube’s policy prohibits content disputing the statements of the WHO. On the other hand, the algorithm is programmed to promote content that engages as many viewers and generates as many views as possible. If that promoted content is against the YouTube policy, it becomes borderline content. The videos denying that COVID-19 is a pandemic and doubting the safety and efficacy of vaccination fall under this category. This brings us to the following questions: What exactly do algorithms do? What is their agency — if they have any at all? I argue that the algorithm, and YouTube’s recommendation system specifically, has been enormously successful in exploiting already-existing popular opposition to COVID-19 containment measures. Nevertheless, even users who are not opposed to COVID-19 containment measures find that YouTube’s recommendation algorithm leads them to videos expressing fringe viewpoints. As a result of its tendency to recommend these kinds of videos which cast doubt on the WHO’s statements about the coronavirus, YouTube’s algorithm has become highly political.

Conspiring algorithms

The goal of YouTube’s recommendation algorithm is to generate user engagement. Sensationalist and conspiracy-promoting videos are often recommended by this algorithm because they have provocative “clickbait” titles and concern controversial themes which stimulate user engagement. YouTube’s commercial interest conflicts with the interest of global health, and YouTube’s promotion of videos which question scientific expertise may have dire consequences for humanity. The dynamics of the relation between humans and technology play an immense role within this conflict. To understand the political implications of algorithms on social platforms, it is crucial to acknowledge the role of users as participants in the world of data. If users are interested in a particular type of content, the algorithm will be as well. The algorithm responds to what people do online: what links they click on, how long they watch a video, and what they scroll through. When one attempts to understand YouTube and its use of algorithmic technologies, one should take into account the different factors involved in algorithmic systems. It is important to analyze what algorithms do, how they relate to YouTube’s prominent subcultures, what roles they play in political polarization, and how they are utilized in YouTube’s monetary and commercial ambitions. Many actors are involved in the distribution of information online: institutions, governments, web users, content creators, and uploaders — and, of course, YouTube itself.

Information distribution, then, is an interplay between human and machine. If the datafication of information has societal consequences, the human being is inevitably located in the “emergent processes through which consciousness, the organism and the environment are constituted.”[84] The individual is not simply a consciousness “in control.”[85] In a state of crisis, when declarations of truth concern global health, the consequences of datafication are weighty. If it is not a doctor but the web that tells us what is best, how can a person decide how to act? We must consider the role of YouTube and other social media platforms in determining and distributing claims of “truth.” Because social media algorithms reinforce outlooks on the world, it is necessary to delve further into the workings of social media and their role in global health.

What defines “conspiracy,” and who, in contemporary society, decides whether something is “true” or not? With so many people relying on social media for information, health-related and otherwise, we have entered a novel digital paradigm where neither God nor science determines what the “truth” is. Notions of truth may be out of our hands. In the digital space, algorithmic information processing is purely a mathematical process involving zeroes and ones. But without web-users to make decisions about “interestingness” or “clickability,” mathematical theories of information do not have any substance or meaning. The algorithm needs users, because it depends on user participation. If the algorithm “likes” sensationalist videos, it is partly because humans virtually perform and illustrate their interest in this kind of content. In order to achieve its intended optimal results, an algorithm needs to learn its users’ desires. As our lives are increasingly shaped by algorithmic processes, let us not forget those without which the algorithm could not be: the designers of the algorithm, the users that modify their practices in response to algorithms, and lastly, the institutions, companies and individuals who upload to the internet the data that algorithms process.

Bibliography

Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Duke University Press, 2020.

Ahmadi, Omid, Jacqueline Louw, Heta Leinonen, and Peter Yee Chiung Gan. “Glioblastoma: Assessment of the Readability and Reliability of Online Information.” British Journal of Neurosurgery (2021): 1-4.

Airoldi, Massimo, Davide Beraldo, and Alessandro Gandini. “Follow the Algorithm: An Exploratory Investigation of Music on YouTube.” Poetics 57, (2016): 1-13.

Alfano, Mark, Amir Ebrahimi Fard, J. Adam Carter, Peter Clutton, and Colin Klein. “Technologically Scaffolded Atypical Cognition: The Case of YouTube’s Recommender System.” Synthese, (2020): 1-24.

Bishop, Bill. The Big Sort: Why the Clustering of Like-Minded America Is Tearing us Apart. New York: Houghton Mifflin Harcourt, 2008.

Chadwick et al. “Online Social Endorsement and Covid-19 Vaccine Hesitancy in the United Kingdom.” Social Media+ Society 7, no. 2 (2021).

Clarke, Steve. “Conspiracy Theories and Conspiracy Theorizing.” In Conspiracy Theories: the Philosophical Debate, edited by David Coady, 77-92. Milton: Routledge, 2019.

Cormen, Thomas H. et al. Introduction to Algorithms. Third edition. Cambridge, Massachusetts & London: MIT Press, 2009.

D’Souza, Ryan S., Shawn D’Souza, Natalie Strand, Alexandra Anderson, Matthew NP Vogt, and Oludare Olatoye. “YouTube as a Source of Medical Information on the Novel Coronavirus 2019 Disease (COVID-19) Pandemic.” Global Public Health 15, no. 7 (2020): 935-42.

Davidson, James, Benjamin Liebald, Junning Liu, Palash Nandy, Taylor Van Vleet, Ullas Gargi, Sujoy Gupta et al. “The YouTube Video Recommendation System.” In Proceedings of the Fourth ACM Conference on Recommender Systems (2010), 293-96.

Douglas, Karen M., Robbie M. Sutton, and Aleksandra Cichocka. “The Psychology of Conspiracy Theories.” Current Directions in Psychological Science 26, no. 6, (2017): 538-42.

Douglas, Karen M., Joseph E. Uscinski, Robbie M. Sutton, Aleksandra Cichocka, Turkay Nefes, Chee Siang Ang, and Farzin Deravi. “Understanding conspiracy theories.” Political Psychology 40 (2019): 3-35.

Dubois, Elizabeth, and Grant Blank. “The Echo Chamber is Overstated: The Moderating Effect of Political Interest and Diverse Media.” Information, Communication & Society 21, no. 5 (2018): 729-45.

Gillespie, Tarleton. “Can an Algorithm be Wrong?” Limn 1, no.2 (2012).

Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago & London: University of Chicago Press, 2008.

Hao, Karen. “YouTube Is Experimenting With Ways To Make its Algorithm Even More Addictive.” MIT Technology Review, September 27 (2019).

Harsin, Jayson. “Post-truth Populism: The French Anti-gender Theory Movement and Cross-cultural Similarities.” Communication Culture & Critique 11, no. 1 (2018): 35-52.

—. “Trump l’Œil: Is Trump’s Post-Truth Communication Translatable?” Contemporary French and Francophone Studies 21, no. 5 (2017): 512-22.

 Havey, Nicholas Francis. “Partisan Public Health: How Does Political Ideology Influence Support for COVID-19 Related Misinformation?” Journal of Computational Social Science 3, no.2 (2020): 319-42.

Hoekstra, Anne-Lotte. “Up Next: Recommended for you by YouTube: A Case Study Analysis on the Implications of YouTube’s Advertising-based Business Model on the US 2016 Presidential Elections.” Student thesis, KTH Royal Institute of Technology, 2020.

Holt, Jared. “White Supremacy Figured Out How to Become YouTube Famous.” Right Wing Watch,  October, 2017.

Iboi, Enahoro A., Calistus N. Ngonghala, and Abba B. Gumel. “Will an Imperfect Vaccine Curtail the COVID-19 Pandemic in the US?.” Infectious Disease Modelling 5 (2020): 510-24.

Kaiser, Jonas and Adrian Rauchfleisch. “Unite the Right? How YouTube’s Recommendation Algorithm Connects the U.S. Far-Right.” Data & Society Media Manipulation, April 11, 2018.

Kátai, Zoltán. “The Challenge of Promoting Algorithmic Thinking of Both Sciences‐and Humanities‐oriented Learners.” Journal of Computer Assisted Learning 31, no. 4 (2015): 287-99.

Khorsun, Nelli. “Understanding and Responding to Algorithm: How Different Age Groups Reflect and Respond to Problematic Aspects of YouTube’s Algorithm.” Project. KTH Royal Institute of Technology, 2020.

Krasmann, Susanne. “Secrecy and the Force of Truth: Countering Post-truth Regimes.” Cultural Studies 33, no. 4 (2019): 690-710.

Ledwich, Mark, and Anna Zaitsev. “Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization.” arXiv preprint arXiv:1912.11211, (2019).

McIntyre, Lee. Post-truth. MIT Press, 2018.

Nielsen, Rasmus Kleis, et al. “Navigating the ‘infodemic’: How People in Six Countries Access and Rate News and Information about Coronavirus.” Reuters Institute, 2020.

Otto, Ulf. “Theatres of Control: The Performance of Algorithms and the Question of Governance.” TDR/The Drama Review 63, no. 4 (2019): 121-38.

Papadamou, Kostantinos, Savvas Zannettou, Jeremy Blackburn, Emiliano De Cristofaro, Gianluca Stringhini, and Michael Sirivianos. “Understanding the Incel Community on YouTube.” arXiv preprint arXiv:2001.08293, 2020.

Pariser, Eli. The Filter Bubble: What the Internet is Hiding from You. New York, NY: Penguin Press, 2011.

Patwa, Parth, Shivam Sharma, Srinivas PYKL, Vineeth Guptha, Gitanjali Kumari, Md Shad Akhtar, Asif Ekbal, Amitava Das, and Tanmoy Chakraborty. “Fighting an infodemic: Covid-19 fake news dataset.” arXiv preprint arXiv:2011.03327, 2020.

Orso, Daniele, Nicola Federici, Roberto Copetti, Luigi Vetrugno, and Tiziana Bove. “Infodemic and the spread of fake news in the COVID-19-era.” European Journal of Emergency Medicine 27, no.5 (2020): 327-28.

Prooijen, Jan-Willem van, and Karen M. Douglas. “Conspiracy Theories as Part of History: The Role of Societal Crisis Situations.” Memory Studies 10, no. 3 (2017): 323-33.

Röchert, Daniel, Muriel Weitzel, and Björn Ross. “The Homogeneity of Right-wing Populist and Radical Content in YouTube Recommendations.” International Conference on Social Media and Society (2020): 245-54.

Rose-Stockwell, Tobias. “This Is How Your Fear and Outrage Are Being Sold for Profit.” Quartz (2020).

Rosenbaum, Lisa. “Escaping Catch-22—Overcoming COVID Vaccine Hesitancy.” The New England Journal of Medicine 384, no.14 (2021): 1367-71.

Solomon, Daniel H., Richard Bucala, Mariana J. Kaplan, and Peter A. Nigrovic. “The ‘Infodemic’ of COVID‐19.” Arthritis & Rheumatology 72, no.11 (2020): 1806-08.

Spinelli, Larissa, and Mark Crovella. “How YouTube Leads Privacy-Seeking Users Away from Reliable Information.” Adjunct Publication of the 28th ACM Conference on User Modeling, Adaptation and Personalization, 2020.

Tufekci, Zeynep. “YouTube, the Great Radicalizer.” The New York Times , March 15, 2018.

Tuters, Marc. “Fake News and the Dutch YouTube Political Debate Space.” In The Politics of Social Media Manipulation, edited by Richard Rogers and Sabine Niederer, 217-37. Amsterdam: Amsterdam University Press, 2020.

Uscinski, Joseph E., Darin DeWitt, and Matthew D. Atkinson. “A Web of conspiracy? Internet and Conspiracy Theory.” In Handbook of Conspiracy Theory and Contemporary Religion, edited by Asbjørn Durendal, David G. Robertson, and Egil Asprem, 106-30. Leiden: Brill, 2018.

Vosoughi, Soroush, Deb Roy, and Sinan Aral. “The Spread of True and False News online.” Science 359, no. 6380 (2018): 1146-51.

Zimmer, Franziska, Katrin Scheibe, Mechtild Stock, and Wolfgang G. Stock. “Fake News in Social Media: Bad Algorithms or Biased Users?.” Journal of Information Science Theory and Practice 7, no. 2 (2019): 40-53.


[1] “Munich Security Conference,” World Health Organization, https://who.int/director-general/speeches/detail/munich-security-conference.

[2] Nielsen et al., “Navigating the ‘infodemic.”

[3] Orso et al., “Infodemic and the Spread of Fake News in the COVID-19-era”; Patwa et al., “Fighting an Infodemic”; Solomon et al., “The ‘Infodemic’ of COVID‐19,” 1806.

[4] “Disinformation Dozen,” The Center for Countering Digital Hate, 2020.

[5] Ibid.

[6] Rosenbaum, “Escaping Catch-22,” 1370.

[7] Iboi et al., “Will an Imperfect Vaccine,” 515.

[8] “A bizarre conspiracy theory puts Bill Gates at the center of the coronavirus crisis — and major conservative pundits are circulating it,” Business Insider (19 April 2020), https://www.businessinsider.nl/coronavirus-conspiracy-bill-gates-infowars-2020-4?international=true&r=US.

[9] “Coronavirus vaccine will be a way of injecting people with microchips? Don’t fall for this Facebook hoax,” Times of India (30 September 2020), https://timesofindia.indiatimes.com/life-style/health-fitness/health-news/coronavirus-vaccine-will-be-a-way-of-injecting-people-with-microchips-dont-fall-for-this-facebook-hoax/articleshow/78404956.cms.

[10] Goodman, Jack and Flora Carmichael, “The coronavirus pandemic ‘Great Reset’ theory and a false vaccine claim debunked.” BBC, https://bbc.com/news/55017002.

[11] Douglas et al., “Understanding Conspiracy Theories,” 4-5.

[12] Douglas et al., “The Psychology of Conspiracy Theories,” 539.

[13] Prooijen and Douglas, “Conspiracy Theories as Part of History,” 324.

[14] Id., 328.

[15] Chadwick et al., “Online Social Endorsement and Covid-19 Vaccine Hesitancy in the United Kingdom.”

[16] Ahmadi et al., “Glioblastoma,” 1.

[17] Pariser, The Filter Bubble, 7.

[18] Bishop, The Big Sort, 39.

[19] Zimmer et al., “Fake News in Social Media,” 41.

[20] D’Souza et al., “YouTube as a Source of Medical Information on the Novel Coronavirus 2019 Disease (COVID-19) Pandemic,” 935.

[21] Khorsun, “Understanding and Responding to Algorithm,” 1.

[22] McIntyre, Post-truth, 1-2.

[23] Ibid.

[24] Krasmann, “Secrecy and the Force of Truth,” 690.

[25] Hayles, How We Became Posthuman, 131.

[26] Havey, “Partisan Public Health,” 319.

[27] Harsin, “Trump l’Œil,” 514.

[28] Airoldi et al., “Follow the Algorithm,” 8.

[29] Cormen et al., Introduction to Algorithms, xiv.

[30] Gillespie, “Can an Algorithm be Wrong?”

[31] Mazzotti, “Algorithmic Life,” 33.

[32] Otto, “Theatres of Control,” 125.

[33] Amoore, Cloud Ethics, 2020.

[34] Gillespie, “Can an Algorithm be Wrong?,” 21.

[35] Alfano et al., “Technologically Scaffolded Atypical Cognition,” 3.

[36] Ibid.

[37] Hao, “YouTube Is Experimenting.”

[38] Davidson et al., “The YouTube Video Recommendation System,” 296.

[39] Ibid.

[40] Alfano et al., “Technologically Scaffolded Atypical Cognition,” 8-9.

[41] Id., 9.

[42] Examples of these are some of the most-viewed YouTubers, such as PewDiePie (108 million subscribers), who calls his fan base “bros,” and James Charles (25.5 million subscribers), who calls his “sisters.”

[43] Dubois and Blank, “The Echo Chamber is Overstated,” 729.

[44] Id., 731.

[45] Pariser, The Filter Bubble, 10.

[46] Papadamou et al., “Understanding the Incel Community on YouTube.”

[47] Röchert et al., “The Homogeneity of Right-wing Populist and Radical Content in YouTube Recommendations.”

[48] Tuters, “Fake News and the Dutch YouTube Political Debate Space,” 217.

[49] Vosoughi et al., “The Spread of True and False News online,” 1147.

[50] YouTube, “Community guidelines,” https://www.youtube.com/howyoutubeworks/policies/community-guidelines/.

[51] Kátai, “The Challenge of Promoting Algorithmic Thinking,” 287.

[52] Also see: Cooper, Paige, “How does the Youtube Algorithm Work? A Guide to Getting More Views,” Hootsuite, 18 August 2020, https://blog.hootsuite.com/how-the-youtube-algorithm-works/.

[53] Harsin, “Trump l’Œil,” 516.

[54] Id., 520.

[55] Id., 519.

[56] Chaslot, Guillaume, “The toxic potential of YouTube’s feedback loop,” Wired (13 July 2019), https://www.wired.com/story/the-toxic-potential-of-youtubes-feedback-loop/; See, as an example, Donald Trump’s post on his (meanwhile removed) Twitter page: “To Iranian President Rouhani: NEVER, EVER THREATEN THE UNITED STATES AGAIN OR YOU WILL SUFFER CONSEQUENCES THE LIKES OF WHICH FEW THROUGHOUT HISTORY HAVE SUFFERED BEFORE. WE ARE NO LONGER A COUNTRY THAT WILL STAND FOR YOUR DEMENTED WORDS OF VIOLENCE & DEATH. BE CAUTIOUS!” (posted 22 July 2018).

[57] “There is a ‘disturbing’ element to the vaccine rollout,” Sky News Australia, YouTube, (25 February 2021), https://www.youtube.com/watch?v=Wa1oZV7O3h0.

[58] “Australians must know the truth – this virus is not a pandemic,” Sky News Australia. YouTube, (16 September 2020), https://www.youtube.com/watch?v=kGBEaYEtiys.

[59] See other titles of Sky News Australia’s videos in regard to COVID-19: “Vaccine passports place all of our freedoms under threat” (21 April 2021), “Vaccine passports ‘serious concern’ for humanity’s ‘freedoms’ and ‘social interaction’” (15 April 2021), “The Great Reset is a ‘coup’ by the globalist elite” (11 February 2021), and “Plans to use COVID for Great Reset are ‘very sinister’” (8 February 2021).

[60] Blackiston, Hannah, “Sky News partners with YouTube, Microsoft News, Facebook and Taboola on content distribution deals.” Mumbrella, 5 August 2019, https://mumbrella.com.au/sky-news-partners-with-youtube-microsoft-news-facebook-and-taboola-on-content-distribution-deals-592069.

[61] Davies, Anne, “Sky News Australia is tapping into the global conspiracy set – and it’s paying off,” The Guardian, 23 February 2021, https://theguardian.com/australia-news/2021/feb/24/sky-news-australia-is-tapping-into-the-global-conspiracy-set-and-its-paying-off.

[62] Wilson, Cam, “‘In digital, the right-wing material is 24/7’: How Sky News quietly became Australia’s biggest news channel on social media,” Business Insider, 6 November 2020, https://www.businessinsider.com.au/sky-news-australia-biggest-social-media-channel-culture-wars-2020-11.

[63] See: “Which COVID-19 vaccine will I receive?,” Australian Government Department of Health, https://www.health.gov.au/initiatives-and-programs/covid-19-vaccines/getting-vaccinated-for-covid-19/which-covid-19-vaccine-will-i-receive.

[64] Harsin, “Post-truth Populism,” 46.

[65] Ibid.

[66] “There is a ‘disturbing’ element to the vaccine rollout,” Sky News Australia.

[67] Ibid.

[68] Harsin, “Post-truth Populism,” 45.

[69] YouTube comment, “Australians must know the truth – this virus is not a pandemic,” Sky News Australia.

[70] YouTube comment, “There is a ‘disturbing’ element to the vaccine rollout,” Sky News Australia.

[71] YouTube comment, “There is a ‘disturbing’ element to the vaccine rollout,” Sky News Australia.

[72] Harsin, “Trump l’Œil,” 517.

[73] Rose-Stockwell, “This is How Your Fear and Outrage.”

[74] Ibid.

[75] Other common trigrams were “not the vaccine,” “take the vaccine,” and “there is no.”

[76] I have implemented the following code in order to find the most used words in the comment section. To scrape all comments of the videos, I implemented the code https://github.com/MAN1986/LearningOrbis/blob/master/scrapeCommentsWithReplies.gs. Then, I used the code proposed by Tirthajyoti Sarkar, https://towardsdatascience.com/very-simple-python-script-for-extracting-most-common-words-from-a-story-1e3570d0b9d0, to find the most commonly used words.

[77] Uscinski et al., “A Web of Conspiracy?,” 118.

[78] Prooijen and Douglas, “Conspiracy Theories as Part of History,” 326.

[79] Uscinski and Atkinson, “A Web of Conspiracy?,” 107.

[80] Clarke, “Conspiracy Theories and Conspiracy Theorizing,” 79.

[81] The Wayback Machine, https://web.archive.org/web/20201122113123if_/https://www.youtube.com/watch?v=kGBEaYEtiys.

[82] I researched these websites in “incognito” mode and through a VPN in order to keep my results as “clean” as possible.

[83] Google, “COVID-19 medical misinformation policy,” https://support.google.com/youtube/answer/9891785?hl=en.

[84] Hayles, How We Became Posthuman, 288.

[85] Ibid.
