Research Review

Vaccine Mis- and Disinformation

Introduction

This research review addresses and expands on a critical case study at the intersection of information disorder and scientific knowledge: the antivaccination movement. In this review we examine the existing literature on the antivaccination movement and address emerging questions and research gaps. Through this specific case study, we seek to highlight the effects of mis- and disinformation on societies’ abilities to effectively communicate and act on credible scientific knowledge.

Events during the Covid-19 pandemic have shown that public health misinformation can have serious consequences for societies around the globe. With a potential Covid-19 vaccine still months or even years away, antivaccine protestors are already conducting disinformation campaigns and aligning with other conspiracy theorists to undermine public health initiatives (Molteni 2020). Experts have warned that vaccine conspiracy theorists, who have small but effective and growing online networks, could damage efforts to establish herd immunity[1] (Ball 2020; Wadman 2020; Garza 2020).

At the same time, concern over recent large-scale election interference campaigns has led to a wealth of scholarly attention paid to how dis- and misinformation may affect political processes such as voting or campaigns. While these issues are critical to the larger field of disinformation studies, this review demonstrates that the reach of problematic information online extends far beyond the boundaries of election-related concerns. Forms of problematic information such as dis- and misinformation have always intersected with broader knowledge fields, such as scientific expertise, with consequences for human and environmental health.

Understanding scientific misinformation

At their core, scientific mis- and disinformation are not categorically distinct from other forms of problematic information. MediaWell defines “disinformation” as a rhetorical strategy that produces and disseminates false or misleading information in a deliberate effort to confuse, influence, harm, mobilize, or demobilize a target audience, while “misinformation” is information that is unintentionally spread but tends to confuse or harm its audience. (For more on this complex topic, see our literature review on Defining “Disinformation.”)

Instead of conceptualizing scientific or public health mis- and disinformation as specific types, it’s perhaps more useful to think of them as among the myriad ways that the spread of problematic information poses threats to a safe, just, and healthy society. As with instances of more obviously political mis- and disinformation, problematic information about science has been heavily dependent on phenomena such as declines in public trust, denigration of expertise and expert figures, the rise of global populism, and the amplifying quality of social media (Wanless and Berk 2019; Bennett and Livingston 2018; Elsasser and Dunlap 2013).

The value in addressing mis- and disinformation as they relate to specific case studies lies in those studies’ ability to clearly demonstrate the material effects of a polluted information ecosystem. While politically geared propaganda or election interference can result in lived consequences, their effects on institutions or outcomes can be difficult to measure, or to point to as “evidence” of an information problem (Benkler 2019; Karpf 2019). However, a look at scientific misinformation, especially related to health concerns, presents an opportunity to see the immediate and dire consequences of a problematic information ecosystem, including massive and costly public health crises. The current Covid-19 pandemic provides a tragic example of this: false information about viral spread, medications, and symptoms can come at the cost of human lives (United Nations 2020).

Addressing the antivaccination movement as a scientific mis- and disinformation problem is an urgent need—perhaps now more than ever, seeing that the path to mitigating Covid-19 will likely include a massive vaccination campaign on a global scale. In a theoretical sense too, the antivaccination movement demonstrates the individual- and collective-level consequences of larger information problems such as the denigration of scientific expertise, the loss of trust in institutions, and the influence of nefarious actors.

The antivax movement

The modern antivaccination movement can best be described as an unwillingness to participate in routine and necessary vaccination procedures, against the recommendations of credible public health institutions and doctors. This can range from outright refusal of any sort of vaccine to alterations in accepted vaccine schedules. Though there are different ways to define the antivaccination movement, we use it here to describe both “antivaxxers,” or active antivaccination advocates, and those expressing any degree of “vaccine hesitancy,” a term used to “depolarize the ‘pro’ versus ‘anti’ vaccination alignment and to express the spectrum of parental attitudes toward vaccines” (Edwards and Hackell 2016; see also Cooper et al. 2018; Larson et al. 2014). While vaccine hesitancy has been defined in a number of ways, the World Health Organization’s SAGE Working Group on Vaccine Hesitancy defined the term as “a behavior, influenced by a number of factors including issues of confidence (do not trust a vaccine or a provider), complacency (do not perceive a need for a vaccine or do not value the vaccine), and convenience (access)” (WHO n.d.; see also Edwards and Hackell 2016).

Within the United States specifically, an increasing number of people have chosen not to vaccinate themselves, or their children, despite widespread and well-established evidence that vaccinations pose no true threat to human health and bring considerable value to shared public health. Even though the vast majority of people still participate in routine vaccination, and true antivaccination advocates are a vocal minority, vaccination rates have declined in recent years, leading to a resurgence of previously eradicated diseases such as measles and whooping cough (Feldscher 2017; Patel et al. 2019). Additionally, although the term “antivax” has typically been applied to childhood vaccines, vaccine reluctance has led to concerns about controlling future epidemic illnesses as well as routine illnesses such as the seasonal flu (Johnson et al. 2020). Reasons for vaccine refusal are complex, and as some scholars argue, frequently misunderstood (Gottlieb 2016; Wang et al. 2015). Many of the most extreme cases center on unsubstantiated concerns about harm, such as claims that vaccines cause autism, that they contain dangerous preservatives or chemicals, or conspiratorial ideas that they are part of a hidden government scheme. As Navin (2015) notes in Values and Vaccine Refusal, though, reasons for vaccine hesitancy are also rooted in much larger societal value shifts that exist somewhat separately from pseudoscientific or conspiratorial claims. For example, an increasing societal emphasis on the individual has led some parents to embrace a “hyper-individualism,” under which they are unwilling to accept the small individual risk that vaccination poses to their children, even though vaccination prevents the much larger community-wide risk of an unvaccinated populace.

Research on the antivaccination movement is a relatively established field, especially in contrast to many of the new areas of interest in the rapidly developing realm of disinformation studies. There are, however, unanswered questions. Much of the emerging research on vaccine mis- and disinformation focuses on filling two large gaps: (1) understanding the actual producers of the problematic information and (2) understanding how antivaccine material relates to larger patterns of decline in institutional trust and scientific credibility, both of which are addressed later in this review.

Existing research

One of the first points of scholarly consensus on vaccine hesitancy is that antivaccine sentiment has existed for as long as vaccines themselves. Whether it’s the Anti-Compulsory Vaccination League formed in response to the 1867 Vaccination Act or the 1982 documentary DPT: Vaccine Roulette, examples of resistance to compulsory vaccination are not new (Kata 2012; Grant et al. 2015). Despite this long history, though, the antivaccination movement has become a major public health concern over the past two decades, with the World Health Organization citing vaccine hesitancy as one of the top ten threats to global health in 2019 (Ratzan et al. 2019; WHO 2019). In the face of this health crisis, much of the health communication literature from the past 15 years has sought to understand the seemingly sudden rise of antivaccination narratives and the most effective ways to combat them.

As with other aspects of disinformation studies, such as election interference, coordinated disinformation campaigns, trolling, and microtargeting, scholars have suggested that the answer to the “Why now?” question is closely tied to technological affordances (Hutchby 2001) and the rise of social media. Early studies on the internet and vaccine uptake showed that the internet was a primary resource for parents seeking information about childhood vaccination (Harmsen et al. 2013). The shift from a passive digital ecosystem to an interactive one has allowed consumers to interact with antivaccination materials in ways that were previously unavailable (Kata 2012; Witteman and Zikmund-Fisher 2012; Grant et al. 2015). With the rise of interactive online interfaces, forums, and social media, the concern was no longer what information about vaccines patients were accessing, but rather who was actively influencing decision making about online information via live commentary, posts, or images (Witteman and Zikmund-Fisher 2012).

Building from these changes, many have gone on to argue that the information opportunities provided by the internet have ushered in a postmodern medical paradigm in which the “school of lay medicine” has been able to flourish, so that patients no longer have to rely on experts for health-care decision making (Kata 2010, 2012). Seeking “online health information” has become common practice, and though potentially empowering for the patient, it requires consumers to actively make decisions about what information to trust or not to trust (Meppelink et al. 2019; Smith and Graham 2017). This responsibility, combined with the fact that social media allows anyone to become a publisher of information, has resulted in the flattening of expertise, where on the internet the medical knowledge of an expert appears effectively equal to the uninformed commentary of a random person behind a screen (Hopf et al. 2019; Kata 2012). In discussing how users have adapted to the affordances of online technologies, Grant et al. (2015) point out that “it becomes more difficult to impose the authority of establishment medicine on online discourse.” This creates a prime environment for antivaccination proponents to spread their messages (Bean 2011). Adding to this difficulty is the fact that search engines like Google, and sites such as Facebook, may automatically surface controversial or false health content because of algorithmic engagement metrics (Stöcker 2020). Users who are actively seeking answers to vaccine-related questions may already have the cards stacked against them when it comes to finding credible or authoritative sources.

Social media and the connectivity of the internet can undoubtedly benefit health-related causes by acting as a gathering space where users can find communities not available to them in doctors’ offices (Kata 2012). Online forums, support groups, and awareness campaigns have become a critical part of the overall health infrastructure for patients. At the same time, as Kata points out, this connectivity can also bring together fringe groups whose members may then “easily and uncritically interact with like-minded individuals online” and eventually “fall into a trap of self-referencing and mutually reinforcing links that can fool users in[to] believing there are many who share their beliefs, when in reality it may only be a small, committed group” (2012). Johnson et al. (2020) expand on this, pointing to evidence that although clusters of antivaccination Facebook users are fairly small, they can have an outsized influence on users who are “undecided.”

Finding ways to combat such an environment, and the health consequences that come with it, has proven difficult. As Grant et al. (2015) indicate, antivaccination advocates do not generally rely on reasoned, evidence-based arguments to make their claims, which makes scientific evidence fairly useless in combatting false claims (see also Kata 2010; Bean 2011; Moran et al. 2016). Additionally, antivaccine arguments are constantly changing and adapting to new events and challenges, while also appealing to a variety of sentiments, such as conspiratorial thinking, freedom, resistance to government authority, preventing harm against children, and suspicion of experts (Bean 2011; Smith and Graham 2017). In a recent study, Pluviano et al. (2019) found that attempting to debunk potential myths about childhood vaccines using facts and statistics proved exceedingly difficult, resulting in significant “backfire effects,” where parents dug further into false beliefs (see also Nyhan et al. 2014; for a competing view suggesting a limit to the backfire effect, see Wood and Porter 2018).

In the wake of the “failure” of fact, scholars have investigated other means of combatting false beliefs about vaccines. As with efforts to mitigate other forms of disinformation, increasingly common suggestions include media literacy programs and initiatives to improve patients’ ability to identify credible sources. A major hurdle to this type of solution is the trap of “selective exposure,” in which internet users are more likely to view information that aligns with their beliefs as “credible and useful,” a phenomenon that Meppelink et al. (2019) found heavily influences users seeking information about vaccines (see also our research review on Contexts of Misinformation). Complicating things further, studies have shown that parents with higher “health literacy” (defined as the extent to which people are able to use information sources in making informed health decisions) are actually less likely to vaccinate their children, perhaps because they perceive online health information as more reliable than it actually is, or because they view themselves as more knowledgeable than they actually are (Meppelink et al. 2019; Amit Aharon et al. 2017).

Emerging questions

As with many other disinformation campaigns, we know a good deal about what kind of bad information about vaccines is being spread, and how, but very little about those who are actively producing it and what their exact motivations might be (Ward et al. 2016; for more, see our research review on Producers of Disinformation). Much of the existing literature focuses on those who might passively spread misinformation about vaccines for a variety of reasons, but there is a large gap in addressing those who are actively and intentionally spreading disinformation.

One potential explanation for some instances of vaccine-related disinformation is profit (or social capital and influence). The classic example is disgraced former physician Andrew Wakefield, who published a fraudulent study in 1998 (ultimately debunked and retracted 12 years later) linking the measles, mumps, and rubella (MMR) vaccine to the development of autism in children (Deer 2011; Rao and Andrade 2011). The study was eventually exposed as having been orchestrated for financial gain, yet Wakefield and his message continue to be frequently cited by antivaccination proponents (Boseley 2018; Belluz 2018).

Profit, however, is only one potential motive for producers of vaccine disinformation. There is evidence that antivaccination disinformation has increasingly been used as a political “wedge issue” and a means of foreign election interference (see our Election Interference research review). Recent research into vaccine-related content on Twitter from 2014 to 2017 found that Russian bots actively spread controversial vaccine content using the hashtag #VaccinateUS in an attempt to “amplify” and create “false equivalence” within the vaccinate-versus-antivaccinate discourse (Broniatowski et al. 2018). Kirk (2019) points out that this effort was likely designed to “ramp up social discord, erode trust in public health institutions, and exacerbate fear and division in the United States.” #VaccinateUS tweets frequently cited issues such as “racial/ethnic divisions, appeals to God, and arguments on the basis of animal welfare” along with commentary about socioeconomic divisions (Broniatowski et al. 2018). This, Broniatowski et al. argue, was clearly an attempt to tie the vaccine debate to other divisive issues in US politics.

The case of #VaccinateUS brings into view questions about how producers of vaccine-related disinformation may target citizens or institutions in the future. Though antivaccination rhetoric, unlike climate change mis- and disinformation, appears not to be as clearly embedded within a specific partisan identity, it could still be utilized in a manner similar to the #VaccinateUS campaign to further political polarization or hinder healthy democratic debate (Baumgaertner et al. 2018; McCoy 2018). Profit motive too will likely continue to be a problem, with Broniatowski et al. noting that alongside the #VaccinateUS campaign, there was a significant amount of vaccine-related clickbait designed to generate advertising revenue. In light of these phenomena, researchers recommend that public health officials develop better ways of monitoring online conversations about vaccines, while also examining the offline activities of antivaccine advocates to better understand their specific motivations (Ward et al. 2016; Broniatowski et al. 2018).

A closer look at the active producers of antivaccine disinformation still cannot account for the continued spread of unintentional misinformation among the public. This is perhaps a question partially answered by some of the overarching mechanisms and issues highlighted in our forthcoming “How Misinformation Spreads” research review (see also Pennycook and Rand 2019). More specifically, though, some scholars have looked at what factors may make individuals more susceptible to antivaccine misinformation, or more likely to spread it among their networks. A study by Tomeny et al. (2017) provides further evidence that women with young children may be more likely to share antivaccination misinformation, and also found that “anti-vaccine beliefs and behaviors” were more strongly associated with higher-income households than with lower-income ones.

The role of political ideology in antivaccination beliefs in the US is contested; Rabinowitz et al. (2016) suggest that, so far, much of the evidence on whether the political right or left is more likely to share or believe antivaccination misinformation is largely ambiguous. What is clearly true, however, is that antivaccination beliefs span the entire US political spectrum. For example, some aspects of political ideology, such as the supposed presence of an elite “counter-cultural” left, may promote antivaccination beliefs among left-leaning individuals. On the right, meanwhile, a general skepticism toward science and expertise (including the denial of climate change) may also promote antivaccine sentiments (Koltai and Fleischmann 2017). Recently, Hornsey et al. (2020) attempted to understand the impact of President Donald Trump’s antivaccine-oriented tweets. They initially found that Trump voters seemed to be more concerned about vaccines than the population at large, but that association disappeared after controlling for conspiratorial thinking and political conservatism, pointing to the effect of those two factors rather than of Trump’s tweets themselves. Overall, the exact characteristics that may make individuals more susceptible to antivaccination misinformation, or more likely to share it, are not perfectly clear.

Other emerging areas of research dealing with antivaccine material involve a shift outside of the scholarly realm of health communication. As with other phenomena within disinformation studies, some have suggested that antivaccine mis- and disinformation can be viewed within larger societal contexts involving the breakdown of democratic norms and a “crisis of epistemology” (Hopf et al. 2019; Davis 2019; Phillips 2019). In recent years, as some scholars have become increasingly concerned with the direct impacts of election interference and large-scale disinformation campaigns, others have turned their attention to the structural drivers of problematic information. Specifically, Benkler (2019) proposes that as a result of political-economic factors including widespread economic insecurity and the failure of elites to follow through on the promise of prosperity, there has been a massive loss of trust in institutions in the United States, leading to democratic breakdown. This loss of trust provides a fertile breeding ground for what some call the “post-truth” era.

Emerging research ties antivaccination mis- and disinformation to these larger societal contexts. Davis (2019) suggests viewing antivaccine discourse online through a “post-normative scope” that allows us to understand it as a type of discourse that ultimately undermines democratic ideals, the rules of argumentation, and the ideal of the public sphere. Others too, such as Hopf et al. (2019), have suggested that vaccine mis- and disinformation can and should be viewed as part of a larger pattern of scientific misinformation where the credibility of knowledge and the scientific method itself have been actively threatened.

Finally, some scholars have begun to tackle the “What do we do now?” question as it relates to combatting vaccine-related mis- and disinformation. Increasingly, the suggested answer seems to be narrative messaging. A common feature of antivaccine material is its reliance on emotionally charged narratives, which generally appear in the form of graphic, emotion-driven horror stories about the supposed consequences of vaccines (Shelby and Ernst 2013). This storytelling strategy enables misinformation to spread effectively throughout networks, because the narrative being presented is seemingly emotionally powerful enough to counter traditional fact-checking or references to scientific evidence (Shelby and Ernst 2013; Moran et al. 2016; Caulfield et al. 2019). Some scholars have shown that information embedded in emotionally charged narrative messaging has a higher potential for “virality,” enhances recall, and has a better chance of provoking action than nonemotional and nonnarrative messaging (Betsch et al. 2011; Zak 2015; Bail 2016). One potential avenue for combatting vaccine mis- and disinformation, then, as Bail (2016) suggests, is not fact-checking but the same sort of narrative storytelling, redirected to highlight the potentially “grave consequences” of refusing regular vaccination (see also Shelby and Ernst 2013). Others, such as Nyhan et al. (2014), have been more hesitant about the promise of narrative techniques, pointing to a significant backfire effect in which parents ultimately rejected messaging about the consequences of vaccine-preventable illness. As with many proposed solutions to misinformation, then, more research is needed to understand the conditions under which narrative messaging is, and is not, helpful.

Vaccines and the future of scientific misinformation

Antivaccine mis- and disinformation are pervasive issues that are unlikely to disappear anytime soon. Though scholars largely understand what sorts of false information are being spread, it remains unclear whom to hold responsible or how to prevent such information from continuing to circulate. Existing antivaccination discourse will present public health challenges as society continues to face pandemic crises that require large-scale vaccination, such as Covid-19 (Molteni 2020). The refusal of regular vaccinations by individuals and groups poses a massive public health risk with immediate consequences not only for those who do not receive vaccines, or cannot receive them due to underlying health conditions, but also for the population at large.

In the larger frame of scientific misinformation as a whole, the existing and emerging literature on antivaccination mis- and disinformation has provided an important example of the denigration of scientific knowledge and its consequences, opening a window into the spread and persistence of false beliefs even in the face of overwhelming evidence. Many of the unanswered questions surrounding vaccine misinformation also apply to other scientific issues, and to broader principles of mis- and disinformation as a whole.

Our grateful acknowledgement to Alexa Dietrich, Sara Gorman, and the researchers at Critica for their assistance during the editing process of this research review. 

[1] The CDC defines herd immunity as follows: “A situation in which a sufficient proportion of a population is immune to an infectious disease (through vaccination and/or prior illness) to make its spread from person to person unlikely. Even individuals not vaccinated (such as newborns and those with chronic illnesses) are offered some protection because the disease has little opportunity to spread within the community” (CDC 2019).

Works Cited

Amit Aharon, Anat, Haim Nehama, Shmuel Rishpon, and Orna Baron-Epel. 2017. “Parents with High Levels of Communicative and Critical Health Literacy Are Less Likely to Vaccinate Their Children.” Patient Education and Counseling 100 (4): 768–75. https://doi.org/10.1016/j.pec.2016.11.016.

Bail, Christopher A. 2016. “Emotional Feedback and the Viral Spread of Social Media Messages About Autism Spectrum Disorders.” American Journal of Public Health 106 (7): 1173–80. https://doi.org/10.2105/AJPH.2016.303181.

Ball, Philip. 2020. “Anti-Vaccine Movement Could Undermine Efforts to End Coronavirus Pandemic, Researchers Warn.” Nature, May. https://doi.org/10.1038/d41586-020-01423-4.

Baumgaertner, Bert, Juliet E. Carlisle, and Florian Justwan. 2018. “The Influence of Political Ideology and Trust on Willingness to Vaccinate.” PLoS ONE 13 (1). https://doi.org/10.1371/journal.pone.0191728.

Bean, Sandra J. 2011. “Emerging and Continuing Trends in Vaccine Opposition Website Content.” Vaccine 29 (10): 1874–80. https://doi.org/10.1016/j.vaccine.2011.01.003.

Belluz, Julia. 2018. “Research Fraud Catalyzed the Anti-Vaccination Movement. Let’s Not Repeat History.” Vox. February 27, 2018. https://www.vox.com/2018/2/27/17057990/andrew-wakefield-vaccines-autism-study.

Benkler, Yochai. 2019. “Cautionary Notes on Disinformation and the Origins of Distrust.” MediaWell, Social Science Research Council. October 22, 2019. https://mediawell.ssrc.org/expert-reflections/cautionary-notes-on-disinformation-benkler/.

Bennett, W. Lance, and Steven Livingston. 2018. “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions.” European Journal of Communication 33 (2): 122–39. https://doi.org/10.1177/0267323118760317.

Betsch, Cornelia, Corina Ulshöfer, Frank Renkewitz, and Tilmann Betsch. 2011. “The Influence of Narrative v. Statistical Information on Perceiving Vaccination Risks.” Medical Decision Making: An International Journal of the Society for Medical Decision Making 31 (5): 742–53. https://doi.org/10.1177/0272989X11400419.

Boseley, Sarah. 2018. “How Disgraced Anti-Vaxxer Andrew Wakefield Was Embraced by Trump’s America.” The Guardian, July 18, 2018. https://www.theguardian.com/society/2018/jul/18/how-disgraced-anti-vaxxer-andrew-wakefield-was-embraced-by-trumps-america.

Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84. https://doi.org/10.2105/AJPH.2018.304567.

Caulfield, Timothy, Alessandro R. Marcon, Blake Murdoch, Jasmine M. Brown, Sarah Tinker Perrault, Jonathan Jarry, Jeremy Snyder, et al. 2019. “Health Misinformation and the Power of Narrative Messaging in the Public Sphere.” Canadian Journal of Bioethics 2 (2): 52–60.

CDC. 2019. “Vaccine Glossary of Terms.” Centers for Disease Control and Prevention. August 13, 2019. https://www.cdc.gov/vaccines/terms/glossary.html.

Cooper, Sara, Cornelia Betsch, Evanson Z. Sambala, Nosicelo Mchiza, and Charles S. Wiysonge. 2018. “Vaccine Hesitancy – a Potential Threat to the Achievements of Vaccination Programmes in Africa.” Human Vaccines & Immunotherapeutics 14 (10): 2355–57. https://doi.org/10.1080/21645515.2018.1460987.

Davis, Mark. 2019. “‘Globalist War against Humanity Shifts into High Gear’: Online Anti-Vaccination Websites and ‘Anti-Public’ Discourse.” Public Understanding of Science 28 (3): 357–71. https://doi.org/10.1177/0963662518817187.

Deer, Brian. 2011. “How the Case against the MMR Vaccine Was Fixed.” BMJ 342 (January): c5347. https://doi.org/10.1136/bmj.c5347.

Edwards, Kathryn M., Jesse M. Hackell, and the Committee on Practice and Ambulatory Medicine and the Committee on Infectious Diseases. 2016. “Countering Vaccine Hesitancy.” Pediatrics 138 (3). https://doi.org/10.1542/peds.2016-2146.

Elsasser, Shaun W., and Riley E. Dunlap. 2013. “Leading Voices in the Denier Choir: Conservative Columnists’ Dismissal of Global Warming and Denigration of Climate Science.” American Behavioral Scientist 57 (6): 754–76. https://doi.org/10.1177/0002764212469800.

Feldscher, Karen. 2017. “Increase in Pertussis Outbreaks Linked with Vaccine Exemptions, Waning Immunity.” Harvard T.H. Chan School of Public Health. July 11, 2017. https://www.hsph.harvard.edu/news/features/increase-in-pertussis-outbreaks-linked-with-vaccine-exemptions-waning-immunity/.

Garza, Mariel. 2020. “Opinion: Trump Is Handing Anti-Vaxxers an Invitation to Smear Coronavirus Vaccines.” Los Angeles Times, May 15, 2020. https://www.latimes.com/opinion/story/2020-05-15/trump-anti-vaxxers-coronavirus-vaccines.

Gottlieb, Samantha D. 2016. “Vaccine Resistances Reconsidered: Vaccine Skeptics and the Jenny McCarthy Effect.” BioSocieties 11 (2): 152–74. https://doi.org/10.1057/biosoc.2015.30.

Grant, Lenny, Bernice L. Hausman, Margaret Cashion, Nicholas Lucchesi, Kelsey Patel, and Jonathan Roberts. 2015. “Vaccination Persuasion Online: A Qualitative Study of Two Provaccine and Two Vaccine-Skeptical Websites.” Journal of Medical Internet Research 17 (5): e133. https://doi.org/10.2196/jmir.4153.

Harmsen, Irene A., Gemma G. Doorman, Liesbeth Mollema, Robert AC Ruiter, Gerjo Kok, and Hester E. de Melker. 2013. “Parental Information-Seeking Behaviour in Childhood Vaccinations.” BMC Public Health 13 (1): 1219. https://doi.org/10.1186/1471-2458-13-1219.

Hopf, Henning, Alain Krief, Goverdhan Mehta, and Stephen A. Matlin. 2019. “Fake Science and the Knowledge Crisis: Ignorance Can Be Fatal.” Royal Society Open Science 6 (5): 190161. https://doi.org/10.1098/rsos.190161.

Hornsey, Matthew J., Matthew Finlayson, Gabrielle Chatwood, and Christopher T. Begeny. 2020. “Donald Trump and Vaccination: The Effect of Political Identity, Conspiracist Ideation and Presidential Tweets on Vaccine Hesitancy.” Journal of Experimental Social Psychology 88 (May): 103947. https://doi.org/10.1016/j.jesp.2019.103947.

Hutchby, Ian. 2001. “Technologies, Texts and Affordances.” Sociology 35 (2): 441–56. https://doi.org/10.1017/S0038038501000219.

Johnson, Neil F., Nicolas Velásquez, Nicholas Johnson Restrepo, Rhys Leahy, Nicholas Gabriel, Sara El Oud, Minzhang Zheng, Pedro Manrique, Stefan Wuchty, and Yonatan Lupu. 2020. “The Online Competition between Pro- and Anti-Vaccination Views.” Nature, May, 1–4. https://doi.org/10.1038/s41586-020-2281-1.

Karpf, David. 2019. “On Digital Disinformation and Democratic Myths.” MediaWell, Social Science Research Council. December 10, 2019. https://mediawell.ssrc.org/expert-reflections/on-digital-disinformation-and-democratic-myths/.

Kata, Anna. 2010. “A Postmodern Pandora’s Box: Anti-Vaccination Misinformation on the Internet.” Vaccine 28 (17): 1709–16. https://doi.org/10.1016/j.vaccine.2009.12.022.

———. 2012. “Anti-Vaccine Activists, Web 2.0, and the Postmodern Paradigm – An Overview of Tactics and Tropes Used Online by the Anti-Vaccination Movement.” Vaccine, Special Issue: The Role of Internet Use in Vaccination Decisions, 30 (25): 3778–89. https://doi.org/10.1016/j.vaccine.2011.11.112.

Kirk, Katherine. 2019. “How Russia Sows Confusion in the U.S. Vaccine Debate.” Foreign Policy (blog). Accessed August 2, 2019. https://foreignpolicy.com/2019/04/09/in-the-united-states-russian-trolls-are-peddling-measles-disinformation-on-twitter/.

Koltai, Kolina S., and Kenneth R. Fleischmann. 2017. “Questioning Science with Science: The Evolution of the Vaccine Safety Movement.” Proceedings of the Association for Information Science and Technology 54 (1): 232–40. https://doi.org/10.1002/pra2.2017.14505401026.

Larson, Heidi J., Caitlin Jarrett, Elisabeth Eckersberger, David M. D. Smith, and Pauline Paterson. 2014. “Understanding Vaccine Hesitancy around Vaccines and Vaccination from a Global Perspective: A Systematic Review of Published Literature, 2007–2012.” Vaccine 32 (19): 2150–59. https://doi.org/10.1016/j.vaccine.2014.01.081.

McCoy, Charles. 2018. “The Social Characteristics of Americans Opposed to Vaccination: Beliefs about Vaccine Safety versus Views of U.S. Vaccination Policy.” Critical Public Health, July, 1–12. https://doi.org/10.1080/09581596.2018.1501467.

Meppelink, Corine S., Edith G. Smit, Marieke L. Fransen, and Nicola Diviani. 2019. “‘I Was Right about Vaccination’: Confirmation Bias and Health Literacy in Online Health Information Seeking.” Journal of Health Communication 24 (2): 129–40. https://doi.org/10.1080/10810730.2019.1583701.

Molteni, Megan. 2020. “An Army of Volunteers Is Taking On Vaccine Disinformation Online.” Wired, June 15, 2020. https://www.wired.com/story/can-a-keyboard-crusade-stem-the-vaccine-infodemic/.

Moran, Meghan Bridgid, Melissa Lucas, Kristen Everhart, Ashley Morgan, and Erin Prickett. 2016. “What Makes Anti-Vaccine Websites Persuasive? A Content Analysis of Techniques Used by Anti-Vaccine Websites to Engender Anti-Vaccine Sentiment.” Journal of Communication in Healthcare 9 (3): 151–63. https://doi.org/10.1080/17538068.2016.1235531.

Navin, Mark. 2015. Values and Vaccine Refusal: Hard Questions in Ethics, Epistemology, and Health Care. New York: Routledge.

Nyhan, Brendan, Jason Reifler, Sean Richey, and Gary L. Freed. 2014. “Effective Messages in Vaccine Promotion: A Randomized Trial.” Pediatrics 133 (4): e835–42. https://doi.org/10.1542/peds.2013-2365.

Patel, Manisha, Adria D. Lee, Susan B. Redd, Nakia S. Clemmons, Rebecca J. McNall, Amanda C. Cohn, and Paul A. Gastañaduy. 2019. “Increase in Measles Cases — United States, January 1–April 26, 2019.” MMWR. Morbidity and Mortality Weekly Report 68. https://doi.org/10.15585/mmwr.mm6817e1.

Pennycook, Gordon, and David Rand. 2019. “Why Do People Fall for Fake News?” The New York Times, January 19, 2019. https://www.nytimes.com/2019/01/19/opinion/sunday/fake-news.html.

Phillips, Whitney. 2019. “The Toxins We Carry.” Columbia Journalism Review, 2019. https://www.cjr.org/special_report/truth-pollution-disinformation.php/.

Pluviano, Sara, Caroline Watt, Giovanni Ragazzini, and Sergio Della Sala. 2019. “Parents’ Beliefs in Misinformation about Vaccines Are Strengthened by Pro-Vaccine Campaigns.” Cognitive Processing 20 (3): 325–31. https://doi.org/10.1007/s10339-019-00919-w.

Porter, Ethan, Thomas J. Wood, and David Kirby. 2018. “Sex Trafficking, Russian Infiltration, Birth Certificates, and Pedophilia: A Survey Experiment Correcting Fake News.” Journal of Experimental Political Science 5 (2): 159–64. https://doi.org/10.1017/XPS.2017.32.

Rabinowitz, Mitchell, Lauren Latella, Chadly Stern, and John T. Jost. 2016. “Beliefs about Childhood Vaccination in the United States: Political Ideology, False Consensus, and the Illusion of Uniqueness.” PLoS ONE 11 (7). https://doi.org/10.1371/journal.pone.0158382.

Rao, T. S. Sathyanarayana, and Chittaranjan Andrade. 2011. “The MMR Vaccine and Autism: Sensation, Refutation, Retraction, and Fraud.” Indian Journal of Psychiatry 53 (2): 95–96. https://doi.org/10.4103/0019-5545.82529.

Ratzan, Scott C., Barry R. Bloom, Ayman El-Mohandes, Jonathan Fielding, Lawrence O. Gostin, James G. Hodge, Peter Hotez, et al. 2019. “The Salzburg Statement on Vaccination Acceptance.” Journal of Health Communication 0 (0): 1–3. https://doi.org/10.1080/10810730.2019.1622611.

Shelby, Ashley, and Karen Ernst. 2013. “Story and Science: How Providers and Parents Can Utilize Storytelling to Combat Anti-Vaccine Misinformation.” Human Vaccines & Immunotherapeutics 9 (8): 1795–1801. https://doi.org/10.4161/hv.24828.

Smith, Naomi, and Tim Graham. 2017. “Mapping the Anti-Vaccination Movement on Facebook.” Information, Communication & Society 22 (9): 1310–27. https://doi.org/10.1080/1369118X.2017.1418406.

Stöcker, Christian. 2020. “How Facebook and Google Accidentally Created a Perfect Ecosystem for Targeted Disinformation.” In Disinformation in Open Online Media, edited by Christian Grimme, Mike Preuss, Frank W. Takes, and Annie Waldherr, 129–49. Lecture Notes in Computer Science. Cham: Springer International Publishing. https://doi.org/10.1007/978-3-030-39627-5_11.

Tomeny, Theodore S., Christopher J. Vargo, and Sherine El-Toukhy. 2017. “Geographic and Demographic Correlates of Autism-Related Anti-Vaccine Beliefs on Twitter, 2009-15.” Social Science & Medicine 191 (October): 168–75. https://doi.org/10.1016/j.socscimed.2017.08.041.

United Nations. 2020. “During This Coronavirus Pandemic, ‘Fake News’ Is Putting Lives at Risk: UNESCO.” UN News. April 13, 2020. https://news.un.org/en/story/2020/04/1061592.

Wadman, Meredith. 2020. “Antivaccine Forces Gaining Online.” Science 368 (6492): 699. https://doi.org/10.1126/science.368.6492.699.

Wang, Eileen, Yelena Baras, and Alison M. Buttenheim. 2015. “‘Everybody Just Wants to Do What’s Best for Their Child’: Understanding How Pro-Vaccine Parents Can Support a Culture of Vaccine Hesitancy.” Vaccine 33 (48): 6703–9. https://doi.org/10.1016/j.vaccine.2015.10.090.

Wanless, Alicia, and Michael Berk. 2019. “The Audience Is the Amplifier: Participatory Propaganda.” ResearchGate. Accessed June 15, 2020. https://www.researchgate.net/publication/329281693_The_Audience_is_the_Amplifier_Participatory_Propaganda.

Ward, Jeremy K., Patrick Peretti-Watel, and Pierre Verger. 2016. “Vaccine Criticism on the Internet: Propositions for Future Research.” Human Vaccines & Immunotherapeutics 12 (7): 1924–29. https://doi.org/10.1080/21645515.2016.1146430.

Witteman, Holly O., and Brian J. Zikmund-Fisher. 2012. “The Defining Characteristics of Web 2.0 and Their Potential Influence in the Online Vaccination Debate.” Vaccine, Special Issue: The Role of Internet Use in Vaccination Decisions, 30 (25): 3734–40. https://doi.org/10.1016/j.vaccine.2011.12.039.

WHO. 2019. “Ten Health Issues WHO Will Tackle This Year.” World Health Organization. Accessed August 1, 2019. https://www.who.int/emergencies/ten-threats-to-global-health-in-2019.

WHO. n.d. “WHO | SAGE Working Group Dealing with Vaccine Hesitancy (March 2012 to November 2014).” World Health Organization. Accessed June 12, 2020. https://www.who.int/immunization/sage/sage_wg_vaccine_hesitancy_apr12/en/.

Zak, Paul J. 2015. “Why Inspiring Stories Make Us React: The Neuroscience of Narrative.” Cerebrum: The Dana Forum on Brain Science 2015 (February). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4445577/.