- The Covid-19 pandemic comes at a time when we were already grappling with information overload and pervasive misinformation.
- In a crisis, humans communicate in a process called collective sensemaking in order to understand uncertain and dynamic circumstances.
- Collective sensemaking is a vital process, but we can make mistakes—or the process can be manipulated and exploited.
The Covid-19 pandemic has been accompanied by a parallel “infodemic” (Rothkopf 2003; WHO 2020a), a term used by the World Health Organization (WHO) to describe the widespread sharing of false and misleading information about the novel coronavirus. Misleading information about the disease has been a problem in diverse societies around the globe. It has been blamed for fatal poisonings in Iran (Forrest 2020), racial hatred and violence against people of Asian descent (Kozlowska 2020), and the use of unproven and potentially dangerous drugs (Rogers et al. 2020). A video promoting a range of false claims and conspiracy theories about the disease, including an antivaccine message, spread widely (Alba 2020) across social media platforms and around the world. Those spreading misinformation include friends and relatives with the best intentions, opportunists with books and nutritional supplements to sell, and world leaders trying to consolidate political power.
This public health crisis hits us at a particularly challenging time, as we are already grappling with issues of information overload and pervasive misinformation due, in part, to increasing use of online information systems. This perfect storm of a global pandemic hitting a globally connected world may be unprecedented, but scientists have a long tradition of trying to understand how we—as individuals, groups, and societies—respond to collective-stress situations (Barton 1970).
What do we do when a crisis strikes? We communicate. We utilize our social networks to exchange information as we try to make sense of what is going on around us (Danzig et al. 1958; Erickson et al. 1978; Richardson et al. 1979; Pendleton 1998). We search for, disseminate, and synthesize the content we see into narratives that fit with our current understanding of the situation and within our larger worldviews. For example, during a hurricane watch, people in potentially affected areas may try to piece together information from local radio and television, combine and contrast that with their own experiences of the incoming storm as well as previous storms, and convene with their neighbors to share perspectives as they decide whether or not to evacuate. This process, called collective sensemaking, is critical for our decision-making, and in many cases allows us to relieve some of the anxiety and uncertainty we face in order to take action (Comfort et al. 2004; Shklovski et al. 2008).
But our information processing system isn’t perfect; we make mistakes. Sometimes our informal explanations get things wrong—rumors may turn out to be false and become misinformation. And at other times the collective-sensemaking process can be exploited by those who wish to purposefully mislead—for example, by seeding and spreading disinformation.
There is ongoing work to identify, distinguish, and define the different information toxicities contributing to the infodemic—and the broader “information disorder” (Wardle and Derakhshan 2017). Following Jack (2017), we define misinformation as information that is false, but not necessarily intentionally false, and disinformation as false or misleading information that is intentionally created and/or spread for a particular—e.g., financial or political—objective. (For more on the definitions of mis- and disinformation, please see the MediaWell research review Defining “Disinformation.”)
While the current public health crisis highlights the pressing problems of mis- and disinformation in our society, the tradition of studying these phenomena dates much earlier. In this review, we highlight and discuss existing research, both new and old, on the spread of misinformation during crisis events—and specifically public health crises. We trace the history and evolution of early work in this domain, foregrounding a behavioral perspective that focuses on the processes that generate and spread misinformation. Connecting that perspective to current challenges, we describe several distinct types of mis- and disinformation in the public health context and explain why, together, they represent a complex and critical problem. We end by situating these issues within understandings of the broader social and technical systems that shape how information spreads in our society.
Studies of rumoring behavior
- The field of mis- and disinformation studies traces back to earlier traditions of studying rumor.
- This use of the term “rumor” refers to information that is unverified, but not necessarily false.
- Information about major events can spread extremely quickly through informal sharing, or “rumoring.”
- A combination of factors makes societies susceptible to false information during crises.
The emerging field of mis- and disinformation studies draws from a variety of traditions, but perhaps the most influential has been the study of rumors—or rumoring, the underlying processes that generate them. Scientific inquiry into rumoring has spanned numerous fields and is often inherently interdisciplinary in nature (Pendleton 1998). In this review, we build from a tradition anchored in sociology and social psychology. This body of work has focused on the societal and collective-action aspects of rumoring rather than the individual factors that may contribute (Rosnow 1988). Indeed, key contributors to this field viewed rumor as “part and parcel of the efforts of men [and people] to come to terms with the exigencies of life” (Shibutani 1966, 62). In this view, rumor is a byproduct of the group problem-solving process—or collective sensemaking. To be clear, we use the term rumor to refer to information that is unverified at the time it is being discussed. This definition implies that rumors can turn out to be false—or they can turn out to be true.
Crisis events provide a research setting ripe for studying rumoring, as they bring together a number of contextual factors associated with an increased prevalence of rumors. Historically, scientific exploration of rumoring received notable attention within wartime contexts, where world leaders and scholars were concerned about the potential for manipulation and coercion (Knapp 1944; Caplow 1947). In the years since, this work has expanded to include a broad array of crisis events, such as natural disasters, pandemics, and acts of terrorism. Crisis events are accompanied by high levels of uncertainty and anxiety. Officials and authoritative sources may offer limited or untimely information. Traditional communication channels such as broadcast media may break down (Danzig et al. 1958). To fill these gaps, rumoring becomes a collective problem-solving technique—a way to “improvise news” in order to make sense of the unfolding situation and cope with accompanying uncertainties (Shibutani 1966).
In early work, researchers took two distinct methodological approaches. One was studying rumors already in circulation, focusing on the conditions that made their existence more likely (Festinger et al. 1948; Schachter and Burdick 1955; Kapferer 1989). Another was studying rumors experimentally by planting them in controlled situations (Anthony 1973; Walker and Beckerle 1987). In both cases, much early research focused on the overall prevalence of rumors as well as the social process that underlay them—e.g., how information was distorted through the transmission process. Taken together, this body of work points to a set of factors associated with rumor circulation, including the importance of the subject to the individuals concerned, the ambiguity of the evidence pertaining to the topic, and the relevance of the information for taking actions or changing behaviors (Allport and Postman 1947). Scholars also explored the role of an authoritative figure or opinion leader who lends credence to a rumor, as well as individual (such as gender and age) and group characteristics (like network homophily) that may shape common pathways of rumor transmission in a population (Koenig 1985).
One thing is evident in these studies: significant news events can diffuse extremely rapidly within an attentive public through this process of informal information sharing, or rumoring. Long before the internet became ubiquitous, researchers were already remarking upon the speed of information propagation through social networks (Richardson et al. 1979). For example, it is estimated that 68 percent of adults in the United States heard about the assassination of President John F. Kennedy within 30 minutes of its occurrence (Pendleton 1998).
Exacerbating this issue of rapid transmission, crisis contexts pair uncertainty with challenges in verifying information; this often results in limited ability to clarify facts or check sources. There is also a perceived—and often real—risk that not sharing information during crisis events could have consequences. This combination of factors makes societies and individuals vulnerable to false or misleading information. In the case of public health crises, misinformation can have life-or-death effects.
Conspiracy theories and disinformation
- Conspiracy theories add perceptions that a crisis is being intentionally caused or manipulated by powerful entities.
- Many conspiracy theories appear to arise organically from corrupted sensemaking processes.
- Conspiracy theories can be both the products and beneficiaries of disinformation campaigns.
- Conspiracy theories and disinformation campaigns can undermine trust in providers of information.
Rumors that turn out to be false are one contributor to misinformation during crisis events. But misinformation takes on other forms—including fake medical advice, elaborate conspiracy theories about underlying causes, and intentional disinformation campaigns that attempt to leverage the crisis for political gain. The latter two are exceptionally vexing during a crisis because they feed off and contribute to uncertainty and distrust of governments, journalists, and scientists (Sunstein and Vermeule 2009; Pomerantsev and Weiss 2014).
Conspiracy theories are salient in discourse surrounding many public health issues, from fluoridation of water to vaccines. Building upon alternative narratives about likely causes and effective treatments, conspiracy theories add a dimension of perceived intentionality, suggesting that the crisis is being manipulated or that information about it is being purposefully hidden by powerful entities for political or financial gain. One recent conspiracy theory claims that the symptoms associated with Covid-19 are actually caused by 5G technologies (rather than the SARS-CoV-2 coronavirus) and that powerful people are conspiring to hide this “fact” (Andrews 2020; Sorkin 2020). Research suggests that these unfounded theories can play a role in shaping health behaviors—for example, in decisions of whether or not to vaccinate (Jolley and Douglas 2014; Falade and Coultas 2017).
In the crisis context, many conspiracy theories appear to develop organically from a sort of corrupted sensemaking process (Sunstein and Vermeule 2009; Kou et al. 2017; Starbird et al. 2019). In this process, participants assemble evidence to fit previously held meta-theories—e.g., about a world where powerful people control global events, and where “mainstream” media, scientific experts, and government officials cannot be trusted. Conspiracy theories often build from compelling (and in some cases valid) criticisms of the intersections between power, politics, and the often competing interests of the public (Hofstadter 2008; Barkun 2003; Fenster 2008; Oliver and Wood 2014). For example, the US military operation that led to the assassination of Osama bin Laden utilized a fake vaccine program to identify where his family was living (Lenzer 2011). It is not difficult to see how operations like these can feed into conspiracy theorizing, for example, about the “true purpose” of vaccines.
Unlike rumors and misinformation, which can be unintentional, disinformation is false or misleading information that is produced and/or spread intentionally for a strategic objective (Jack 2017; Starbird et al. 2019). It can be productive to think of disinformation not as a single piece of content, but as a campaign (Starbird et al. 2019). History provides examples of conspiracy theories about public health crises being seeded or amplified by disinformation campaigns. For example, in the 1980s, Soviet intelligence operatives carried out an international campaign claiming that HIV/AIDS was a US bioweapon (Boghardt 2009). Similarly, in the Covid-19 crisis, we are witnessing efforts to frame the disease as a Chinese or US bioweapon, depending on the motivations of those spreading that disinformation narrative.
By spreading false information and fostering doubt and confusion, conspiracy theories and disinformation campaigns can undermine trust in information providers—a problem with potentially severe consequences in a public health crisis like the Covid-19 pandemic.
Public health crises and misinformation
- Pandemics often require major changes in behavior—like social distancing—and this can make the positive and negative aspects of collective sensemaking more obvious.
- Having information—even misinformation—can help soothe feelings of anxiety, fear, and uncertainty.
- Misinformation can delay or prevent implementation of effective public health measures.
One need only look at the outbreaks of Zika fever (Miller et al. 2017; Bode and Vraga 2018; Dredze et al. 2016), Ebola (Allgaier and Svalastog 2015; Jin et al. 2014; Fung et al. 2016), and measles (Kata 2010, 2012; Dubé et al. 2014) over the past decade to see the troubling role of misinformation, disinformation, and conspiracy theories in public health crises. Infectious disease pandemics bring collective sensemaking processes to the fore, in part because large-scale behavioral changes—such as adjusted cultural practices around death and mourning during Ebola or massive social distancing measures for Covid-19—are often necessary to address them.
During a public health crisis, people seek information to help them understand risks and make decisions on how to respond. It can be difficult to determine what information to trust or not trust, and emotions such as fear, anxiety, and uncertainty can mobilize people and shape their actions (van der Meer and Jin 2020), including how they search for information (Gui et al. 2017). Having information can help to soothe these feelings (Jin et al. 2016; Tan et al. 2015). Similarly, misinformation can be powerful during a crisis because it can reduce feelings of uncertainty and provide a (false) sense of safety and control (Crabtree and Masuda 2019). For example, misleading claims downplaying the risks of Covid-19 can provide people with a sense of security, encouraging them to return as quickly as possible to their normal routines. Misinformation can be particularly persuasive when it supports already-held beliefs (Dredze et al. 2016).
Unfortunately, misinformation during a public health crisis can prevent the adoption and use of evidence-based preventative measures and treatments and consequently worsen an epidemic (Tan et al. 2015). For example, misinformation spread during the 2014–2016 Ebola outbreak may have contributed to negative health outcomes by motivating attacks on health workers and blocking them from providing treatment (Allgaier and Svalastog 2015). We are already seeing the health effects of misinformation during the Covid-19 pandemic, such as incidents of people self-medicating by taking chloroquine, an unproven treatment (Mackey 2020), or drinking bleach (Bernard 2020).
Larson (2018) warned that the biggest pandemic risk would be viral misinformation. She wrote that the next major outbreak would be exacerbated by efforts to sow distrust in the vaccines developed for the pandemic. She described some of the most influential actors in the spread of misinformation about vaccines, including “people with medical credentials stoking overblown or unfounded fears,” people seeking financial gain, and people seizing a political opportunity. We can already see these three types of actors spreading misinformation about future vaccines for Covid-19.
Larson’s work reveals a larger underlying problem: a worldwide increase in vaccine hesitancy that can be tied to a growing “antivaccine” movement. This movement, which has largely taken shape within online communities, has been characterized by widespread misinformation about vaccinations, specifically the false link between the measles, mumps, and rubella (MMR) vaccine and autism, and the false belief that vaccines are ineffective in protecting against communicable diseases (Kata 2012; Poland and Jacobson 2001). Consequently, as resistance to childhood vaccination has grown, measles outbreaks have become more frequent across the world, including in the US, Samoa, and the Democratic Republic of the Congo (CDC 2020; Craig et al. 2020; WHO 2020b). The antivaccination movement is also salient in conversations about the development of a Covid-19 vaccine, especially among people who oppose social distancing measures (Bogel-Burroughs 2020).
The discourse promoting vaccine hesitancy is difficult to classify as simply rumor or disinformation or conspiracy theory—and indeed it has elements of all three. Communities of activists have coalesced around a set of antivaccine narratives and worked to gather evidence to support their beliefs, to recruit new members, and to spread their ideas (Kata 2010, 2012; Dubé et al. 2014). However, while some participants may be motivated by reputational or financial gain, much of this activity appears to be the work of sincere believers (Koltai and Fleischmann 2017; Wang et al. 2015; Gottlieb 2015).
Though the antivaccination movement has gone global, research on this phenomenon has lagged behind. For example, we can see that countries like South Africa and Brazil are experiencing growth in vaccine hesitancy due to fears of a link between autism and the MMR vaccine (Brown et al. 2018; Burnett et al. 2012; Fujita et al. 2018; Sato 2018). However, there have been very few studies of the antivaccination movement and related misinformation in specific cultural contexts (Cooper et al. 2018; de Menezes Succi 2017). As with other research in mis- and disinformation studies, the majority of work in this area has focused on the global North.
Misinformation, trust, and public health messaging
- Getting factual narratives out quickly is essential to prevent misinformation.
- Science moves more slowly than public demand for information.
- Experts recommend that officials acknowledge uncertainty, but officials may feel pressure to appear as if the situation is under control.
- Trust in institutions has been declining for decades in many societies.
During a public health crisis, scientists, physicians, communications professionals, and public health officials can play a critical role in informing the public and preventing the spread of misinformation (Pribble et al. 2010; Tirkkonen and Luoma-aho 2011; Walker 2016; SAMHSA 2019). In particular, public health officials can provide the most up-to-date, accurate health information during public health crises, which can be especially important for vulnerable populations (Vaughan and Tinker 2009). Quickly distributing factual narratives from health officials is essential to helping prevent misinformation (Bowen and Heath 2007; SAMHSA 2019).
Effectively communicating health information to the public, however, can be challenging—especially in the context of an emerging pandemic characterized by high levels of fear, anxiety, and uncertainty (Covello 2003). The persistent scientific uncertainty surrounding a disease like Covid-19 makes this particularly difficult, as the best information (and the scientific consensus) changes from day to day. The pace of the science, combined with the intensity of media coverage about that science, is proving especially challenging for public health communicators during the Covid-19 pandemic (Garrett 2020). Experts have recommended that public health officials acknowledge the uncertainty of the situation (SAMHSA 2019), but this can be difficult, as officials may feel pressure to appear as if the situation is under control. Another issue is the potential misalignment between the information that is being communicated and what the public is interested in knowing (Gui et al. 2017).
Perhaps the most critical challenge for communicating official information during public health crises is trust (Covello et al. 2001). In recent decades, many societies have experienced a loss of trust in the very institutions—such as government and media (Brenan 2019; Rainie et al. 2019)—that people rely on for information during these events. When public health officials are seen as less credible sources, people tend to turn to informal sources for health information (Jan and Baek 2019). Increasingly, those information seekers are going online, where new media and social media have disrupted how trust is formed and provided massive visibility to new kinds of influencers. In the context of Covid-19, this has given rise to a group of armchair epidemiologists who are difficult to distinguish from qualified scientists (Limaye et al. 2020). All of these factors can contribute to the spread of misinformation.
Online rumoring and misinformation during crisis events
- People turn to the internet to fill information gaps in crisis events.
- Building from early work, scholars argue that online rumors stem from collective efforts to reduce uncertainty during disruptive events.
- Most people’s behavior after a crisis is altruistic, but unscrupulous actors can exploit circumstances for financial or political gain; we see this online as well.
People are now going online during crisis events—including public health crises—to fill information gaps and resolve uncertainty (Sutton et al. 2008; Hughes et al. 2008; Jan and Baek 2019). Research on rumoring and misinformation is increasingly going online as well, partly following the action, as the internet affords rumor participation at a massive new scale, and partly seizing the opportunity to study human behavior, including rumoring during crisis events, through a new sort of data: the traces left behind on social media platforms (Palen and Anderson 2016).
In recent years, scholars have paid considerable attention to the study of online rumors and misinformation (e.g., Mendoza et al. 2010; Oh et al. 2013; Starbird et al. 2014; Andrews et al. 2016), conspiracy theorizing (e.g., Del Vicario et al. 2016; Starbird 2017; Samory and Mitra 2018), disinformation (e.g., Marwick and Lewis, 2017; Ong and Cabañes 2018; Starbird et al. 2019), false news (Vosoughi et al. 2018; Lazer et al. 2018), and other related phenomena. A large portion of this research has focused on techniques for automatic detection (e.g., Castillo et al. 2011; Qazvinian et al. 2011; Derczynski et al. 2015; Zhao et al. 2015; Shao et al. 2016; Zubiaga et al. 2018). But a parallel track of research seeks to better understand how and why rumors and misinformation spread.
This empirical and conceptual work has demonstrated a range of findings. For example, in terms of pure size, the vast majority of rumor cascades are small—though a few are very big (Vosoughi et al. 2018; Goel et al. 2016). Looking at underlying mechanisms, network structure shapes how rumors spread (Arif et al. 2016; Del Vicario et al. 2016). And exploring differences due to veracity, researchers have found that false rumors spread further and faster than true information (Vosoughi et al. 2018) and corrections (Starbird et al. 2014). Extending that last point, there is active debate about whether and how corrections work. Researchers disagree about the existence of a so-called backfire effect that proponents argue causes people to double down on false beliefs when corrected (Nyhan and Reifler 2010; Nyhan et al. 2014; Bode and Vraga 2015, 2018; Wood and Porter 2019; Ecker et al. 2020). Similarly, researchers continue to explore—and question—the role of “echo chambers” (Sunstein 2001; Jamieson and Cappella 2008) or “filter bubbles” (Pariser 2011) in the spread of misinformation online (e.g., Del Vicario et al. 2016; Bruns 2017; Guess et al. 2018). (For more on these concepts, see our research review on Contexts of Misinformation.)
Focusing specifically on the crisis context, researchers have examined the spread of rumors and misinformation during natural disasters (e.g., Mendoza et al. 2010; Oh et al. 2010; Gupta et al. 2013; Acar and Muraki 2011), industrial accidents (Zeng et al. 2017), mass shootings and acts of terrorism (Oh et al. 2013; Starbird et al. 2014; Starbird 2017), ethnic violence (Banaji and Bhat 2019), and public health crises (Kou et al. 2017; Oyeyemi et al. 2014; Chen et al. 2015).
The earliest studies of online rumoring (e.g., Bordia and Rosnow 1998; Bordia and DiFonzo 1999; Bordia, DiFonzo, and Chang 2004) built upon Shibutani’s (1966) conceptualization of rumoring as a form of group problem-solving (described in the first sections of this paper). Oh et al. (2010, 2013) applied this lens to crises, theorizing that online rumors stem from collective work by online communities to resolve uncertainty during disruptive events. This phenomenon gained widespread attention after the 2013 Boston Marathon bombings, when an online effort to identify the perpetrators notoriously pointed fingers at the wrong suspects (Madrigal 2013; Starbird et al. 2014). Rumors about the National Guard being deployed to “lock down” parts of the United States in response to Covid-19 (Lamothe 2020) suggest similar origins in sensemaking processes.
Online rumors also take the form of viral internet memes, such as fake or misattributed photos—for example, a photo of a young girl running that was falsely claimed to show a victim of the Boston Marathon bombings (Maddock et al. 2015a), and an oft-reused image of a shark falsely claimed to be swimming in hurricane floodwaters (Gupta et al. 2013). In the context of a public health crisis like Covid-19, chain-letter-style messages promoting fake remedies (Doherty 2020) have a similar meme-like quality.
The spread of medical misinformation has become a particularly salient problem online—both within social media platforms themselves and on the diverse websites that feed social media discourse. Public health crises can catalyze and call attention to this phenomenon. For example, during the 2010 Deepwater Horizon oil spill, as people converged online to voice concerns about health impacts, they encountered a scientifically complex information space where false theories about hidden dangers (e.g., “It’s raining dispersants”) emerged and spread (Starbird et al. 2015; Dailey and Starbird 2015). So-called alternative health information—which included false and misleading claims about treatments—also spread online during the Ebola outbreak in 2014–2016 (Oyeyemi et al. 2014; Fung et al. 2016). And in predominantly English-language threads on Reddit during the Zika outbreak, online sensemaking efforts produced a range of false conspiracy theories about, among other things, the disease’s origins (e.g., as a bioweapon in a lab), its severity (e.g., exaggerated by media), and its true cause (e.g., fertilizers from “big agriculture”) (Kou et al. 2017). Similarly, during the Covid-19 pandemic, several conspiracy theories spread online—at times at a massive scale—claiming, for example, that the virus was a “planned” event (Neuman 2020) and that 5G technology is the “real” cause of symptoms (Andrews 2020).
Most human behavior after a disaster is prosocial and altruistic (Fritz and Mathewson 1957). But just as in-person exploiters have been known to converge upon a crisis-affected community, online exploiters are now converging onto the digital scene of the crisis to take advantage of the situation—for example, by spreading disinformation for financial or political gain. On the financial side, there have been numerous cases of fake fundraising efforts after natural disaster events (e.g., Strickler 2010; Lehr 2011) and, in the public health context, online campaigns that set the stage for the sale of unproven remedies (e.g., Caulfield 2020; Paul 2020). Public health crises are also leveraged for political gain. For example, though the 2014–2016 Ebola outbreak primarily affected African countries, the virus was mobilized as a political frame to discuss domestic politics—e.g., to argue for border control—in the United States and United Kingdom (Abeysinghe 2016). Roy et al. (2020) describe how people used social media to identify figures to blame, focusing over time on political leaders in their own countries (e.g., national governments and “Obama”).
In recent years, we have seen more intentional and organized disinformation campaigns during crises. Between 2014 and 2017, “trolls” working inside Russia’s Internet Research Agency (Ru-IRA) took advantage of the convergence of attention during real-world crises—and even manufactured fake crisis events—as part of their disinformation campaigns. Interestingly, Ru-IRA trolls were also active in online conversations about vaccines (Broniatowski et al. 2018), and though their activities reflect tactics of sowing and amplifying confusion and division, the objectives of their vaccine-related engagement are not yet fully understood.
Methodological and ethical considerations
- Most misinformation studies focus on Twitter because its data are publicly available, but that creates gaps in knowledge about different demographics and global contexts.
- Similarly, most studies focus on text, leaving us with incomplete understandings of misinformation in videos and images.
- Misinformation studies raise troubling questions around privacy and consent which can be especially problematic in the context of digital content.
As research on the spread of misinformation during crisis events increasingly moves online, we are confronted by new methodological and ethical concerns.
In the crisis context, online platforms make activities that were previously very hard to capture newly legible for investigation (Palen and Anderson 2016). Researchers from diverse fields including computer science, sociology, psychology, media studies, social computing, and human-computer interaction have converged on these new crisis data. They have brought with them a wide range of methodologies, from quantitative analysis at scale (e.g., Del Vicario et al. 2016) to mixed-method studies that move back and forth from high-level to close-up views of the data (e.g., Andrews et al. 2016; Wilson et al. 2018). Experimental studies measuring the actual spread of misinformation during crisis events have proven difficult. However, researchers have effectively used survey experiments—for example, to explore the efficacy of corrections (Bode and Vraga 2018).
But new data bring new challenges (boyd and Crawford 2011; Tufekci 2014; Crawford and Finn 2015; Olteanu et al. 2019). Though online misinformation takes shape and spreads across many and diverse platforms, the vast majority of research in this space focuses on one platform, Twitter, due to the public availability of its data. Though Twitter is a relatively popular platform, other platforms with quite different types of affordances (e.g., Facebook) have far more users and interactions—which suggests that current research is missing large parts of the online misinformation phenomenon. This singular focus also means that we overlook whole demographics and sections of the global population where Twitter is not a primary means of communication. For example, there is evidence that WhatsApp facilitated the spread of misinformation that played a role in religious-based mob violence in India (Banaji and Bhat 2019). But, due to the private nature of communications on that platform, the data are not easily accessible, and there have been few (mostly interview-based) research studies.

In addition, boyd and Crawford (2011) write of a “data divide,” where most social media data are available—due to cost and platform restrictions—only to a select group of researchers. That divide can be especially problematic when it comes to trying to understand the role of the platforms themselves in facilitating (or dampening) the spread of misinformation. There are related concerns about the representativeness of the data that are accessible to researchers (boyd and Crawford 2011; Tromble et al. 2017), an issue of particular concern in the crisis context (Crawford and Finn 2015). Together, these data limitations make it difficult to draw comparisons across events, platforms, geographies, and time.
The methodologies brought to these data have limitations as well. Online misinformation takes a range of forms, from text to graphical memes to video. With a few notable exceptions (e.g., Gupta et al. 2013), the vast majority of misinformation studies have focused on textual content. We may need new techniques and approaches to better understand how false information spreads via images and videos.
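One building block for extending these techniques to images is perceptual hashing, which fingerprints an image so that near-duplicates (resized or recompressed copies of the same meme) map to identical or similar hashes, letting researchers trace an image's spread across posts. The sketch below shows the classic "average hash" under the assumption that the image has already been decoded into a 2D grid of grayscale intensities; real studies would rely on a library and more robust hash families.

```python
def average_hash(pixels, hash_size=8):
    """Compute a perceptual 'average hash' of a grayscale image.

    `pixels` is a 2D list of intensities (0-255). The image is
    down-sampled to hash_size x hash_size by nearest-neighbor sampling,
    then each cell becomes 1 if it is brighter than the mean.
    """
    h, w = len(pixels), len(pixels[0])
    # Nearest-neighbor down-sampling to a small grid.
    small = [
        [pixels[r * h // hash_size][c * w // hash_size]
         for c in range(hash_size)]
        for r in range(hash_size)
    ]
    mean = sum(sum(row) for row in small) / (hash_size * hash_size)
    return "".join("1" if v > mean else "0" for row in small for v in row)


def hamming(a, b):
    """Number of differing bits; near-duplicate images yield small distances."""
    return sum(x != y for x, y in zip(a, b))
```

Because the hash depends only on coarse brightness structure, a lightly recompressed copy of an image produces the same 64-bit fingerprint, so researchers can cluster reposts of a viral image without exact byte matching.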
There are ethical concerns as well. Much of the data used for online misinformation studies were created by people who were not aware that their activities would become part of research studies—raising troubling questions around privacy and consent (boyd and Crawford 2011; Crawford and Finn 2015; Olteanu et al. 2019). This can be especially problematic in the context of digital misinformation, as studies that reveal the identities (intentionally or accidentally) of people who spread misinformation may put those people at risk of reputational damage. Misinformation researchers have often navigated these issues by anonymizing and attempting to protect the identities of specific users (franzke et al. 2020), though exceptions are often made in cases of public individuals such as professional journalists, political figures, and government officials, as well as other highly visible accounts.
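Researchers rarely publish their exact pipelines, but a common building block of this anonymization practice can be sketched: replacing platform user IDs with salted, one-way hashes, so that repeat activity and network structure are preserved in the data while the mapping back to real accounts cannot be reversed without the secret salt. The function and field names below are hypothetical illustrations, not any specific study's method.

```python
import hashlib
import hmac
import secrets

# A per-project secret salt; in practice this would be stored securely
# and never released alongside the data. (Hypothetical setup.)
SALT = secrets.token_bytes(32)


def pseudonymize(user_id, salt=SALT):
    """Replace a platform user ID with a keyed one-way hash.

    The same ID always maps to the same pseudonym (preserving network
    structure and repeat activity), but the mapping cannot be reversed
    without the salt.
    """
    return hmac.new(salt, user_id.encode("utf-8"), hashlib.sha256).hexdigest()[:16]


def anonymize_posts(posts, public_accounts=frozenset()):
    """Pseudonymize author IDs in a list of post dicts, leaving designated
    public accounts (e.g., officials, news outlets) intact."""
    out = []
    for post in posts:
        post = dict(post)  # avoid mutating the caller's data
        if post["author"] not in public_accounts:
            post["author"] = pseudonymize(post["author"])
        out.append(post)
    return out
```

Note that hashing identifiers alone does not guarantee anonymity: verbatim quoted text can often be searched to re-identify an account, which is why researchers frequently also paraphrase quotes from private individuals.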
There are other dangers specific to conducting research on online misinformation. For example, researchers may amplify hoaxes and extreme messages, even if that is not their intention (Phillips 2018). And researchers themselves face potential negative effects, such as harassment from extremist groups (Gewin 2018) and mental health concerns.
The historical perspective on rumors, and the growing body of work that leverages digital data to understand how and why misinformation spreads through sociotechnical systems, have much to add to the scholarly conversation surrounding the Covid-19 infodemic, but there are notable gaps and pressing questions for the scholarly community.
We live in an increasingly complex and networked global information environment. The consequences of this new ecosystem are visible in the wake of disinformation campaigns attributed to the Russian government and the rise of networked propaganda across the globe. This is the backdrop for the current infodemic. We are already seeing the ways that alternative narratives seeded and amplified by disinformation campaigns become entangled with current public health communications and recommendations.
Effective crisis communications will depend on our ability to disentangle this information; collective sensemaking will be put to the test. We need to know more about how problematic information is taken up, given value, and acted upon; we need to know how to distinguish explicitly coordinated campaigns from emergent or resonant effects; we need to know how false claims can best be corrected, both for the author and for their audience; we need to understand how authentic information is sustained, exchanged, and applied; and we need to understand the cognitive, social, and technical facets of vulnerability.
At the same time, we have seen a nascent field taking shape around questions at the intersection of technology, democracy, and misinformation studies. Emerging work in this domain will be immediately relevant to the current public health crisis. The Covid-19 pandemic will provide opportunities for researchers to study how information campaigns—both good and bad—get started, take shape, and spread across populations. Bringing perspectives from the social and behavioral sciences to these questions is vital.
Our grateful acknowledgement to Robert Peckham and Monica Schoch-Spana for their feedback during the editing process for this research review.
Abeysinghe, Sudeepa. 2016. “Ebola at the Borders: Newspaper Representations and the Politics of Border Control.” Third World Quarterly 37 (3): 452–67. https://doi.org/10.1080/01436597.2015.1111753.
Acar, Adam, and Yuya Muraki. 2011. “Twitter for Crisis Communication: Lessons Learned from Japan’s Tsunami Disaster.” IJWBC 7 (July): 392–402. https://doi.org/10.1504/IJWBC.2011.041206.
Alba, Davey. 2020. “Virus Conspiracists Elevate a New Champion.” New York Times, May 9, 2020. https://www.nytimes.com/2020/05/09/technology/plandemic-judy-mikovitz-coronavirus-disinformation.html.
Allgaier, Joachim, and Anna Lydia Svalastog. 2015. “The Communication Aspects of the Ebola Virus Disease Outbreak in Western Africa – Do We Need to Counter One, Two, or Many Epidemics?” Croatian Medical Journal 56 (5): 496–99. https://doi.org/10.3325/cmj.2015.56.496.
Allport, Gordon W., and Leo Postman. 1947. The Psychology of Rumor. Henry Holt and Company.
Andrews, Cynthia, Elodie Fichet, Yuwei Ding, Emma S. Spiro, and Kate Starbird. 2016. “Keeping Up with the Tweet-Dashians: The Impact of ‘Official’ Accounts on Online Rumoring.” In CSCW ’16: Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 452–465. San Francisco, California, USA: Association for Computing Machinery. https://doi.org/10.1145/2818048.2819986.
Andrews, Travis. 2020. “Why Dangerous Conspiracy Theories about the Virus Spread so Fast — and How They Can Be Stopped.” Washington Post, May 1, 2020. https://www.washingtonpost.com/technology/2020/05/01/5g-conspiracy-theory-coronavirus-misinformation/.
Anthony, Susan. 1973. “Anxiety and Rumor.” Journal of Social Psychology 89 (1): 91–98. https://doi.org/10.1080/00224545.1973.9922572.
Arif, Ahmer, Kelley Shanahan, Fang-Ju Chou, Yoanna Dosouto, Kate Starbird, and Emma S. Spiro. 2016. “How Information Snowballs: Exploring the Role of Exposure in Online Rumor Propagation.” In CSCW ’16: Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, 466–477. San Francisco, California, USA: Association for Computing Machinery. https://doi.org/10.1145/2818048.2819964.
Banaji, Shakuntala, and Ram Bhat. 2019. “WhatsApp Vigilantes: An Exploration of Citizen Reception and Circulation of WhatsApp Misinformation Linked to Mob Violence in India.” Department of Media and Communications, London School of Economics. https://blogs.lse.ac.uk/medialse/2019/11/11/whatsapp-vigilantes-an-exploration-of-citizen-reception-and-circulation-of-whatsapp-misinformation-linked-to-mob-violence-in-india/.
Barkun, Michael. (2003) 2006. A Culture of Conspiracy: Apocalyptic Visions in Contemporary America. New Ed edition. Berkeley, CA: University of California Press.
Barton, Allen H. 1970. Communities in Disaster: A Sociological Analysis of Collective Stress Situations. Anchor.
Bernard. 2020. “A Man Drank a Bottle of Rubbing Alcohol for COVID-19.” Medpage Today, April 22, 2020. https://www.medpagetoday.com/infectiousdisease/covid19/86094.
Blair, Robert A., Benjamin S. Morse, and Lily L. Tsai. 2017. “Public Health and Public Trust: Survey Evidence from the Ebola Virus Disease Epidemic in Liberia.” Social Science & Medicine 172 (January): 89–97. https://doi.org/10.1016/j.socscimed.2016.11.016.
Bode, Leticia, and Emily K. Vraga. 2015. “In Related News, That Was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media.” Journal of Communication 65 (4): 619–38. https://doi.org/10.1111/jcom.12166.
———. 2018. “See Something, Say Something: Correction of Global Health Misinformation on Social Media.” Health Communication 33 (9): 1131–40. https://doi.org/10.1080/10410236.2017.1331312.
Bogel-Burroughs, Nicholas. 2020. “Antivaccination Activists Are Growing Force at Virus Protests.” New York Times, May 2, 2020. https://www.nytimes.com/2020/05/02/us/anti-vaxxers-coronavirus-protests.html.
Boghardt, Thomas. 2009. “Operation INFEKTION: Soviet Bloc Intelligence and Its AIDS Disinformation Campaign.” Studies in Intelligence 53 (4). https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol53no4/soviet-bloc-intelligence-and-its-aids.html.
Bordia, Prashant, and Nicholas DiFonzo. 2004. “Problem Solving in Social Interactions on the Internet: Rumor As Social Cognition.” Social Psychology Quarterly, June. https://doi.org/10.1177/019027250406700105.
Bordia, Prashant, Nicholas DiFonzo, and Artemis Chang. 1999. “Rumor as Group Problem Solving: Development Patterns in Informal Computer-Mediated Groups.” Small Group Research 30 (1). https://doi.org/10.1177/104649649903000102.
Bordia, Prashant, and Ralph L. Rosnow. 1998. “Rumor Rest Stops on the Information Highway: Transmission Patterns in a Computer-Mediated Rumor Chain.” Human Communication Research 25 (2): 163–79. https://doi.org/10.1111/j.1468-2958.1998.tb00441.x.
Bowen, Shannon A., and Robert L. Heath. 2007. “Narratives of the SARS Epidemic and Ethical Implications for Public Health Crises.” International Journal of Strategic Communication 1 (2): 73–91. https://doi.org/10.1080/15531180701298791.
boyd, danah, and Kate Crawford. 2011. “Six Provocations for Big Data.” Paper presented at “A Decade in Internet Time: Symposium on the Dynamics of Internet and Society,” Oxford Internet Institute, University of Oxford, UK. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431.
Brenan, Megan. 2019. “Americans’ Trust in Mass Media Edges Down to 41%.” Gallup.com (blog). September 26, 2019. https://news.gallup.com/poll/267047/americans-trust-mass-media-edges-down.aspx.
Broniatowski, David A., Amelia M. Jamison, SiHua Qi, Lulwah AlKulaib, Tao Chen, Adrian Benton, Sandra C. Quinn, and Mark Dredze. 2018. “Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate.” American Journal of Public Health 108 (10): 1378–84. https://doi.org/10.2105/AJPH.2018.304567.
Brown, Amy Louise, Marcelo Sperandio, Cecília P. Turssi, Rodrigo M. A. Leite, Victor Ferro Berton, Regina M. Succi, Heidi Larson, et al. 2018. “Vaccine Confidence and Hesitancy in Brazil.” Cadernos de Saúde Pública 34 (9). https://doi.org/10.1590/0102-311X00011618.
Bruns, Axel. 2017. “Echo Chamber? What Echo Chamber? Reviewing the Evidence.” Paper presented at the 6th Biennial Future of Journalism Conference (FOJ17), September 15. https://eprints.qut.edu.au/113937/.
Burnett, Rosemary J., Heidi J. Larson, Molelekeng H. Moloi, E. Avhashoni Tshatsinde, André Meheus, Pauline Paterson, and Guido François. 2012. “Addressing Public Questioning and Concerns about Vaccination in South Africa: A Guide for Healthcare Workers.” Vaccine 30 Suppl 3 (September): C72–78. https://doi.org/10.1016/j.vaccine.2012.03.037.
Bursztyn, Leonardo, Aakaash Rao, Christopher Roth, and David Yanagizawa-Drott. 2020. “Misinformation During a Pandemic.” BFI Working Paper no. 2020–44. Becker Friedman Institute.
Caplow, Theodore. 1947. “Rumors in War.” Social Forces 25 (3): 298–302. https://doi.org/10.1093/sf/25.3.298.
Castillo, Carlos, Marcelo Mendoza, and Barbara Poblete. 2011. “Information Credibility on Twitter.” In WWW ’11: Proceedings of the 20th International Conference on World Wide Web, 675–684. Hyderabad, India: Association for Computing Machinery. https://doi.org/10.1145/1963405.1963500.
Caulfield, Timothy. 2020. “Pseudoscience and COVID-19 — We’ve Had Enough Already.” Nature, April. https://doi.org/10.1038/d41586-020-01266-z.
CDC. 2020. “Measles Cases and Outbreaks.” Centers for Disease Control and Prevention. June 9, 2020. https://www.cdc.gov/measles/cases-outbreaks.html.
Centola, Damon. 2011. “An Experimental Study of Homophily in the Adoption of Health Behavior.” Science 334 (December): 1269–72. https://doi.org/10.1126/science.1207055.
Chen, Bin, Jueman Mandy Zhang, Zhenggang Jiang, Jian Shao, Tao Jiang, Zhengting Wang, Kui Liu, Siliang Tang, Hua Gu, and Jianmin Jiang. 2015. “Media and Public Reactions toward Vaccination during the ‘Hepatitis B Vaccine Crisis’ in China.” Vaccine 33 (15): 1780–85. https://doi.org/10.1016/j.vaccine.2015.02.046.
Christakis, Nicholas A., and James H. Fowler. 2010. “Social Network Sensors for Early Detection of Contagious Outbreaks.” PLOS ONE 5 (9): e12948. https://doi.org/10.1371/journal.pone.0012948.
Comfort, Louise K., Kilkon Ko, and Adam Zagorecki. 2004. “Coordination in Rapidly Evolving Disaster Response Systems: The Role of Information.” American Behavioral Scientist, July. https://doi.org/10.1177/0002764204268987.
Cooper, Sara, Cornelia Betsch, Evanson Z. Sambala, Nosicelo Mchiza, and Charles S. Wiysonge. 2018. “Vaccine Hesitancy – a Potential Threat to the Achievements of Vaccination Programmes in Africa.” Human Vaccines & Immunotherapeutics 14 (10): 2355–57. https://doi.org/10.1080/21645515.2018.1460987.
Covello, Vincent T. 2003. “Best Practices in Public Health Risk and Crisis Communication.” Journal of Health Communication 8 (sup1): 5–8. https://doi.org/10.1080/713851971.
Covello, Vincent T., Richard G. Peters, Joseph G. Wojtecki, and Richard C. Hyde. 2001. “Risk Communication, the West Nile Virus Epidemic, and Bioterrorism: Responding to the Communication Challenges Posed by the Intentional or Unintentional Release of a Pathogen in an Urban Setting.” Journal of Urban Health : Bulletin of the New York Academy of Medicine 78 (2): 382–91. https://doi.org/10.1093/jurban/78.2.382.
Crabtree, Alexis, and Jeffrey R. Masuda. 2019. “Naloxone Urban Legends and the Opioid Crisis: What Is the Role of Public Health?” BMC Public Health 19 (1): 670. https://doi.org/10.1186/s12889-019-7033-5.
Craig, Adam T., Anita E. Heywood, and Heather Worth. 2020. “Measles Epidemic in Samoa and Other Pacific Islands.” The Lancet. Infectious Diseases 20 (3): 273–75. https://doi.org/10.1016/S1473-3099(20)30053-0.
Crawford, Kate, and Megan Finn. 2015. “The Limits of Crisis Data: Analytical and Ethical Challenges of Using Social and Mobile Data to Understand Disasters.” GeoJournal 80 (4): 491–502. https://doi.org/10.1007/s10708-014-9597-z.
Dailey, Dharma, and Kate Starbird. 2015. “‘It’s Raining Dispersants’: Collective Sensemaking of Complex Information in Crisis Contexts.” In CSCW’15 Companion: Proceedings of the 18th ACM Conference Companion on Computer Supported Cooperative Work & Social Computing, 155–158. Vancouver, BC, Canada: Association for Computing Machinery. https://doi.org/10.1145/2685553.2698995.
Danzig, Elliot R., Paul W. Thayer, and Lila R. Galanter. 1958. The Effects of a Threatening Rumor on a Disaster-Stricken Community. Washington, DC: National Academy of Sciences. https://doi.org/10.17226/9552.
Succi, Regina Célia de Menezes. 2018. “Vaccine Refusal – What We Need to Know.” Jornal de Pediatria 94 (6): 574–81. https://doi.org/10.1016/j.jped.2018.01.008.
Del Vicario, Michela, Alessandro Bessi, Fabiana Zollo, Fabio Petroni, Antonio Scala, Guido Caldarelli, H. Eugene Stanley, and Walter Quattrociocchi. 2016. “The Spreading of Misinformation Online.” Proceedings of the National Academy of Sciences 113 (3): 554–59. https://doi.org/10.1073/pnas.1517441113.
Derczynski, Leon, Kalina Bontcheva, Michal Lukasik, Thierry Declerck, Arno Scharl, and Georgi Georgiev. 2015. “Pheme: Computing Veracity – the Fourth Challenge of Big Social Data.” In Proceedings of the Extended Semantic Web Conference EU Project Networking Session.
Doherty, Ben. 2020. “UN Warns of Deadly Effect of Covid-19 Misinformation in Pacific.” The Guardian, April 16, 2020. http://www.theguardian.com/world/2020/apr/17/un-warns-of-deadly-effect-of-covid-19-misinformation-in-pacific.
Dredze, Mark, David A. Broniatowski, and Karen M. Hilyard. 2016. “Zika Vaccine Misconceptions: A Social Media Analysis.” Vaccine 34 (30): 3441–42. https://doi.org/10.1016/j.vaccine.2016.05.008.
Dubé, Eve, Maryline Vivion, and Noni E. MacDonald. 2015. “Vaccine Hesitancy, Vaccine Refusal and the Anti-Vaccine Movement: Influence, Impact and Implications.” Expert Review of Vaccines 14 (1): 99–117. https://doi.org/10.1586/14760584.2015.964212.
Ecker, Ullrich, Stephan Lewandowsky, and Matthew Chadwick. 2020. Can Corrections Spread Misinformation to New Audiences? Testing for the Elusive Familiarity Backfire Effect. OSF Preprints. https://doi.org/10.31219/osf.io/et4p3.
Erickson, Bonnie H., T. A. Nosanchuk, Liviana Mostacci, and Christina Ford Dalrymple. 1978. “The Flow of Crisis Information as a Probe of Work Relations.” Canadian Journal of Sociology / Cahiers Canadiens de Sociologie 3 (1): 71–87. https://doi.org/10.2307/3339794.
Falade, Bankole A., and Clare J. Coultas. 2017. “Scientific and Non-Scientific Information in the Uptake of Health Information: The Case of Ebola.” South African Journal of Science 113 (7–8): 1–8. https://doi.org/10.17159/sajs.2017/20160359.
Fenster, Mark. (1999) 2008. Conspiracy Theories: Secrecy and Power in American Culture. University of Minnesota Press.
Festinger, Leon, Dorwin Cartwright, Kathleen Barber, Juliet Fleischl, Josephine Gottsdanker, Annette Keysen, and Gloria Leavitt. 1948. “A Study of a Rumor: Its Origin and Spread.” Human Relations 1: 464–85. https://doi.org/10.1177/001872674800100405.
Forrest, Adam. 2020. “700 Dead in Iran after Drinking Toxic Alcohol to ‘Cure Coronavirus.’” The Independent, April 28, 2020. https://www.independent.co.uk/news/world/middle-east/coronavirus-iran-deaths-toxic-methanol-alcohol-fake-news-rumours-a9487801.html.
Fothergill, Alice, and Lori A. Peek. 2004. “Poverty and Disasters in the United States: A Review of Recent Sociological Findings.” Natural Hazards 32 (1): 89–110. https://doi.org/10.1023/B:NHAZ.0000026792.76181.d9.
franzke, aline shakti, Anja Bechmann, Michael Zimmer, and Charles M. Ess. 2020. “Internet Research: Ethical Guidelines 3.0.” Association of Internet Researchers.
Fritz, Charles E., and John H. Mathewson. (1957) 2018. Convergence Behavior in Disasters; A Problem in Social Control. Franklin Classics Trade Press.
Fujita, Dennis Minoru, Felipe Scassi Salvador, Luiz Henrique da Silva Nali, and Expedito José de Albuquerque Luna. 2018. “Decreasing Vaccine Coverage Rates Lead to Increased Vulnerability to the Importation of Vaccine-Preventable Diseases in Brazil.” Journal of Travel Medicine 25 (1). https://doi.org/10.1093/jtm/tay100.
Fung, Isaac Chun-Hai, King-Wa Fu, Chung-Hong Chan, Benedict Shing Bun Chan, Chi-Ngai Cheung, Thomas Abraham, and Zion Tsz Ho Tse. 2016. “Social Media’s Initial Reaction to Information and Misinformation on Ebola, August 2014: Facts and Rumors.” Public Health Reports 131 (3): 461–73. https://doi.org/10.1177/003335491613100312.
Garrett, Laurie. 2020. “COVID-19: The Medium Is the Message.” The Lancet 395 (10228): 942–43. https://doi.org/10.1016/S0140-6736(20)30600-0.
Gewin, Virginia. 2018. “Real-Life Stories of Online Harassment — and How Scientists Got through It.” Nature 562 (7727): 449–50. https://doi.org/10.1038/d41586-018-07046-0.
Goel, Sharad, Ashton Anderson, Jake Hofman, and Duncan J. Watts. 2016. “The Structural Virality of Online Diffusion.” Management Science 62 (1): 180–96. https://doi.org/10.1287/mnsc.2015.2158.
Gostin, Lawrence O. 2014. “Global Polio Eradication: Espionage, Disinformation, and the Politics of Vaccination.” Milbank Quarterly 92 (3): 413–17. https://doi.org/10.1111/1468-0009.12065.
Gottlieb, Samantha D. 2015. “Vaccine Resistances Reconsidered: Vaccine Skeptics and the Jenny McCarthy Effect.” BioSocieties 11 (2): 152–74. https://doi.org/10.1057/biosoc.2015.30.
Guess, Andrew, Brendan Nyhan, and Jason Reifler. 2018. “Selective Exposure to Misinformation: Evidence from the Consumption of Fake News during the 2016 U.S. Presidential Campaign.”
Gui, Xinning, Yue Wang, Yubo Kou, Tera Leigh Reynolds, Yunan Chen, Qiaozhu Mei, and Kai Zheng. 2017. “Understanding the Patterns of Health Information Dissemination on Social Media during the Zika Outbreak.” AMIA Annual Symposium Proceedings. 2017: 820–29.
Gupta, Aditi, Hemank Lamba, Ponnurangam Kumaraguru, and Anupam Joshi. 2013. “Faking Sandy: Characterizing and Identifying Fake Images on Twitter during Hurricane Sandy.” In WWW ’13 Companion: Proceedings of the 22nd International Conference on World Wide Web, 729–36. https://doi.org/10.1145/2487788.2488033.
Hofstadter, Richard. (1964) 2008. The Paranoid Style in American Politics. Reprint edition. New York: Vintage.
Hughes, Amanda, Leysia Palen, Jeannette Sutton, Sophia Liu, and Sarah Vieweg. 2008. “Site-Seeing in Disaster: An Examination of On-Line Social Convergence.” April.
Jack, Caroline. 2017. Lexicon of Lies. Data & Society Research Institute. August 9, 2017. https://datasociety.net/library/lexicon-of-lies/.
Jamieson, Kathleen Hall, and Joseph N. Cappella. 2008. Echo Chamber: Rush Limbaugh and the Conservative Media Establishment. Oxford; New York: Oxford University Press.
Jang, Kyungeun, and Young Min Baek. 2019. “When Information from Public Health Officials Is Untrustworthy: The Use of Online News, Interpersonal Networks, and Social Media during the MERS Outbreak in South Korea.” Health Communication 34 (9): 991–98. https://doi.org/10.1080/10410236.2018.1449552.
Jin, Fang, Wei Wang, Liang Zhao, Edward Dougherty, Yang Cao, Chang-Tien Lu, and Naren Ramakrishnan. 2014. “Misinformation Propagation in the Age of Twitter.” Computer 47 (12): 90–94. https://doi.org/10.1109/MC.2014.361.
Jin, Yan, Julia Daisy Fraustino, and Brooke Fisher Liu. 2016. “The Scared, the Outraged, and the Anxious: How Crisis Emotions, Involvement, and Demographics Predict Publics’ Conative Coping.” International Journal of Strategic Communication 10 (4): 289–308. https://doi.org/10.1080/1553118X.2016.1160401.
Jolley, Daniel, and Karen M. Douglas. 2014. “The Effects of Anti-Vaccine Conspiracy Theories on Vaccination Intentions.” PloS One 9 (2): e89177. https://doi.org/10.1371/journal.pone.0089177.
Kangmennaang, Joseph, Lydia Osei, Frederick A. Armah, and Isaac Luginaah. 2016. “Genetically Modified Organisms and the Age of (Un) Reason? A Critical Examination of the Rhetoric in the GMO Public Policy Debates in Ghana.” In “Futures for Food,” special issue, Futures, 83 (October): 37–49. https://doi.org/10.1016/j.futures.2016.03.002.
Kapferer, J. N. 1989. “A Mass Poisoning Rumor in Europe.” Public Opinion Quarterly 53 (4): 467–81. https://doi.org/10.1086/269167.
Kata, Anna. 2010. “A Postmodern Pandora’s Box: Anti-Vaccination Misinformation on the Internet.” Vaccine 28 (7): 1709–16. https://doi.org/10.1016/j.vaccine.2009.12.022.
———. 2012. “Anti-Vaccine Activists, Web 2.0, and the Postmodern Paradigm–an Overview of Tactics and Tropes Used Online by the Anti-Vaccination Movement.” Vaccine 30 (25): 3778–89. https://doi.org/10.1016/j.vaccine.2011.11.112.
Knapp, Robert H. 1944. “A Psychology of Rumor.” Public Opinion Quarterly 8 (1): 22–37.
Koenig, Fred. 1985. Rumor in the Marketplace: The Social Psychology of Commercial Hearsay. Dover, MA: Praeger.
Koltai, Kolina S., and Kenneth R. Fleischmann. 2017. “Questioning Science with Science: The Evolution of the Vaccine Safety Movement.” Proceedings of the Association for Information Science and Technology 54 (1): 232–40. https://doi.org/10.1002/pra2.2017.14505401026.
Kou, Yubo, Xinning Gui, Yunan Chen, and Kathleen Pine. 2017. “Conspiracy Talk on Social Media: Collective Sensemaking during a Public Health Crisis.” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 61:1–61:21. https://doi.org/10.1145/3134696.
Kozlowska, Hanna. 2020. “How Anti-Chinese Sentiment Is Spreading on Social Media.” Quartz, March 25, 2020. https://qz.com/1823608/how-anti-china-sentiment-is-spreading-on-social-media/.
Lamothe, Dan. 2020. “U.S. Officials Combat Conspiracy Theories of Martial Law as the National Guard Assists in Coronavirus Response.” Washington Post, March 23, 2020. https://www.washingtonpost.com/.
Larson, Heidi J. 2018. “The Biggest Pandemic Risk? Viral Misinformation.” Nature 562 (7727): 309. https://doi.org/10.1038/d41586-018-07034-4.
Lazer, David M. J., Matthew A. Baum, Yochai Benkler, Adam J. Berinsky, Kelly M. Greenhill, Filippo Menczer, Miriam J. Metzger, et al. 2018. “The Science of Fake News.” Science 359 (6380): 1094–96.
Lehr, Jeff. 2011. “Judgment Obtained in Tornado Charity Fraud Case.” Joplin Globe, October 6, 2011. https://www.joplinglobe.com/news/local_news/judgment-obtained-in-tornado-charity-fraud-case/article_5c58fbc5-f7b5-5c63-a800-5de969c2c085.html.
Lenzer, Jeanne. 2011. “Fake Vaccine Campaign in Pakistan Could Threaten Plans to Eradicate Polio.” BMJ (Clinical Research Ed.) 343 (July): d4580. https://doi.org/10.1136/bmj.d4580.
Limaye, Rupali Jayant, Molly Sauer, Joseph Ali, Justin Bernstein, Brian Wahl, Anne Barnhill, and Alain Labrique. 2020. “Building Trust While Influencing Online COVID-19 Content in the Social Media World.” The Lancet Digital Health 2 (6): e277–78. https://doi.org/10.1016/S2589-7500(20)30084-4.
Liu, Brooke Fisher, and Sora Kim. 2011. “How Organizations Framed the 2009 H1N1 Pandemic via Social and Traditional Media: Implications for U.S. Health Communicators.” Public Relations Review 37 (3): 233–44. https://doi.org/10.1016/j.pubrev.2011.03.005.
Mackey, Robert. 2020. “After Trump Hyped Chloroquine as a Covid-19 Cure, a Man Died Trying to Self-Medicate With a Version of the Chemical Used in Fish Tanks.” The Intercept (blog). March 24, 2020. https://theintercept.com/2020/03/24/trump-hyped-chloroquine-cure-covid-19-man-arizona-took-died/.
Maddock, Jim, Kate Starbird, Haneen J. Al-Hassani, Daniel E. Sandoval, Mania Orand, and Robert M. Mason. 2015a. “Characterizing Online Rumoring Behavior Using Multi-Dimensional Signatures.” In CSCW ’15: Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, 228–241. Vancouver, BC, Canada: Association for Computing Machinery. https://doi.org/10.1145/2675133.2675280.
Maddock, Jim, Kate Starbird, and Robert Mason. 2015b. “Using Historical Twitter Data for Research: Ethical Challenges of Tweet Deletions.” In CSCW 2015: Workshop on Ethics for Studying Sociotechnical Systems in a Big Data World.
Madrigal, Alexis C. 2013. “#BostonBombing: The Anatomy of a Misinformation Disaster.” The Atlantic, April 19, 2013. https://www.theatlantic.com/technology/archive/2013/04/-bostonbombing-the-anatomy-of-a-misinformation-disaster/275155/.
Marwick, Alice E., and Becca Lewis. 2017. Media Manipulation and Disinformation Online. Data & Society Research Institute. May 15, 2017. https://datasociety.net/library/media-manipulation-and-disinfo-online/.
Meadows, Charles W., Cui Zhang Meadows, Lu Tang, and Wenlin Liu. 2019. “Unraveling Public Health Crises Across Stages: Understanding Twitter Emotions and Message Types During the California Measles Outbreak.” Communication Studies 70 (4): 453–69. https://doi.org/10.1080/10510974.2019.1582546.
Mendoza, Marcelo, Barbara Poblete, and Carlos Castillo. 2010. “Twitter Under Crisis: Can We Trust What We RT?” In SOMA ’10: Proceedings of the First Workshop on Social Media Analytics. https://doi.org/10.1145/1964858.1964869.
Miller, Michele, Tanvi Banerjee, Roopteja Muppalla, William Romine, and Amit Sheth. 2017. “What Are People Tweeting About Zika? An Exploratory Study Concerning Its Symptoms, Treatment, Transmission, and Prevention.” JMIR Public Health and Surveillance 3 (2): e38. https://doi.org/10.2196/publichealth.7157.
Neuman, Scott. 2020. “Seen ‘Plandemic’? We Take A Close Look At The Viral Conspiracy Video’s Claims.” NPR. May 8, 2020. https://www.npr.org/2020/05/08/852451652/seen-plandemic-we-take-a-close-look-at-the-viral-conspiracy-video-s-claims.
Nyhan, Brendan, and Jason Reifler. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32 (2): 303–30. https://doi.org/10.1007/s11109-010-9112-2.
Nyhan, Brendan, Jason Reifler, Sean Richey, and Gary L. Freed. 2014. “Effective Messages in Vaccine Promotion: A Randomized Trial.” Pediatrics 133 (4): e835–42. https://doi.org/10.1542/peds.2013-2365.
Oh, Onook, Manish Agrawal, and Raghav Rao. 2013. “Community Intelligence and Social Media Services: A Rumor Theoretic Analysis of Tweets During Social Crises.” Management Information Systems Quarterly 37 (2): 407–26.
Oh, Onook, Kyounghee Hazel Kwon, and H. Raghav Rao. 2010. “An Exploration of Social Media in Extreme Events: Rumor Theory and Twitter during the Haiti Earthquake 2010.” In ICIS 2010 Proceedings – Thirty First International Conference on Information Systems. https://asu.pure.elsevier.com/en/publications/an-exploration-of-social-media-in-extreme-events-rumor-theory-and.
Oliver, J. Eric, and Thomas J. Wood. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66. https://doi.org/10.1111/ajps.12084.
Olteanu, Alexandra, Carlos Castillo, Fernando Diaz, and Emre Kıcıman. 2019. “Social Data: Biases, Methodological Pitfalls, and Ethical Boundaries.” Frontiers in Big Data 2. https://doi.org/10.3389/fdata.2019.00013.
Ong, Jonathan, and Jason Vincent Cabañes. 2018. Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines. University of Massachusetts Amherst, January. https://doi.org/10.7275/2cq4-5396.
Oyeyemi, Sunday Oluwafemi, Elia Gabarron, and Rolf Wynn. 2014. “Ebola, Twitter, and Misinformation: A Dangerous Combination?” BMJ 349 (October). https://doi.org/10.1136/bmj.g6178.
Palen, Leysia, and Kenneth M. Anderson. 2016. “Crisis Informatics—New Data for Extraordinary Times.” Science 353 (6296): 224–25. https://doi.org/10.1126/science.aag2579.
Pariser, Eli. 2011. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin Books.
Paul, Kari. 2020. “YouTube Profits from Videos Promoting Unproven Covid-19 Treatments.” The Guardian, April 3, 2020. http://www.theguardian.com/technology/2020/apr/03/youtube-coronavirus-treatments-profit-misinformation.
Pendleton, Susan Coppess. 1998. “Rumor Research Revisited and Expanded.” Language & Communication 18 (1): 69–86. https://doi.org/10.1016/S0271-5309(97)00024-4.
Phillips, Whitney. 2018. The Oxygen of Amplification. Data & Society Research Institute. https://datasociety.net/library/oxygen-of-amplification/.
Poland, G. A., and R. M. Jacobson. 2001. “Understanding Those Who Do Not Understand: A Brief Review of the Anti-Vaccine Movement.” Vaccine 19 (17–19): 2440–45. https://doi.org/10.1016/s0264-410x(00)00469-2.
Pomerantsev, Peter, and Michael Weiss. 2014. “The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money.” Institute of Modern Russia. https://weaponizednarrative.asu.edu/library/menace-unreality-how-kremlin-weaponizes-information-culture-and-money.
Pribble, James M., Erika F. Fowler, Sonia V. Kamat, William M. Wilkerson, Kenneth M. Goldstein, and Stephen W. Hargarten. 2010. “Communicating Emerging Infectious Disease Outbreaks to the Public through Local Television News: Public Health Officials as Potential Spokespeople.” Disaster Medicine and Public Health Preparedness 4 (3): 220–25. https://doi.org/10.1001/dmp.2010.27.
Qazvinian, Vahed, Emily Rosengren, Dragomir R. Radev, and Qiaozhu Mei. 2011. “Rumor Has It: Identifying Misinformation in Microblogs.” In Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing, 1589–1599. Edinburgh, Scotland, UK: Association for Computational Linguistics. https://www.aclweb.org/anthology/D11-1147.
Rainie, Lee, Scott Keeter, and Andrew Perrin. 2019. “Distrust in America Viewed as Problem.” Pew Research Center – U.S. Politics & Policy (blog). July 22, 2019. https://www.people-press.org/2019/07/22/how-americans-see-problems-of-trust/.
Richardson, R. J., Bonnie H. Erickson, and T. A. Nosanchuk. 1979. “Community Size, Network Structure, and the Flow of Information.” Canadian Journal of Sociology / Cahiers Canadiens de Sociologie 4 (4): 379–92. https://doi.org/10.2307/3340260.
Rogers, Katie, Christine Hauser, Alan Yuhas, and Maggie Haberman. 2020. “Trump’s Suggestion That Disinfectants Could Be Used to Treat Coronavirus Prompts Aggressive Pushback.” New York Times, April 24, 2020. https://www.nytimes.com/2020/04/24/us/politics/trump-inject-disinfectant-bleach-coronavirus.html.
Rosnow, Ralph L. 1988. “Rumor as Communication: A Contextualist Approach.” Journal of Communication 38 (1): 12–28. https://doi.org/10.1111/j.1460-2466.1988.tb02033.x.
Rothkopf, David J. 2003. “When the Buzz Bites Back.” Washington Post, May 11, 2003. http://www1.udel.edu/globalagenda/2004/student/readings/infodemic.html.
Roy, Melissa, Nicolas Moreau, Cécile Rousseau, Arnaud Mercier, Andrew Wilson, and Laëtitia Atlani-Duault. 2020. “Ebola and Localized Blame on Social Media: Analysis of Twitter and Facebook Conversations During the 2014–2015 Ebola Epidemic.” Culture, Medicine, and Psychiatry 44 (1): 56–79. https://doi.org/10.1007/s11013-019-09635-8.
SAMHSA. 2019. “Communicating in a Crisis: Risk Communication Guidelines for Public Officials.” PEP19-01-01-005. Substance Abuse and Mental Health Services Administration. https://store.samhsa.gov/product/communicating-crisis-risk-communication-guidelines-public-officials/pep19-01-01-005.
Samory, Mattia, and Tanushree Mitra. 2018. “Conspiracies Online: User Discussions in a Conspiracy Community Following Dramatic Events.” In ICWSM 2018: Proceedings of the Twelfth International AAAI Conference on Web and Social Media. Association for the Advancement of Artificial Intelligence.
Sato, Ana Paula Sayuri. 2018. “What Is the Importance of Vaccine Hesitancy in the Drop of Vaccination Coverage in Brazil?” Revista de Saúde Pública 52 (November): 96. https://doi.org/10.11606/S1518-8787.2018052001199.
Schachter, Stanley, and Harvey Burdick. 1955. “A Field Experiment on Rumor Transmission and Distortion.” Journal of Abnormal and Social Psychology 50 (3): 363–71. https://doi.org/10.1037/h0044855.
Shao, Chengcheng, Giovanni Luca Ciampaglia, Alessandro Flammini, and Filippo Menczer. 2016. “Hoaxy: A Platform for Tracking Online Misinformation.” In WWW ’16 Companion: Proceedings of the 25th International Conference Companion on World Wide Web, 745–50. https://doi.org/10.1145/2872518.2890098.
Shibutani, Tamotsu. 1966. Improvised News: A Sociological Study of Rumor. Oxford, England: Bobbs-Merrill.
Shklovski, Irina, Leysia Palen, and Jeannette Sutton. 2008. “Finding Community through Information and Communication Technology in Disaster Response.” In CSCW ’08: Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, 127–136. San Diego: Association for Computing Machinery. https://doi.org/10.1145/1460563.1460584.
Sorkin, Amy Davidson. 2020. “The Dangerous Coronavirus Conspiracy Theories Targeting 5G Technology, Bill Gates, and a World of Fear.” New Yorker, April 24, 2020.
Starbird, Kate. 2017. “Examining the Alternative Media Ecosystem Through the Production of Alternative Narratives of Mass Shooting Events on Twitter.” In ICWSM 2017: Proceedings of the Eleventh International AAAI Conference on Web and Social Media. Association for the Advancement of Artificial Intelligence.
Starbird, Kate, Ahmer Arif, and Tom Wilson. 2019. “Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations.” Proceedings of the ACM on Human-Computer Interaction 127:1–127:26. https://doi.org/10.1145/3359229.
Starbird, Kate, Dharma Dailey, Ann Hayward Walker, Thomas M. Leschine, Robert Pavia, and Ann Bostrom. 2015. “Social Media, Public Participation, and the 2010 BP Deepwater Horizon Oil Spill.” Human and Ecological Risk Assessment: An International Journal 21 (3): 605–30. https://doi.org/10.1080/10807039.2014.947866.
Starbird, Kate, Jim Maddock, Mania Orand, Peg Achterman, and Robert M. Mason. 2014. “Rumors, False Flags, and Digital Vigilantes: Misinformation on Twitter after the 2013 Boston Marathon Bombing.” iConference 2014 Proceedings, iSchools, March. https://doi.org/10.9776/14308.
Strickler, Laura. 2010. “Watchdogs on Alert for Haiti Charity Fraud.” CBS News, January 13, 2010. https://www.cbsnews.com/news/watchdogs-on-alert-for-haiti-charity-fraud/.
Sunstein, Cass R. 2001. Echo Chambers: Bush v. Gore, Impeachment, and Beyond. Princeton University Press.
Sunstein, Cass R., and Adrian Vermeule. 2009. “Conspiracy Theories: Causes and Cures.” Journal of Political Philosophy 17 (2): 202–27. https://doi.org/10.1111/j.1467-9760.2008.00325.x.
Sutton, Jeannette. 2018. “Health Communication Trolls and Bots Versus Public Health Agencies’ Trusted Voices.” American Journal of Public Health 108 (10): 1281–82. https://doi.org/10.2105/AJPH.2018.304661.
Sutton, Jeannette N., Leysia Palen, and Irina Shklovski. 2008. “Backchannels on the Front Lines: Emergency Uses of Social Media in the 2007 Southern California Wildfires.” In Proceedings of the 5th International ISCRAM Conference.
Tan, Andy S. L., Chul-joo Lee, and Jiyoung Chae. 2015. “Exposure to Health (Mis)Information: Lagged Effects on Young Adults’ Health Behaviors and Potential Pathways.” Journal of Communication 65 (4): 674–98. https://doi.org/10.1111/jcom.12163.
Tierney, Kathleen. 2019. Disasters: A Sociological Approach. Polity.
Tirkkonen, Päivi, and Vilma Luoma-aho. 2011. “Online Authority Communication during an Epidemic: A Finnish Example.” Public Relations Review 37 (2): 172–74. https://doi.org/10.1016/j.pubrev.2011.01.004.
Tromble, Rebekah, Andreas Storz, and Daniela Stockmann. 2017. “We Don’t Know What We Don’t Know: When and How the Use of Twitter’s Public APIs Biases Scientific Inference.” Available at SSRN 3079927.
Tufekci, Zeynep. 2014. “Big Questions for Social Media Big Data: Representativeness, Validity and Other Methodological Pitfalls.” arXiv:1403.7400 [physics], April. http://arxiv.org/abs/1403.7400.
Van der Meer, Toni G. L. A., and Yan Jin. 2020. “Seeking Formula for Misinformation Treatment in Public Health Crises: The Effects of Corrective Information Type and Source.” Health Communication 35 (5): 560–75. https://doi.org/10.1080/10410236.2019.1573295.
Vaughan, Elaine, and Timothy Tinker. 2009. “Effective Health Risk Communication about Pandemic Influenza for Vulnerable Populations.” American Journal of Public Health 99 (S2): S324–32. https://doi.org/10.2105/AJPH.2009.162537.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359 (6380): 1146–51. https://doi.org/10.1126/science.aap9559.
Walker, Charles J., and Carol A. Beckerle. 1987. “The Effect of State Anxiety on Rumor Transmission.” Journal of Social Behavior & Personality 2 (3): 353–60.
Walker, Jay. 2016. “Civil Society’s Role in a Public Health Crisis.” Issues in Science and Technology, December 5, 2016. https://issues.org/civil-societys-role-in-a-public-health-crisis/.
Wang, Eileen, Yelena Baras, and Alison M. Buttenheim. 2015. “‘Everybody Just Wants to Do What’s Best for Their Child’: Understanding How Pro-vaccine Parents Can Support a Culture of Vaccine Hesitancy.” Vaccine 33 (48): 6703–9. https://doi.org/10.1016/j.vaccine.2015.10.090.
Wardle, Claire, and Hossein Derakhshan. 2017. “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making.” Council of Europe. https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.
WHO. 2020a. “Munich Security Conference.” February 15, 2020. https://www.who.int/dg/speeches/detail/munich-security-conference.
———. 2020b. “Deaths from Democratic Republic of the Congo Measles Outbreak Top 6000.” WHO | Regional Office for Africa (blog). Accessed June 18, 2020. https://www.afro.who.int/news/deaths-democratic-republic-congo-measles-outbreak-top-6000.
Wilson, Tom, Kaitlyn Zhou, and Kate Starbird. 2018. “Assembling Strategic Narratives: Information Operations as Collaborative Work within an Online Community.” Proceedings of the ACM on Human-Computer Interaction 183:1–26. https://doi.org/10.1145/3274452.
Wood, Thomas, and Ethan Porter. 2019. “The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence.” Political Behavior 41 (1): 135–63. https://doi.org/10.1007/s11109-018-9443-y.
Zeng, Jing, Chung-hong Chan, and King-wa Fu. 2017. “How Social Media Construct ‘Truth’ around Crisis Events: Weibo’s Rumor Management Strategies after the 2015 Tianjin Blasts.” Policy & Internet 9 (3): 297–320. https://doi.org/10.1002/poi3.155.
Zhao, Zhe, Paul Resnick, and Qiaozhu Mei. 2015. “Enquiring Minds: Early Detection of Rumors in Social Media from Enquiry Posts.” In WWW ’15: Proceedings of the 24th International Conference on World Wide Web, 1395–1405. Florence, Italy. https://doi.org/10.1145/2736277.2741637.
Zubiaga, Arkaitz, Ahmet Aker, Kalina Bontcheva, Maria Liakata, and Rob Procter. 2018. “Detection and Resolution of Rumours in Social Media: A Survey.” ACM Computing Surveys 51 (2): 32:1–32:36. https://doi.org/10.1145/3161603.