Abstract

In understanding and responding to the problem of misinformation during global health emergencies, health experts and organizations such as the WHO have relied on the concept of the “infodemic,” or the idea that there is such an overabundance of information that ascertaining trustworthy sources and reliable guidance is difficult. Is this the best way to understand the problem of misinformation, however? A large and multidisciplinary literature has argued that such an approach misses the important role of individual psychological factors and societal “mega-trends” such as hyperpolarization, structural shifts in the media, and public mistrust in elites. This article argues that such contributions are important in understanding the multifaceted problem of misinformation but may miss another, equally important component: the politics of emergency. Specifically, the prominent role of speculation during moments of emergency—the need to respond to “what ifs” rather than just “what is”—provides a conducive context for misinformation, facilitating its production and spread while also problematizing efforts to correct it. The article illuminates this relationship through a discourse analysis of prominent misinformation claims during the US responses to Ebola in 2014 and COVID-19 in 2020.

In 2020, the word “infodemic” made a comeback. Popularized through a Washington Post column by David Rothkopf (2003) in relation to SARS, the term blends “information” and “epidemic” to refer to a rapid, far-reaching, and overwhelming spread of information, both accurate and inaccurate. The term was revived in 2020 for another global health crisis, COVID-19, and by none other than the World Health Organization (WHO). The WHO (2020a) outlined the idea in a February 2 Situation Report: “the 2019-nCoV outbreak and response has been accompanied by a massive ‘infodemic’—an overabundance of information—some accurate and some not—that makes it hard for people to find trustworthy sources and reliable guidance when they need it.” While the wealth of information as a whole is noted, WHO Director-General Tedros Adhanom Ghebreyesus and the organization were particularly concerned with the role of mis- or disinformation, or the “trolls and conspiracy theorists that push misinformation and undermine the outbreak response” (Tedros, quoted in WHO 2020b). The WHO’s response was thus to try to tip the scales: to clamp down on misinformation by collaborating with social media companies and to counter it with greater access to accurate and verified knowledge from experts (WHO 2020b), or, in other words, less bad information and more good.

Is this accurate, however? Is it solely due to an “overabundance” of information—with too much bad information crowding out the good—that misinformation and disinformation proliferate and hinder our responses to global health crises? Such an approach leaves open the question of why misinformation is produced, spread, and believed, or, ultimately, what the problem of misinformation truly is: the underlying issues that lead to its manifestation and proliferation. This, ironically, is at odds with the very origins of the term “infodemic”: Gunther Eysenbach’s (2002) concept of “infodemiology,” which refers to “the study of the determinants and distribution of health information and misinformation.” If we are truly going to understand the “infodemic,” we must return to this task of analyzing the why. This paper thus asks: How do and should we understand the problem of misinformation, particularly as it appears during health emergencies?

Competing arguments from political science, sociology, psychology, and science and technology studies (STS) have challenged the WHO’s “infodemic” narrative, with its focus on the overabundance of information. Some have focused on psychological factors such as confirmation bias to demonstrate how individuals are predisposed to believing and spreading misinformation (Jerit and Zhao 2020; Ecker et al. 2022). Others have presented misinformation as a symptom of wider societal problems, including a lack of trust in authoritative knowledge (Zimmermann and Kohring 2020; Neblo and Wallace 2021) and shifts in policy and the economy (Iyengar and Massey 2019). Meanwhile, a more critical literature has pointed to the enduring politics of knowledge production and expertise, where misinformation is simply part of the information ecosystem (Jasanoff and Simmet 2017; Scheufele et al. 2021). More recent scholarship has drawn these ideas together, ultimately portraying misinformation as a multifaceted problem spanning individual, political, technological, and societal factors (Lewandowsky et al. 2017; Krause et al. 2022).

These contributions provide a significantly more comprehensive understanding of misinformation, but this paper argues that they still miss one important element: the politics of emergency. Such contributions tend to focus solely on moments of normality. In fact, one of the key arguments within this work is that there is a fundamental tension between the egalitarian ideals of democracy and the technocratic prioritization of experts (Fischer 1990; Nichols 2017). Yet this misses one of the core insights of the “infodemic” approach: Global health crises are not moments of normality. They are moments of exception: They dramatically shift and even rupture normal politics, permitting rules to be broken, norms to be rewritten, and new forms of order to emerge. We must therefore situate these arguments within the politics of emergency and consider what the latter may do to the former.

Specifically, I argue that exceptionalism—especially as it manifests in contemporary security policy—leads to an emphasis on speculation as a form of knowledge production. Threats like pandemics and outbreaks are not easily understood by a lay audience, yet they are surrounded by fears (both warranted and unwarranted) of catastrophic consequences. As research from critical security studies (CSS) has revealed, this results in a demand to imagine the unimaginable, or to focus on unlikely but catastrophic possibilities. This can occur prior to crises, through anticipatory practices of preparedness, but notably also during emergencies, as individuals seek to address uncertainty through the construction of speculative futures. Ultimately, speculation becomes central to knowledge production in moments of crisis, producing multiple futures and permitting individuals to dwell on the possibilities they fear most. These imagined futures are drawn into the present, blurring knowledge of both, and exposing the tenuous and deeply political nature of expertise. Such speculation is not inherently detrimental, however. It can encourage more self-reflexive and precautionary policy, as politicians and experts justify their choices, consider alternatives, and explore future consequences. Yet it can also be wielded (by bad-faith and well-meaning actors alike) to destabilize the “what is,” the certainties, and the very notion of knowledge itself. Such a space not only encourages the proliferation of misinformation and disinformation but also makes it more difficult to adequately address.

To make this argument, I first consider the multidisciplinary literature on misinformation, which portrays it as a multifaceted problem with individual, societal, technological, and political components. Yet, as I argue, this literature tends to focus on moments of “normal” democratic politics, which misses that—as recognized by the WHO’s “infodemic” approach—health emergencies are a unique context. Meanwhile, the politics of exceptionalism that appear during moments of crisis suggest that we should see more unification or adherence to authoritative experts, not less. In the second section of this paper, I outline what the politics of exceptionalism look like, drawing from work in CSS that emphasizes the role of speculation. Combining this with the previous arguments on misinformation, I argue here that the demand to illuminate the “what if” during crises undermines efforts to produce and respond to the “what is,” including through facilitating mis/disinformation. I then explore this reading of misinformation by examining two contemporary health emergencies within the same context: the US responses to Ebola and (the early response to) COVID-19. I focus on two inaccurate claims that became prominent amongst those contesting these responses: the claim that Ebola had mutated, or was likely to mutate, to become airborne, and the claim that hospitals and authorities were lying about COVID-19 deaths. I show that while the issues explored by the wider misinformation literature—overabundance, skepticism, polarization, and mistrust—are evident, these claims were ultimately premised on larger “what ifs” or imaginative scenarios about the outbreaks: the possibility of an American outbreak of Ebola and the prospect of an authoritarian denial of basic civil freedoms lasting beyond COVID-19.

The Problem of Misinformation

Within this article, I define misinformation as “information considered incorrect based on the best available evidence from relevant experts at the time” (Vraga and Bode 2020, 138). As with all definitions of misinformation, this is not without problems. The “best available evidence,” after all, may evolve over time, particularly in the case of a novel coronavirus. We must therefore always remain attentive to context, considering the state of evidence, who the relevant experts are, and the information environment in which this evidence is made available (Vraga and Bode 2020, 141). Yet while this definition may point to the elements in need of consideration, it does not itself explain the problem of misinformation. What are the underlying issues that lead to the manifestation and proliferation of misinformation, particularly as it appears in health crises?

Multiple arguments exist here, with interventions from disparate bodies of literature such as public health, political communication, political science, psychology, sociology, and STS. Together, this literature presents misinformation as a multifaceted, transnational, and deeply political problem: a problem both of inaccurate information itself and of its “ecosystem” (Krause et al. 2022), the context that facilitates its production, spread, and uptake. More specifically, this literature reveals that the problem of misinformation contains individual, societal, technological, and political components, all of which need addressing.

Scholarship from political psychology has outlined the role of cognitive bias in both predisposing people to believe misinformation and problematizing efforts to correct resulting misperceptions. More specifically, misinformation is understood as a problem of “motivated reasoning” (Kunda 1990), where individuals uncritically adopt evidence that conforms to their attitudes (confirmation bias) yet deny that which challenges their positions (disconfirmation bias) (see Jerit and Zhao 2020). The role of emotional arousal has also been identified as a key factor, with misinformation that elicits disgust, fear, and anger being more likely to be remembered and repeated (Ecker et al. 2022, 15). Some have also noted that incorrect information tends to “stick” even after correction, particularly when the misinformation provides a coherent, complete explanation and is frequently repeated (Lewandowsky et al. 2012; Walter and Tukachinsky 2020). In essence, misinformation can be more appealing, more persuasive, and easier to remember than accurate information, making it difficult to address on a purely psychological basis.

Individual psychological factors are not enough to produce a significant misinformation “problem,” however. The misinformation literature has also identified what Lewandowsky et al. (2017) refer to as “societal mega-trends,” or societal changes that encourage the wider proliferation and spread of misinformation. Growing polarization sits at the core of these trends, with research from the United States and Europe revealing a widening gap in beliefs between political parties and deepening animosity between groups (Iyengar and Westwood 2015; Reiljan 2020). Misinformation that favors a particular political party or its preferred policies thus disseminates more rapidly and—due to confirmation bias—is more easily accepted. Just as critically, accurate information that is perceived to align with “the other side” is either subject to extreme criticism or dismissed outright as “politically” motivated (Fischer 2019). Mis- and disinformation thus become less about “rational information sharing” and more about “identity expression” (Monsees 2023, 155). Indeed, in some cases, misinformation is purposefully produced (forming disinformation, or deliberately deceptive information) by those with political motives (Bennett and Livingston 2018; Fischer 2022).

This aligns with two other “mega-trends”: shifts in the media landscape and rising public mistrust in authorities, including experts. The former refers to the proliferation of ideologically informed news sources, both in traditional media and online, which has furthered political polarization and undermined the notion of “authoritative knowledge” (Iyengar and Massey 2019, 7656–7). Similarly, advertising has become increasingly central to modern media, with emotive—yet false—claims being particularly powerful in generating online traffic and revenue (Bakir and McStay 2018, 155). The latter trend is entwined with this: Research has shown rising public mistrust in knowledge authorities, with the media being particularly distrusted (Zimmermann and Kohring 2020) and scientific experts also facing hostility (Neblo and Wallace 2021). Some, including Lewandowsky et al. (2017), refer to these trends together as the rise of a “post-truth” society. Others eschew this label in favor of a wider shift toward populist politics (Fischer 2019, 2022).

Another “mega-trend” in need of consideration is technological change. Social media receives the bulk of the blame here, with scholars highlighting the lack of oversight, the growth in “bots” or automated accounts, and the use of algorithms that encourage those predisposed to misinformation to view more of it (Vosoughi et al. 2018; Gisondi et al. 2022). Search engines—namely, their capacity to direct users to particular results—are also noted as a concern (Noble 2018; Makhortykh et al. 2020). Such changes are compounded by declining public digital and scientific literacy. Publics are increasingly characterized as digitally illiterate—unable to verify sources, identify clickbait or memes, and independently find or check information (Guess et al. 2020)—and scientifically illiterate, placing too much emphasis on single studies and failing to appreciate the role of uncertainty and falsifiability in science (Parker et al. 2021, 7). This is only furthered by the decreasing number of journalists specializing in science communication and the commercialization of research, which lead to the mistranslation of research: overstating the significance of findings; misquoting statistics, visuals, or statements; and omitting crucial context and limitations (The Lancet Infectious Diseases 2020; West and Bergstrom 2021, 3).

Yet while these societal and technological changes are important in understanding the contemporary nature of misinformation, other scholarship has emphasized that misinformation is not necessarily a new problem (see Scheufele et al. 2021). In fact, misinformation has long been part of politics, as appeals to emotion, efforts to undermine the factual claims of opponents, and propaganda have played key roles in electoral campaigns and in maintaining political control (Jasanoff and Simmet 2017; Scheufele et al. 2021). Some have also highlighted how the production and spread of, and belief in, misinformation are entwined with individuals’ and communities’ histories and political contexts. Belief in vaccine misinformation, for instance, has been connected to histories of medical malpractice, controversies surrounding pharmaceutical companies, and even geopolitics (Larson 2020; Goldenberg 2021). Addressing misinformation thus requires being attentive to its position within the broader political landscape.

Moreover, as scholars of STS have long explored, knowledge is never free of politics and societal concerns, as it is “co-produced” with “social practices, identities, norms, conventions, discourses, instruments, and institutions” (Jasanoff 2004, 3; see also Latour and Woolgar 1979). Some have thus argued that the problem is less the “politicization of science” than the “scientization of politics,” where the values, judgments, and interests involved in knowledge production and the resulting decision-making are obscured by the claim that it is simply “evidence-based policy” (Eyal 2019). Debates around such policy thus replace the underlying political dispute with a “scientific” one, including through the use of misinformation (Jasanoff and Simmet 2017; Goldenberg 2021). It also leaves accurate knowledge open to the claim of being “biased” (and therefore dismissible) when its entwined political values can be identified (Fischer 2019), particularly when paired with the confirmation bias and polarization mentioned above.

Indeed, it is important to recognize that while there remain areas of debate and contestation within this body of scholarship,1 there is also significant overlap between these accounts. We should thus consider them complementary in outlining the problem of misinformation. Taken together, this work reveals that misinformation is not merely an issue of “bad information” crowding out the good, but also a reflection of the continuing politics of knowledge production and use, exacerbated by individual psychological factors, wider societal changes, and new technologies.

Yet there remains an important gap in this larger, multidisciplinary argument: the politics of emergency. Misinformation, of course, is a problem of both “normal politics” and moments of crisis, and we must always be wary of suggesting that emergencies are a complete disjuncture from the norm (see Neal 2009). Yet, as scholars of security have persuasively argued, emergencies do provide a unique political context, termed “the exception” (Huysmans 1998). When societies are threatened (as by a novel virus), they seek authoritative, decisive, and urgent action. This can empower sovereign figures, such as political leaders. It can permit the breaking of rules, laws, and norms to ensure the threat is managed. And, in a basic sense, it heightens uncertainty and anxiety, as the “possibility of violent death” becomes the motivating factor of political life (Huysmans 1998, 582). Some parts of the misinformation literature do recognize these political impacts of emergencies. Scholars have, for instance, noted the role of emergency in encouraging more rapid—and less rigorous—peer review, or even the absence of peer review entirely (Gazendam et al. 2020). Studies have also highlighted how uncertainty and anxiety during fast-moving crises can affect one’s openness to misinformation, with one study showing that anxious individuals were more open to both accurate information and misinformation (Freiling et al. 2023, 156). More notably, this recognition of emergency politics as a unique context also resides at the heart of the WHO’s “infodemic management.” Indeed, the “infodemic” concept specifically outlines how information surges during health crises, while “infodemic management” aims to address the unique challenge of ensuring the spread of accurate information during such emergencies (see Eysenbach 2020; WHO 2020b). Missing from this work, however, is an engagement with another key component of the contemporary politics of emergency: the role of speculation.

Imagining the Unimaginable

As part of their broader critical examination of security’s contemporary meaning and practice,2 scholars from CSS have noted that emergency politics is increasingly concerned with the management of uncertain futures. The exception is now less about the elimination of imminent threats than about the monitoring and preemption of “low probability, high consequence” events (see Aradau and van Munster 2011): the terrorist attack, the release of a biological agent, the nuclear explosion, the natural disaster, the global financial crash, or the worldwide pandemic. Such an understanding stretches emergency. It stretches it temporally, beyond the immediate moment of danger, to permeate everyday politics, where it operates at borders (Salter 2008), within policing (Nøkleberg 2022), through bureaucratic processes (Bigo 2002), and in courtrooms (de Goede and de Graaf 2013). More importantly for our purposes, it also stretches emergency into the “politics of possibility” (Amoore 2013), where the goal is not simply to identify the dangers that are coming but to prepare for the dangers that cannot be identified until it is too late. Emergency politics, in other words, becomes about speculation.

Speculation refers to a distinct change in how we conceptualize security, one that “invites one to anticipate what one does not yet know, to take into account doubtful hypotheses and simple suspicions” (Ewald 2002, 288). This is more than just considering risks, where technologies of insurance, network analysis, disaster planning, and others are employed to calculate danger or, essentially, to make certainty out of uncertainty. Speculation, in contrast, embraces uncertainty. It does not seek to “get[. . .] the future right,” but to “imagine or map out as many possible futures as could plausibly be imagined” (Grusin 2004, 28). Speculation focuses on what could be, what might be, or the multiple what if scenarios. This aligns with Beck’s (1992) notion of the “risk society,” in which risks have become so uncontrollable and incalculable that prediction is impossible, but it also exceeds that notion. Beck’s emphasis is on objective, given risks and their modern incalculability, an emphasis that pays little attention to how risks are constructed, or “rendered knowledgeable and thinkable” (Aradau and van Munster 2007, 96). Speculation, however, draws attention to this process of making risk “thinkable.” It highlights how risks can be “simultaneously incalculable and demanding new methodologies of calculation and imagination” (de Goede 2008, 160). Yet speculation also renders risk ambiguous, revealing the multiplicity of possible understandings (Best 2008, 361). Importantly, speculation is not only about the future but also about the present: It enables action and influences current understandings by creating, visualizing, and drawing upon imagined futures (de Goede 2008, 159). The suggestion of what if thus influences the what is.

This speculation can take various forms.3 It can operate as an anticipatory security practice prior to an emergency, as articulated within practices such as “preparedness” (Collier and Lakoff 2008) or “preemption” (Amoore 2013). This typically involves formal forecasting techniques, such as the algorithmic modeling, risk profiling, and data analysis frequently conducted by government security agencies (Amoore and de Goede 2008; Amoore 2013). But it also includes more imaginative techniques. Post-9/11 security practice, including the war on terror, sought to rectify the “failure of imagination” blamed for the attacks, particularly through an expansion of imaginative scenario planning (9/11 Commission 2004, 336; see Aradau and van Munster 2011). To return to health emergencies, the WHO added “Disease X”—“an unknown pathogen that could cause a serious international epidemic”—to its list of priority pathogens in the R&D Blueprint, a guide for the global health research community “on where to focus their energies to manage the next threat” (Dr. Swaminathan, quoted in WHO 2022). As Bougen and O’Malley (2008) argue, this is effectively the “bureaucratization” of imagination, translating uncertainty into risk-based security initiatives.

Yet speculation also operates within emergencies. Emergencies, after all, are deeply ambiguous: There is often significant uncertainty around the danger and the best response, and even when uncertainty is answered with information, there remains “the challenge of interpreting, debating, and communicating that information,” all within a context requiring rapid decisions (Best 2008, 361). Health emergencies only add to this ambiguity, as threats from novel and/or scientifically complex pathogens require translation by (occasionally contested) experts for the layperson. Moreover, as scholarship on exceptionalism demonstrates, such moments operate as “passages to the limit,” where fears of a possible violent death create a “radical openness” in which new truths can be created and previously unthinkable decisions can be made (Huysmans 1998, 581). In such a situation, “imagination acquires epistemic primacy” (Aradau and van Munster 2011, 85). As this suggests, this form of speculation goes beyond the security and policy establishment. Grusin (2004, 41), for instance, highlights the role of the media in “premediation,” or going beyond just “reporting what has happened” to “premediat[e] what may happen next.” Specifically, he outlines how the post-9/11 US news media landscape shifted to devote significant time and attention to multiple possible future scenarios, including further anthrax cases, terrorist attacks, and the forms the Iraq War might take (Grusin 2004, 41–6). These imaginative scenarios operated within the uncertainty and ambiguity prior to the advent of the Iraq War, producing possible futures to capture “the fullest extent of the national security threat [. . .] before it ever happened, or even if it never did” (Grusin 2004, 41–2).

It is this second form of speculation that is particularly pertinent to our understanding of misinformation’s production, spread, and acceptance. In a situation of danger, we increasingly look to the what ifs—the unlikely but devastating scenarios—to understand both the threat itself and what we should do about it. This can produce misinformation in and of itself, as speculated scenarios for the future are reiterated as facts in the present. Speculation also creates a conducive context for misinformation, providing numerous “affectual reasons” to accept and share it (Monsees 2023, 154). People seek to confirm their speculation, leading to less critical engagement with aligned misinformation (confirmation bias). Communities speculating together will share misinformation simply to reaffirm their identity as part of the community (whether a partisan identity or something broader). More fundamentally, the underlying logic of danger within emergency-driven speculation lowers the threshold for legitimacy: Even if you doubt a piece of misinformation, can you risk ignoring it entirely if the possible result is catastrophic harm?4

Emergency speculation thus does not explain misinformation in and of itself—none of the arguments outlined above do—but it provides another important piece of the puzzle. In addition, focusing on speculation during emergencies brings an important normative shift to the discussion of misinformation. Much of the literature has treated the production, spread, and acceptance of misinformation as irrational: a marker of one’s incapacity to identify “good” information, an embrace of hyperpolarized “post-truth politics,” or a failure to address one’s own confirmation bias, for instance. Speculation, however, is ubiquitous, particularly during moments of crisis. It reflects our desire to prepare for uncertain futures and to minimize our risks. When legitimized, we call this “prevention” or “preparedness.” When contested, we label it “conspiracy.” By considering misinformation as entwined with speculation, we can thus observe it as an effort to understand and act in moments of fear and uncertainty.

Two Cases of Misinformation: Ebola and COVID-19

To consider the prominent role of emergency-driven speculation in misinformation, we turn to two separate health emergencies: the 2013–2016 outbreak of Ebola in Guinea, Liberia, and Sierra Leone and the COVID-19 pandemic, which commenced in 2020 and remains ongoing at the time of writing. Both cases saw the significant spread of misinformation, including clear cases of misinformation articulated by key political figures and prominent media sources. Specifically, I consider here the American context for each emergency. Keeping to the US context allows us to limit variation in factors such as national context, media sources, and political systems. We can thus consider how other, case-specific forms of difference—the fact that the chosen diseases had very different epidemiological profiles and geographic spread, as well as the responses falling under different administrations—may or may not affect speculation and misinformation.

Here, I focus on one prominent piece of misinformation within each case—Ebola as an airborne disease and COVID-19 deaths and hospitalizations as lower than official numbers suggested—and illuminate how these claims were ultimately connected to wider speculative (“what if”) narratives, namely, speculation regarding an American outbreak of Ebola and speculation that COVID-19 responses were more damaging than the pandemic itself. This analysis is part of a wider discourse analysis of the US Ebola response (August to November 2014) and the early COVID-19 response (March to August 2020), where I analyzed the speeches, press conferences, and statements of the respective White Houses (Obama during Ebola; Trump during COVID-19 in 2020), transcripts from Congressional hearings, and Fox News’ media coverage. From this analysis, I focused on explicit claims of misinformation, speculation concerning these ideas, and larger speculative narratives concerning the outbreak or pandemic. I coded these manually through NVivo, both to note the frequency of the representations and to highlight the linkages between the three.

Of course, misinformation is not confined to politicians’ statements and the media. I thus supplemented this analysis with an analysis of posts from the social media site X (formerly Twitter) concerning a “high point” of each claim. For the Ebola case, I focus on October 2014, when the first traveler from West Africa was diagnosed with Ebola in the United States. For COVID-19, I focus on April 2020, when protests began to form around COVID-19 restrictions and the Trump administration introduced the “Opening Up America Again” guidelines. To obtain these posts, I used X’s advanced search function to identify relevant posts from these time periods. For the Ebola case, I used the search terms “Ebola” and “airborne,” while for COVID-19, I focused on a hashtag campaign that sought to “prove” that experts were lying about COVID-19 figures: #FilmYourHospital.5 Posts were taken from the “Top” tab of results, with this search conducted multiple times to cover the respective month. As a result, 517 posts concerning the Ebola case and 563 posts discussing COVID-19 were collected and analyzed. Analysis again proceeded through manual coding, recorded in a spreadsheet. Such an analysis, of course, is inevitably partial and leaves open questions of randomization, but the goal here is not to provide a comprehensive and decisive examination of misinformation. Rather, it is to obtain a broad enough slice of these interactions so as to glimpse the relationship these inaccurate claims have with speculation.
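For readers interested in how the counts and percentages reported below could be tallied, the following is a minimal, illustrative sketch in Python. It is not the workflow used in this study; the file name and the column names (“case” and “code”) are hypothetical, standing in for a spreadsheet in which each row records one post and the code it was assigned.

import csv
from collections import Counter

def tally_codes(path):
    """Count how often each code appears within each case."""
    counts = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Hypothetical columns: "case" (e.g., Ebola, COVID-19) and "code"
            # (e.g., misinformation, speculation).
            counts.setdefault(row["case"], Counter())[row["code"]] += 1
    return counts

if __name__ == "__main__":
    for case, codes in tally_codes("coded_posts.csv").items():
        total = sum(codes.values())
        for code, n in codes.most_common():
            print(f"{case}: {code} = {n} ({n / total:.0%} of {total} posts)")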

Ebola

It is little surprise that the Ebola outbreak has seen comparatively little scholarly analysis of misinformation when considered against COVID-19. Moreover, much of the existing analysis focuses on the role of misinformation in West Africa, particularly misinformation regarding the disease’s source and treatments (see Oyeyemi et al. 2014; Chandler et al. 2015). This is important, but it misses how misinformation also plagued the response of one of the largest state responders: the United States. Within the few analyses that have considered the US context, one particular claim is highlighted: the suggestion that Ebola was an “airborne” disease, similar to influenza (Crook et al. 2016, 352). Even within this claim, divergence existed. For some, Ebola was always an airborne disease, with suggestions otherwise reflecting either dishonesty or scientific incompetence. For others, Ebola had mutated to become airborne during the course of the outbreak. Ebola, of course, was not an airborne disease; it spread through bodily fluids. Moreover, the prospect of it mutating to become airborne was extremely unlikely, as this would necessitate extensive mutations amounting to “a genetic leap in the realm of science fiction” (Garrett 2014).

Outright misinformation was relatively rare in the statements of the Obama administration and even on Fox News. One notable exception was that of George Will, who insisted multiple times on Fox News Sunday that “there are now doctors who are saying, we’re not so sure that it can’t be in some instances transmitted by airborne” (Fox News 2014e). What was more common in the media coverage and political debate, however, was speculation. The possibility that Ebola could mutate to become more deadly—or even completely shift its form of transmission to become airborne—was a commonly feared prospect. It was discussed 40 times within the Fox News coverage during this period and 20 times during Congressional hearings. Experts such as Centers for Disease Control and Prevention (CDC) Director Thomas Frieden and National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci were asked about the likelihood of such a mutation (CEC 2014, 138; Fox News 2014b). Also notable here was the reiteration of speculative claims from particular experts such as Dr. Michael Osterholm and Dr. David Sanders, both of whom raised the possibility of Ebola mutating to become airborne—Osterholm (2014) in a New York Times op-ed and in a CNN article on the “nightmare that could happen” (Cohen 2014), and Sanders in two separate appearances on the Fox News program Hannity (Fox News 2014c,g). Importantly, speculation of mutation also appeared in the speeches and statements of the Obama administration. In fact, Obama’s initial choice to send the US military to aid the response in West Africa was framed as necessary due to the risk of mutation: “there is the prospect that the virus mutates, it becomes more easily transmittable and then it could be a serious danger to the United States” (NBC 2014). This representation of mutation as a possible risk in need of immediate response was thus shared across partisan lines and between political and media actors, forming a discourse of threat. As argued by CSS scholars, the possible future threat was brought into the present as justification for action.

Within the social media analysis, misinformation was far more prominent (193 posts), but a striking finding here is its lack of dominance within the results. While misinformation formed the highest percentage of posts collected (37 percent), speculative posts—those that either questioned whether Ebola would become airborne or reiterated the speculation of media, political, or expert figures—constituted more than a quarter of the data (28 percent). The risk of possible mutation (“if Ebola goes airborne”) was the most common claim here. Some posts (nine in all) explicitly connected the two, taking quotes that suggested Ebola might mutate and insisting that it already had. Also notable is the number of misinformation posts that drew directly on expert speculation. Acknowledgments by the CDC, the WHO, and the head of the UN Mission for Ebola Emergency Response, Anthony Banbury (in Rushton 2014), that mutation was possible (but extremely unlikely to result in airborne transmission) were reframed as an “admission” of airborne Ebola (thirty-one posts).

Evident from this already is a blurring of the boundaries between speculation and misinformation, where speculation either explicitly became misinformation or encouraged its spread. The former is easily apparent in the social media analysis, namely, the use of expert speculation to “admit” airborne Ebola. It was also apparent in Fox News coverage, where commentators erroneously suggested that “what the government has been saying about needing direct contact with bodily fluids is not exactly accurate” while using speculative evidence (“at some point, the blood droplets could then become airborne”) (Sean Hannity, in Fox News 2014a). The latter, however, requires tracing the discourse, namely, how speculative representations were linked to inaccurate ones in a larger “structure of signification” to produce meaning (Milliken 1999, 229). This linkage first required differentiation. Namely, speculative representations were contrasted against other knowledge claims, both specific knowledge claims (that Ebola was known to transmit via bodily fluids) and more general claims to knowledge by prominent public health experts. In terms of specific claims, multiple media commentators and members of Congress paired their speculation about Ebola’s method of transmission with an insistence that “there are things about Ebola we don’t know,” including how it spread (CEC 2014, 2, 127–8). When experts responded to such claims with the assurance that “we” did know these “things,” they were challenged as only having “a lack of evidence as opposed to negative evidence” (CEC 2014, 127–8). Speculation thus involved the construction of ignorance.6 This was more than just the representation of the future as uncertain; it also represented the present as unknown, ambiguous, and open to interpretation. It was in this void that speculation could blur into misinformation. In addition, speculative claims challenged the authority of contesting experts, namely, by representing them as avoiding, dismissing, or keeping quiet about the prospect of airborne Ebola for “political” reasons. This was notable in the comments from Osterholm (2014), who suggested in his speculation about mutation that “virologists [were] loath to discuss [this] openly but are definitely considering [it] in private.” This, as critics noted, undermined trust in other experts, particularly when they challenged such claims (Larimer 2014).

Speculation, moreover, was not confined to the possibility of mutation. Indeed, in the Fox News coverage and Congressional hearings, only 39 percent of speculative claims focused specifically on mutation or airborne transmission. More generally, speculation concerned the “what if” of an American outbreak (154 appearances):

I recently visited with the leadership of one of Texas’s largest health care systems about their precautions to counter Ebola. I asked how many Ebola patients they could reasonably handle with these protocols. “Six or so,” they responded. What if, God forbid, we saw 8,000, as in West Africa? They had no answer. We shouldn’t risk finding out what that answer might be. (Cruz 2014)

This “what if” scenario of an American outbreak took multiple forms. Some—including the Commander of US Southern Command—speculated that Ebola would spread to other “third world countries,” namely, those in Central America whose inhabitants would then migrate to the United States (Gen. John Kelly, quoted in Fox News 2014d). Others implied that every possible visa holder in West Africa—“if we have [. . .] up to 150 West Africans coming in every single day, our people are at risk” (Bill O’Reilly, in Fox News 2014f)—would travel to the United States for better healthcare.

Mutation and airborne transmission were thus part of this story as another “what if.” Actors frequently bounced between these possible “scenarios” in their wider speculation about Ebola’s danger, linking them together via their common anticipatory logic: the possible American outbreak (see Sean Hannity and Ben Carson, in Fox News 2014a). All of these scenarios had to be considered equally, and all had to be treated as threats as viable as the actual, present danger faced by West Africans. Accordingly, while misinformation (that Ebola was an airborne disease or had mutated to become one) and speculation (that Ebola could become airborne) were entwined, both ultimately served as evidence for this wider speculative discourse, which represented the United States as at risk and the current federal response as inadequate in protecting the American homeland from this potential catastrophe.

COVID-19

The role of misinformation during COVID-19, particularly in encouraging disobedience toward public health measures, has been extensively explored (Islam et al. 2020; Gisondi et al. 2022). In contrast to the Ebola case, misinformation was more prevalent within mainstream media coverage and political discourse, with both Fox News and President Donald Trump spreading blatantly false information. Part of this, of course, was the simple fact that COVID-19 was a novel disease. Scientific research was ongoing, consensus was lacking, and verifying false claims was harder than usual. Scholarly analysis has also focused on the role of hyperpartisanship (Golos et al. 2022; Freiling et al. 2023) and social media (Marcon and Caulfield 2021) in facilitating the production and spread of misinformation. Numerous false claims were made and repeated during this time. For the purposes of this analysis, I focus on one particularly prominent claim: that COVID-19 hospitalization and death figures were being purposefully exaggerated.7 I chose this piece of misinformation for two reasons: First, it was especially prominent, to the point of being listed as a “persistent” form of misinformation in the media (Iati 2020; Lewis 2020); and second, it neatly mirrors the Ebola case in challenging expert depictions of threat. In contrast to Ebola, however, this claim did not represent the threat of the disease as worse than experts claimed, but as milder. It thus provides a slightly different angle from which to consider the linkage of misinformation to speculation during moments of health emergency.

As mentioned, misinformation was more prevalent in “mainstream” sources than during Ebola. President Trump himself claimed that official death and hospitalization figures were fraudulent, both in his own X posts (@realDonaldTrump 2021) and in those he retweeted, including two on August 30—one later removed by Twitter (as the platform was then known)—that suggested that the CDC had “quietly updated” its data to “admit that only 6 percent” of all deaths “actually died” from COVID-19 (see Reuters 2020). Overall, however, explicit claims of misinformation on these topics were infrequent in Trump and White House press conferences, with only twelve statements coded as overt misinformation between March and August. Fox News coverage of COVID-19 saw more misinformation. Commentators and reporters claimed 31 times between March and May that death tolls and hospitalization figures were purposefully exaggerated. One notable case was the repeated appearance of State Senator (and medical doctor, as anchors stressed) Scott Jensen on The Ingraham Angle, where he claimed that unrelated deaths—including those from influenza—were being recorded as COVID-19 deaths to obtain greater Medicare funding (Fox News 2020e,g,i).

As in the Ebola case, speculation was far more prominent. While Trump did explicitly claim the numbers were fraudulent, he more frequently questioned the figures as an incomplete (“many of the people who aren’t that sick don’t report”) and overly cautious (“just in case”) picture, with twenty-three speculative claims about the death and hospitalization numbers. Fox News commentators similarly slipped between statements of “fact” and supposition but provided far more references to the latter than the former (158 speculative claims). In some cases, this was pure speculation. The lack of early testing was drawn into an imaginative scenario in which COVID-19 had already spread widely, with most cases being asymptomatic and the death rate thus much lower (see Tucker Carlson, Fox News 2020a). Other cases, however, blurred speculation into misinformation. For example, while Tucker Carlson (Fox News 2020d) did not claim that doctors were “classifying conventional pneumonia deaths as COVID-19 deaths,” he did suggest that “it seems entirely possible.”

The #FilmYourHospital hashtag on X during April 2020 saw significantly more misinformation, with 336 posts (63 percent) constituting overt forms of misinformation (claims that hospitals were empty and COVID-19 was a “hoax”). Speculative posts (those that simply questioned hospitalization figures) formed approximately 20 percent of the posts (112 posts). Yet, again, the split between the two was not always so neat. Some users framed their posts as questions, yet otherwise seemed to believe the premise of #FilmYourHospital (that hospitals were empty), a blending of misinformation and speculation (twenty-five posts). This was a particularly common occurrence in response to viral videos of healthcare workers dancing: “if [people are dropping dead every second], why do they have time to choreograph, rehearse, and then publish these dances?” (@Chloe7Action 2020).

Tracing the discursive linkages between speculation and misinformation reveals representations similar to those during Ebola. Speculative claims were reiterated as factual ones, while those reporting high hospitalization and death figures were represented as politically motivated, whether to “scare the living daylights out of you” (Sean Hannity, in Fox News 2020c) or for economic gain (Fox News 2020h,j,l). On X, users worried about the power of the “deep state” and the supposedly malevolent intentions of key figures such as Bill Gates or Anthony Fauci (eighteen posts). Importantly, a similar construction of ignorance was also apparent:

Several months in, there’s still a huge amount we don’t know about coronavirus. We don’t know how easily it spreads, how long it can survive outside the body. We don’t know what the real death rate is. We don’t know how many people currently have it, or how many have had it and recovered and most importantly, we don’t know the best way to treat it. (Tucker Carlson, in Fox News 2020e)

There was accuracy in this construction of ignorance: COVID-19 was still a relatively novel and poorly understood disease, and hospitalization and death figures in particular varied depending on the criteria used. Yet this was also a careful framing of the knowledge production occurring at the time, one that allowed speakers to deny particular knowledge claims while making others. Fox News commentators represented themselves as the real truth-seekers, speculating on the basis of “real facts, real science” (Laura Ingraham, Fox News 2020e), in contrast to “the discredited professional class,” which was portrayed as relying on “stale conventional wisdom” (Tucker Carlson, in Fox News 2020f) and “counting on people not to actually look at the data” (Laura Ingraham, in Fox News 2020k). Speakers thus used this construction of ignorance to represent themselves as authoritative knowledge producers while positioning others claiming this subject position as frauds. Rational speculation that the hospitalization and death figures were inaccurate thus blurred into misleading statements of “fact” that they were deliberately so.

While the blurring of misinformation and speculation is notable, this did not occur in a discursive vacuum. Both sets of claims were ultimately linked to a larger speculative narrative regarding COVID-19: “if the numbers are not right, then maybe that affects the decision-making about when the country can gradually get back to business” (Laura Ingraham, Fox News 2020b). What if COVID-19 was not as bad as suggested? This speculative narrative appeared 207 times in the analysis. Specific speculative claims regarding the hospitalization and death rates overlapped with this broader narrative 47 times. Moreover, contained within this “what if” were representations of COVID-19 as equivalent to the seasonal flu (seventy-nine references) and other outbreaks the United States had survived (seven comparisons). Clear linkages between these representations and the speculative claims/misinformation regarding the mortality and hospitalization figures were drawn: “At one time we were going to have fatalities 10 times the flu. Now we’re going to have fatalities closer to the flu” (Rudy Giuliani, Fox News 2020d).

Speculation went beyond just COVID-19’s severity. If COVID-19 was not as bad as claimed, then the emergency measures implemented—and their devastating economic and social impact—were not justifiable. Trump (in White House 2020) repeatedly asserted that the United States would see more death from suicides, drug abuse, poverty, and domestic violence “than anything that we’re talking about with respect to the virus.” Commentators and guests on Fox News similarly asserted that “the cure is worse than the problem” (a phrase that appeared roughly 21 times), citing the same harms. X users were especially concerned with this, suggesting that the “lies” of high hospitalization and death figures demonstrated how unnecessary, arbitrary, and damaging such emergency measures were. Some posts even employed the #FilmYourHospital hashtag to declare their support for the protests against stay-at-home measures that occurred that month. On both X and Fox News, some suggested that opposing politicians (namely, Democrats) and experts (namely, Anthony Fauci) were purposefully exaggerating COVID-19’s danger to implement a preferred “new normal”: “states like California, they’re using the virus as a pretext to push a far-left agenda” (Sean Hannity, Fox News 2020j). Ultimately, this concern was the most commonly appearing representation, coded 340 times across the White House and Fox News statements and 59 times in the social media posts. The result was thus not merely a downgrading of COVID-19’s threat, but its replacement with another threat: the measures themselves, as well as the experts and politicians who advocated for them.

Conclusion: The Hydra of Misinformation

While necessarily a brief analysis of two complex health emergencies, this examination of misinformation within the US debates over Ebola and COVID-19 demonstrates that speculation is an important dynamic. Both cases saw their public debates dominated by speculative discourses that countered expert guidance: the prospect of an American Ebola outbreak and the possibility that COVID-19 was being used to dramatically alter normal American life. These “what ifs” facilitated the production and spread of incorrect claims (both speculation and outright misinformation), but they also grew out of such assertions in the first place. Ultimately, we cannot properly understand how so many could believe in airborne Ebola or fervently insist that COVID-19 death and hospitalization numbers were fraudulent without locating these claims within these broader speculative narratives.

More generally, and to return to the initial goal of this paper, if we are to truly grasp—and address—the problem of misinformation in global health crises, we must consider how misinformation sits within discourses of threat speculation. This connection between speculation and misinformation has precursors, namely in work that has considered other scientific controversies (see, for instance, Larson 2020 on the role of “rumors” in vaccine hesitancy). Yet we must also consider the unique context of emergency politics. As CSS scholars have outlined in detail, contemporary understandings of risk and threat increasingly draw from imaginative scenarios and possibilistic assessments. Speculation becomes central to knowledge production, particularly during health emergencies, where ambiguity is heightened by the scientific complexity of the danger. The notion that misinformation—both overt claims and speculative ones that blur into them—would result is thus not so far-fetched. In one important way, this paper has sought to more firmly connect these literatures. It has also sought to use this connection to move away from dismissive portrayals of those who spread and believe misinformation as irrational, highlighting how such belief can result from the mere desire to prepare for a possibly catastrophic future.

Yet it remains important to recognize that speculation is not a single answer to this problem, just as the “infodemic,” “post-truth politics,” and motivated reasoning explanations do not provide single, comprehensive answers. All these components play an important role in facilitating the production, spread, and acceptance of misinformation and its deliberately deceptive sibling, disinformation. Indeed, this is one of our current problems in effectively addressing misinformation. Misinformation is like a mythological hydra, a creature with many heads. Cutting off one head—by fact-checking, by moderating social media, by inoculation—does not kill the beast. These assaults can help mitigate its damage, but the problem is a multifaceted one. By recognizing this and identifying each of the components involved, we stand a better chance of properly managing the problem.

Acknowledgments

An early version of this paper was presented at the 2023 International Studies Association conference and received the Global Health section’s 2024 prize for Best Early Investigator Paper. I am grateful for the insightful feedback provided by the panel’s discussant and audience, and the members of the Global Health section’s prize committee. I would also like to thank the anonymous reviewers for their constructive reviews, and the editors of IPS for a very smooth and constructive process.

Footnotes

1

One key area of dispute, for instance, lies around the idea of “post-truth” politics and whether critical scholarship such as that within STS is to blame for it (see Latour 2004; Collins et al. 2017; Jasanoff and Simmet 2017).

2

See CASE Collective (2006, 443) and Browning and McDonald (2013, 236).

3

I am grateful to Reviewer 4 for highlighting this distinction.

4

See, for instance, Amoore’s (2013, 57–9) discussion of former British Prime Minister Tony Blair’s justification for entering the Iraq war based on misinformation about weapons of mass destruction.

5

The #FilmYourHospital hashtag involved users encouraging others to take photos or videos of empty waiting rooms and parking lots to “prove” that hospitals were empty.

6

As a larger body of scholarship from STS and feminist scholars has argued, ignorance is not merely the absence of knowledge but “a force all its own which often blocks knowledge, stands in its place, and tacitly or more explicitly affirms a need or a commitment not to know” (Code 2014, 154). See Aradau 2017 for an overview and examination of this construction of ignorance.

7

It is necessary to note that, as with quantitative statistics in general (see Merry 2016), hospitalization and death figures were always problematic, and valid questions could be—and were—asked about how such figures were obtained and what biases they contained. The focus here is thus on the claim that such numbers were fraudulent and wildly exaggerated.

References

@CHLOE7ACTION. 2020. “Because Apparently People Are Dropping Dead Every Second. If That Were the Case….” Twitter, April 20. Accessed December 15, 2022. https://twitter.com/Chloe7Acton/status/1251964164299460608.

@REALDONALDTRUMP. 2021. “The Number of Cases and Deaths of the China Virus Is Far Exaggerated in the United States because of @CDCgov’s Ridiculous….” Twitter, January 3. Accessed September 15, 2022. https://twitter.com/realDonaldTrump/status/1345720107255926784?s=20.

9/11 Commission. 2004. “The National Commission on Terrorist Attacks upon the United States Report.”

Amoore, Louise, and Marieke De Goede. 2008. “Transactions after 9/11: The Banal Face of the Preemptive Strike.” Transactions of the Institute of British Geographers 33 (2): 173–85.

Amoore, Louise. 2013. The Politics of Possibility. Durham, NC: Duke University Press.

Aradau, Claudia, and Rens Van Munster. 2007. “Governing Terrorism through Risk: Taking Precautions, (Un)Knowing the Future.” European Journal of International Relations 13 (1): 89–115.

Aradau, Claudia, and Rens Van Munster. 2011. Politics of Catastrophe: Genealogies of the Unknown. Hoboken, NJ: Taylor & Francis.

Aradau, Claudia. 2017. “Assembling (Non)Knowledge: Security, Law, and Surveillance in a Digital World.” International Political Sociology 11 (4): 327–42.

Bakir, Vian, and Andrew McStay. 2018. “Fake News and the Economy of Emotions.” Digital Journalism 6 (2): 154–75.

Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage Publications.

Bennett, W. Lance, and Steven Livingston. 2018. “The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions.” European Journal of Communication 33 (2): 122–39.

Best, Jacqueline. 2008. “Ambiguity, Uncertainty, and Risk: Rethinking Indeterminacy.” International Political Sociology 2 (4): 355–74.

Bigo, Didier. 2002. “Security and Immigration: Toward a Critique of the Governmentality of Unease.” Alternatives: Global, Local, Political 27 (1): 63–92.

Bougen, Phillip, and Pat O’Malley. 2008. “Bureaucracy, Imagination and U.S. Domestic Security Policy.” Security Journal 22: 101–18.

Browning, Christopher S., and Matt McDonald. 2013. “The Future of Critical Security Studies: Ethics and the Politics of Security.” European Journal of International Relations 19 (2): 235–55.

C.A.S.E. Collective. 2006. “Critical Approaches to Security in Europe: A Networked Manifesto.” Security Dialogue 37 (4): 443–87.

Chandler, Clare, James Fairhead, Ann Kelly, Melissa Leach, Frederick Martineau, Esther Mokuwa, Melissa Parker, Paul Richards, and Annie Wilkinson. 2015. “Ebola: Limitations of Correcting Misinformation.” The Lancet 385 (9975): 1275–77.

Code, Lorraine. 2014. “Ignorance, Injustice and the Politics of Knowledge: Feminist Epistemology Now.” Australian Feminist Studies 29 (80): 148–60.

Cohen, Elizabeth. 2014. “Ebola in the Air? A Nightmare That Could Happen.”

Collier, Stephen J., and Andrew Lakoff. 2008. “The Problem of Securing Health.” In Biosecurity Interventions: Global Health and Security in Question, edited by Andrew Lakoff and Stephen J. Collier. New York, NY: Columbia University Press.

Collins, Harry, Robert Evans, and Martin Weinel. 2017. “STS as Science or Politics?” Social Studies of Science 47 (4): 580–86.

Committee on Energy and Commerce (CEC). 2014. Examining the US Public Health Response to the Ebola Outbreak. Hearing before the Subcommittee on Oversight and Investigation, 113th Congress. Washington, DC: US Government Publishing Office.

Crook, Brittani, Elizabeth M. Glowacki, Melissa Suran, Jenine K. Harris, and Jay M. Bernhardt. 2016. “Content Analysis of a Live CDC Twitter Chat during the 2014 Ebola Outbreak.” Communication Research Reports 33 (4): 349–55.

Cruz, Ted. 2014. “Op-ed: Ban Flights from Ebola-Stricken Nations.” TribTalk, October 15. Accessed September 19, 2022. http://www.tribtalk.org/2014/10/15/ban-flights-from-ebola-stricken-nations/.

De Goede, Marieke, and Beatrice De Graaf. 2013. “Sentencing Risk: Temporality and Precaution in Terrorism Trials.” International Political Sociology 7 (3): 313–31.

De Goede, Marieke. 2008. “Beyond Risk: Premediation and the Post-9/11 Security Imagination.” Security Dialogue 39 (2–3): 155–76.

Ecker, Ullrich K. H., Stephan Lewandowsky, John Cook, Philipp Schmid, Lisa K. Fazio, Nadia Brashier, Panayiota Kendeou, Emily K. Vraga, and Michelle A. Amazeen. 2022. “The Psychological Drivers of Misinformation Belief and Its Resistance to Correction.” Nature Reviews Psychology 1 (1): 13–29.

Ewald, François. 2002. “The Return of Descartes’s Malicious Demon.” In Embracing Risk, edited by Tom Baker and Jonathan Simon, 273–302. Chicago, IL: University of Chicago Press.

Eyal, Gil. 2019. The Crisis of Expertise. Cambridge: Polity Press.

Eysenbach, Gunther. 2002. “Infodemiology: The Epidemiology of (Mis)Information.” The American Journal of Medicine 113 (9): 763–65.

Eysenbach, Gunther. 2020. “How to Fight an Infodemic.” Journal of Medical Internet Research 22 (6): e21820.

Fischer, Frank. 1990. Technocracy and the Politics of Expertise. Newbury Park, CA: Sage Publications.

Fischer, Frank. 2019. “Knowledge Politics and Post-Truth in Climate Denial.” Critical Policy Studies 13 (2): 133–52.

Fischer, Frank. 2022. “Post-Truth Populism and Scientific Expertise.” International Review of Public Policy 4 (1): 115–22.

Fox News. 2014a. “Hannity, Broadcast Transcript.” Factiva database, October 13.

Fox News. 2014b. “On the Record with Greta, Broadcast Transcript.” Factiva database, October 14.

Fox News. 2014c. “Hannity, Broadcast Transcript.” Factiva database, October 14.

Fox News. 2014d. “The O’Reilly Factor, Broadcast Transcript.” Factiva database, October 15.

Fox News. 2014e. “Fox News Sunday, Broadcast Transcript.” Factiva database, October 19.

Fox News. 2014f. “The O’Reilly Factor, Broadcast Transcript.” Factiva database, October 20.

Fox News. 2014g. “Hannity, Broadcast Transcript.” Factiva database, October 20.

Fox News. 2020a. “Tucker Carlson Tonight, Broadcast Transcript.” Factiva database, March 24.

Fox News. 2020b. “The Ingraham Angle, Broadcast Transcript.” Factiva database, March 24.

Fox News. 2020c. “Hannity, Broadcast Transcript.” Factiva database, March 27.

Fox News. 2020d. “The Ingraham Angle, Broadcast Transcript.” Factiva database, March 27.

Fox News. 2020e. “The Ingraham Angle, Broadcast Transcript.” Factiva database, March 31.

Fox News. 2020f. “Tucker Carlson Tonight, Broadcast Transcript.” Factiva database, March 31.

Fox News. 2020g. “Tucker Carlson Tonight, Broadcast Transcript.” Factiva database, April 7.

Fox News. 2020h. “The Ingraham Angle, Broadcast Transcript.” Factiva database, April 8.

Fox News. 2020i. “The Ingraham Angle, Broadcast Transcript.” Factiva database, April 13.

Fox News. 2020j. “Hannity, Broadcast Transcript.” Factiva database, April 15.

Fox News. 2020k. “The Ingraham Angle, Broadcast Transcript.” Factiva database, April 28.

Freiling, Isabelle, Nicole M. Krause, Dietram A. Scheufele, and Dominique Brossard. 2023. “Believing and Sharing Misinformation, Fact-Checks, and Accurate Information on Social Media.” New Media & Society 25 (1): 141–62.

Garrett, Laurie. 2014. “Five Myths about Ebola.”

Gazendam, Aaron, Seper Ekhtiari, Erin Wong, Kim Madden, Leen Naji, Mark Phillips, Raman Mundi, and Mohit Bhandari. 2020. “The ‘Infodemic’ of Journal Publication Associated with the Novel Coronavirus Disease.” Journal of Bone and Joint Surgery 102 (13): e64.

Gisondi, Michael A., Rachel Barber, Jeremy Samuel Faust, Ali Raja, Matthew C. Strehlow, Lauren M. Westafer, and Michael Gottlieb. 2022. “A Deadly Infodemic: Social Media and the Power of COVID-19 Misinformation.” Journal of Medical Internet Research 24 (2): e35552.

Goldenberg, Maya J. 2021. Vaccine Hesitancy: Public Trust, Expertise, and the War on Science. Pittsburgh, PA: University of Pittsburgh Press.

Golos, Aleksandra M., Daniel J. Hopkins, Syon P. Bhanot, and Alison M. Buttenheim. 2022. “Partisanship, Messaging, and the COVID-19 Vaccine.” American Journal of Health Promotion 36 (4): 602–11.

Grusin, Richard. 2004. “Premediation.” Criticism 46 (1): 17–39.

Guess, Andrew M., Michael Lerner, Benjamin Lyons, Jacob M. Montgomery, Brendan Nyhan, Jason Reifler, and Neelanjan Sircar. 2020. “A Digital Media Literacy Intervention Increases Discernment between Mainstream and False News in the United States and India.” Proceedings of the National Academy of Sciences 117 (27): 15536–45.

Huysmans, Jef. 1998. “The Question of the Limit: Desecuritisation and the Aesthetics of Horror in Political Realism.” Millennium: Journal of International Studies 27 (3): 569–89.

Iati, Marisa. 2020. “8 Facts about the Coronavirus to Combat Common Misinformation.” Washington Post, December 4. Accessed September 10, 2022. https://www.washingtonpost.com/health/2020/12/05/coronavirus-misinformation-facts/.

Islam, Md Saiful, Tonmoy Sarkar, Sazzad Hossain Khan, Abu-Hena Mostofa Kamal, S.M. Murshid Hasan, Alamgir Kabir, Dalia Yeasmin, et al. 2020. “COVID-19–Related Infodemic and Its Impact on Public Health.” The American Journal of Tropical Medicine and Hygiene 103 (4): 1621–29.

Iyengar, Shanto, and Douglas S. Massey. 2019. “Scientific Communication in a Post-Truth Society.” Proceedings of the National Academy of Sciences 116 (16): 7656–61.

Iyengar, Shanto, and Sean J. Westwood. 2015. “Fear and Loathing across Party Lines.” American Journal of Political Science 59 (3): 690–707.

Jasanoff, Sheila, and Hilton R. Simmet. 2017. “No Funeral Bells: Public Reason in a ‘Post-Truth’ Age.” Social Studies of Science 47 (5): 751–70.

Jasanoff, Sheila. 2004. “The Idiom of Co-Production.” In States of Knowledge, edited by Sheila Jasanoff. London: Routledge.

Jerit, Jennifer, and Yangzi Zhao. 2020. “Political Misinformation.” Annual Review of Political Science 23 (1): 77–94.

Krause, Nicole M., Isabelle Freiling, and Dietram A. Scheufele. 2022. “The ‘Infodemic’ Infodemic: Toward a More Nuanced Understanding of Truth-Claims and the Need for (Not) Combatting Misinformation.” The ANNALS of the American Academy of Political and Social Science 700 (1): 112–23.

Kunda, Ziva. 1990. “The Case for Motivated Reasoning.” Psychological Bulletin 108 (3): 480–98.

Larimer, Sarah. 2014. “Will the Ebola Virus Go Airborne? (And Is That Even the Right Question?)” Washington Post, September 16. Retrieved from Factiva database.

Larson, Heidi J. 2020. Stuck. New York, NY: Cambridge University Press.

Latour, Bruno, and Steve Woolgar. 1979. Laboratory Life: The Construction of Scientific Facts. Princeton, NJ: Princeton University Press.

Latour, Bruno. 2004. “Why Has Critique Run Out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry 30 (2): 225–48.

Lewandowsky, Stephan, Ullrich K. H. Ecker, and John Cook. 2017. “Beyond Misinformation: Understanding and Coping with the ‘Post-Truth’ Era.” Journal of Applied Research in Memory and Cognition 6 (4): 353–69.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106–31.

Lewis, Tanya. 2020. “How the U.S. Pandemic Response Went Wrong and What Went Right during a Year of COVID.”

Makhortykh, Mykola, Aleksandra Urman, and Roberto Ulloa. 2020. “How Search Engines Disseminate Information about COVID-19 and Why They Should Do Better.” Harvard Kennedy School Misinformation Review 1 (3): 1–12.

Marcon, Alessandro R., and Timothy Caulfield. 2021. “The Hydroxychloroquine Twitter War: A Case Study Examining Polarization in Science Communication.” First Monday 26 (10). https://firstmonday.org/ojs/index.php/fm/article/view/11707.

Merry, Sally Engle. 2016. The Seductions of Quantification. Chicago, IL: University of Chicago Press.

Milliken, Jennifer. 1999. “The Study of Discourse in International Relations: A Critique of Research and Methods.” European Journal of International Relations 5 (2): 225–54.

Monsees, Linda. 2023. “Information Disorder, Fake News and the Future of Democracy.” Globalizations 20 (1): 153–68.

National Broadcasting Company (NBC). 2014. “Meet the Press, Broadcast Transcript.” Factiva database, September 7.

Neal, Andrew W. 2009. Exceptionalism and the Politics of Counter-Terrorism. London: Routledge.

Neblo, Michael A., and Jeremy L. Wallace. 2021. “A Plague on Politics? The COVID Crisis, Expertise, and the Future of Legitimation.” American Political Science Review 115 (4): 1524–29.

Nichols, Thomas M. 2017. The Death of Expertise. New York, NY: Oxford University Press.

Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: New York University Press.

Nøkleberg, Martin. 2022. “Expecting the Exceptional in the Everyday.” Security Dialogue 53 (2): 164–81.

Osterholm, Michael T. 2014. “What We’re Afraid to Say about Ebola.” New York Times, September 12. https://www.nytimes.com/2014/09/12/opinion/what-were-afraid-to-say-about-ebola.html.

Oyeyemi, Sunday Oluwafemi, Elia Gabarron, and Rolf Wynn. 2014. “Ebola, Twitter, and Misinformation: A Dangerous Combination?” BMJ 349: g6178.

Parker, Lisa, Jennifer A. Byrne, Micah Goldwater, and Nick Enfield. 2021. “Misinformation: An Empirical Study with Scientists and Communicators during the COVID-19 Pandemic.” BMJ Open Science 5 (1): e100188.

Reiljan, Andres. 2020. “‘Fear and Loathing across Party Lines’ (Also) in Europe: Affective Polarisation in European Party Systems.” European Journal of Political Research 59 (2): 376–96.

Reuters. 2020. “Fact Check: 94% of Individuals with Additional Causes of Death Still Had COVID-19.” September 4. https://www.reuters.com/article/world/fact-check-94-of-individuals-with-additional-causes-of-death-still-had-covid-1-idUSKBN25U2I4/.

Rushton, Katherine. 2014. “Ebola ‘Could Become Airborne’: United Nations Warns of ‘Nightmare Scenario’ as Virus Spreads to the US.”

Salter, Mark B. 2008. “When the Exception Becomes the Rule: Borders, Sovereignty, and Citizenship.” Citizenship Studies 14 (4): 365–80.

Scheufele, Dietram A., Nicole M. Krause, and Isabelle Freiling. 2021. “Misinformed about the ‘Infodemic?’ Science’s Ongoing Struggle with Misinformation.” Journal of Applied Research in Memory and Cognition 10 (4): 522–26.

The Lancet. 2020. “The COVID-19 Infodemic.” The Lancet Infectious Diseases 20 (8): 875.

Vosoughi, Soroush, Deb Roy, and Sinan Aral. 2018. “The Spread of True and False News Online.” Science 359 (6380): 1146–51.

Vraga, Emily K., and Leticia Bode. 2020. “Defining Misinformation and Understanding Its Bounded Nature.” Political Communication 37 (1): 136–44.

Walter, Nathan, and Riva Tukachinsky. 2020. “A Meta-Analytic Examination of the Continued Influence of Misinformation in the Face of Correction.” Communication Research 47 (2): 155–77.

West, Jevin D., and Carl T. Bergstrom. 2021. “Misinformation in and about Science.” Proceedings of the National Academy of Sciences 118 (15): e1912444117.

White House. 2020. “Remarks by President Trump, Vice President Pence, and Members of the Coronavirus Task Force in Press Briefing.”

World Health Organization (WHO). 2020a. “Novel Coronavirus (2019-nCoV) Situation Report – 13.” February 2. https://www.who.int/publications/m/item/situation-report---13.

World Health Organization (WHO). 2020b. “Immunizing the Public against Misinformation.” August 25.

World Health Organization (WHO). 2022. “WHO to Identify Pathogens That Could Cause Future Outbreaks and Pandemics.” November 21.

Zimmerman, Fabian, and Matthias Kohring. 2020. “Mistrust, Disinforming News, and Vote Choice.” Political Communication 37 (2): 215–37.

Author notes

Jessica Kirk is a Postdoctoral Research Fellow at the Centre for Governance and Public Policy, Griffith University, Australia.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.