6 questions for the medical historian
Attendants await patients in the influenza ward of the U.S. Naval Hospital on Mare Island, California, December 10, 1918.

Professor Catherine Mas explores the roots of our “war” on contagion and the lessons we might take away from the current pandemic.

May 5, 2020 at 1:00pm

Catherine Mas is a professor in the Green School of International & Public Affairs who specializes in the history of medicine. Her research and teaching focus on the intersections of health, race and religion in the Americas. Mas already had plans to teach a course in the fall on the topic of global health when the current crisis hit.

You’ve written about the use of military metaphor for disease. Why do we as Americans gravitate to such language?
It’s not a given that we think of our bodies as “fighting” off a bad cold or “battling” cancer. Or that disinfectants “kill” bacteria. Or that our white blood cells “attack” pathogens that “invade” the body. This way of thinking about disease and immunity dates back at least to the eighteenth century, when the most dreaded plague confronting colonial America was smallpox—highly contagious, deadly and disfiguring. 

In the midst of the outbreak, Puritan minister Cotton Mather looked into his microscope and, like other scientific observers, described the unfamiliar in familiar language: “unseen armies of numberless things, ready to seize and prey upon us!”   

This language of warfare stresses the boundaries between the body and a dangerous external environment. In the case of epidemic disease, warfare imagery lends itself to an us-versus-them narrative of disease outbreak.

You’ve warned that a danger lies in such personification. How? 
Throughout modern American history, epidemics have stirred xenophobic sentiments. As the military metaphor heightens people’s fears of disease as a foreign threat, immigrants and those perceived as “foreign” have often become scapegoats. That was the case in 1832, when Irish Catholic immigrants were blamed for a cholera outbreak, and again in 1892, when another cholera outbreak led to a surge of anti-Semitism in New York. When San Francisco faced an outbreak of bubonic plague in 1900, local leaders saw Chinese immigrants as a health menace, and some even suggested burning down Chinatown as a reasonable solution. Today, as we “fight” COVID-19, this tragic historical pattern has manifested once again, in reports of heightened prejudice towards Americans of Asian descent.

Does the language of warfare have any value in terms of our preparedness for a pandemic?
Being “at war” with a virus is more than just metaphor. As sociologist Andrew Lakoff describes in his 2017 book Unprepared, our current system of global health security is bound up with specific military concerns of the twentieth and twenty-first centuries.

When World War II ushered in the atomic age, the threat of a nuclear attack led the United States to invest heavily in creating a system of civil defense, to protect us against a potential catastrophic event. In the 1950s, emergency “preparedness” planners developed computer programs to simulate attack patterns, map vulnerabilities in major cities, and coordinate an emergency response across municipal, state and federal governments. Initially, the focus was on an adversarial threat—the Soviet Union and nuclear weapons during the Cold War—but in the 1960s and ’70s, preparedness planning expanded to a broader range of national emergencies, from a devastating hurricane, for example, to a viral pandemic.

So, bioterrorism has kind of been mixed in with fears of seemingly “natural” spread of contagious disease?
The War on Terror has definitely shaped new concerns about the threat of bioterrorism. In the late 1990s, reports of Soviet bioweapons programs revealed that the smallpox virus had been transformed into a weapon. Even though such a biological attack was improbable, it quickly became the primary focus of civil defense experts, who worried that the United States was especially vulnerable to a smallpox attack because of a 20-year lapse in routine vaccination since its eradication in the 1970s. In 1999, the government established the Office of Bioterrorism Preparedness and Response, which funneled $40 million a year to local public health departments to prepare for a bioterrorist event. 

Following the 9/11 attacks and a series of anthrax attacks in the fall of 2001, President George W. Bush made “bioterrorism preparedness” an urgent priority. In 2002, the White House ramped up funding and called for the CDC to develop the Smallpox Vaccination Program.

How did such prioritizing of public health as a means to prevent a possible terror attack pay off, or did it?
With the renewed funding that came out of the post-9/11 threat of bioterrorism, health officials re-directed security concerns to more probable dangers, including the threat of a viral flu pandemic. The Department of Health and Human Services began a program of “pandemic preparedness,” which involved cultivating transnational partnerships to improve global influenza surveillance, building the national stockpile of antiviral medications, improving vaccine capacity, and investing in basic virology research. American health officials coordinated with the WHO to broaden its global security initiatives and accelerate planning for pandemic influenza. And in the U.S., political momentum was building in support of pandemic preparedness, with congressmembers arguing that a permanent framework was needed to respond to emerging diseases, one that would coordinate action between public health experts, medical industries and international organizations.

In 2005, Senators Barack Obama and Richard Lugar advocated for public health preparedness by painting the issue in terms of an urgent military concern. The two men wrote, “another kind of threat lurks beyond our shores, one from nature, not humans—an avian flu pandemic.” By late 2006, Congress passed the Pandemic and All-Hazards Preparedness Act, which would address these building concerns.

What have we learned from the current pandemic that is critical to avoiding the next?
The pandemic has exposed an infrastructural crisis in our health care system—from failures in testing to overcrowded hospitals and medical supply shortages. It has amplified disparities and systemic racism in health care, with staggering rates of COVID-19 deaths among black and Latino populations. What’s more, the economic fallout from the pandemic will disproportionately hurt poor and already marginalized communities. As jobless numbers climb and millions lose employer-based health insurance, the lack of a social safety net in America will mean that large populations will be at even greater health risk.

These are internal security threats that a military metaphor of disease and public health action has obscured or ignored. What if, rather than waiting for a foreign threat to mobilize resources and invest in public health, we placed a higher value on maintaining the health of all Americans? That might mean investing in a more equitable health care system with an outlook of prevention of all kinds of health problems, rather than mere “preparedness” for an as-yet-unknown catastrophe.

We might heed the words of German President Frank-Walter Steinmeier, who commented in a recent televised address: “No, this is not a war. Nations do not stand against nations, nor soldiers against soldiers. Rather it is a test of our humanity. It brings out the best and worst in people. Let’s show each other the best in us.”

Gallery: images of the 1918 flu pandemic, one of the deadliest events in human history