The last 20 months turned many of us into amateur epidemiologists and statisticians. Meanwhile, a group of bona fide epidemiologists and statisticians came to believe that pandemic problems might be more effectively solved by adopting the mindset of an engineer: that is, focusing on pragmatic problem-solving with an iterative, adaptive strategy to make things work.
In a recent essay, “Accounting for uncertainty during a pandemic,” the researchers reflect on their roles during a public health emergency and on how they could be better prepared for the next crisis. The answer, they write, may lie in reimagining epidemiology with more of an engineering perspective and less of a “pure science” perspective.
Epidemiological research informs public health policy, with its inherently applied mandate of prevention and protection. But the right balance between pure research and pragmatic solutions proved alarmingly elusive during the pandemic.
“I always imagined that in this kind of emergency, epidemiologists would be useful people,” Jon Zelner, a coauthor of the essay, says. “But our role has been more complex and more poorly defined than I had expected at the outset of the pandemic.” An infectious disease modeler and social epidemiologist at the University of Michigan, Zelner witnessed an “insane proliferation” of research papers, “many with very little thought about what any of it really meant in terms of having a positive impact.”
“There were a number of missed opportunities,” Zelner says—caused by missing links between the ideas and tools epidemiologists proposed and the world they were meant to help.
Giving up on certainty
Coauthor Andrew Gelman, a statistician and political scientist at Columbia University, set out “the bigger picture” in the essay’s introduction. He likened the pandemic’s outbreak of amateur epidemiologists to the way war makes every citizen into an amateur geographer and tactician: “Instead of maps with colored pins, we have charts of exposure and death counts; people on the street argue about infection fatality rates and herd immunity the way they might have debated wartime strategies and alliances in the past.”
And along with all the data and public discourse—Are masks still necessary? How long will vaccine protection last?—came the barrage of uncertainty.
In trying to understand what just happened and what went wrong, the researchers (who also included Ruth Etzioni at the University of Washington and Julien Riou at the University of Bern) conducted something of a reenactment. They examined the tools used to tackle challenges such as estimating the rate of transmission from person to person and the number of cases circulating in a population at any given time. They assessed everything from data collection (the quality of data and its interpretation were arguably the biggest challenges of the pandemic) to model design to statistical analysis, as well as communication, decision-making, and trust. “Uncertainty is present at each step,” they wrote.
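To make one of those challenges concrete, consider estimating the rate of transmission. A standard family of methods treats the effective reproduction number as a ratio: today's new cases divided by a weighted sum of recent cases. The sketch below is a minimal, simplified version of that idea; the case counts and serial-interval weights are illustrative assumptions, not data from the essay.

```python
import numpy as np

# Illustrative daily case counts (made-up numbers for this sketch)
cases = np.array([10, 12, 15, 20, 26, 31, 40, 48, 60, 70, 85, 99])

# Assumed serial-interval weights for lags of 1..5 days; real analyses
# estimate this distribution from contact-tracing data
w = np.array([0.1, 0.2, 0.3, 0.25, 0.15])

def estimate_rt(cases, w):
    """Crude R_t: today's cases divided by the weighted sum of recent
    cases (the 'infection pressure'), in the spirit of standard
    renewal-equation estimators."""
    rt = []
    for t in range(len(w), len(cases)):
        # Pair cases[t-1], cases[t-2], ... with w[0], w[1], ...
        pressure = np.dot(cases[t - len(w):t][::-1], w)
        rt.append(cases[t] / pressure)
    return np.array(rt)

print(np.round(estimate_rt(cases, w), 2))
```

Even in this toy version, every ingredient carries uncertainty: the counts are undercounts, the serial interval is itself estimated, and the model is a simplification.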
And yet, Gelman says, the essay still “doesn’t quite express enough of the confusion I went through during those early months.”
One tactic against all the uncertainty is statistics. Gelman thinks of statistics as “mathematical engineering”—methods and tools that are as much about measurement as discovery. The statistical sciences attempt to illuminate what’s going on in the world, with a spotlight on variation and uncertainty. When new evidence arrives, it should generate an iterative process that gradually refines previous knowledge and hones certainty.
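One standard way to formalize that iterative refinement is Bayesian updating, in which each new batch of evidence revises a prior estimate. A minimal sketch, assuming a toy test-positivity problem (the prior and the batch counts below are invented for illustration):

```python
from scipy import stats

# Assumed prior belief about a positivity rate: Beta(2, 18),
# centered near 10 percent
a, b = 2, 18

# Each new batch of test results refines the estimate (toy numbers)
for positives, total in [(5, 40), (9, 60), (20, 100)]:
    a += positives
    b += total - positives
    post = stats.beta(a, b)
    print(f"mean={post.mean():.3f}, "
          f"90% interval=({post.ppf(0.05):.3f}, {post.ppf(0.95):.3f})")
```

Each batch narrows the interval: the estimate grows more certain, but never fully certain.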
Susan Holmes, a statistician at Stanford who was not involved in this research, also sees parallels with the engineering mindset. “An engineer is always updating their picture,” she says—revising as new data and tools become available. In tackling a problem, an engineer offers a first-order approximation (blurry), then a second-order approximation (more focused), and so on.
Gelman, however, has previously warned that statistical science can be deployed as a machine for “laundering uncertainty”—deliberately or not, crappy (uncertain) data are rolled together and made to seem convincing (certain). Statistics wielded against uncertainties “are all too often sold as a sort of alchemy that will transform these uncertainties into certainty.”
We witnessed this during the pandemic. Drowning in upheaval and unknowns, epidemiologists and statisticians—amateur and expert alike—grasped for something solid as they tried to stay afloat. But as Gelman points out, wanting certainty during a pandemic is inappropriate and unrealistic. “Premature certainty has been part of the challenge of decisions in the pandemic,” he says. “This jumping around between uncertainty and certainty has caused a lot of problems.”
Letting go of the desire for certainty can be liberating, he says. And this, in part, is where the engineering perspective comes in.
A tinkering mindset
For Seth Guikema, co-director of the Center for Risk Analysis and Informed Decision Engineering at the University of Michigan (and a collaborator of Zelner’s on other projects), a key aspect of the engineering approach is diving into the uncertainty, analyzing the mess, and then taking a step back, with the perspective “We have to make practical decisions, so how much does the uncertainty really matter?” Because if there’s a lot of uncertainty—and if the uncertainty changes what the optimal decisions are, or even what the good decisions are—then that’s important to know, says Guikema. “But if it doesn’t really affect what my best decisions are, then it’s less critical.”
Increasing SARS-CoV-2 vaccination coverage across the population is one such scenario: even if there is some uncertainty about exactly how many cases or deaths vaccination will prevent, it is highly likely to decrease both, with few adverse effects. That is motivation enough to decide that a large-scale vaccination program is a good idea.
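A toy simulation makes Guikema's point concrete. All of the numbers below (baseline deaths, coverage, and the effectiveness range) are assumptions for the sketch, not real estimates; the point is that the uncertainty, though wide, never flips which decision is better.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed inputs for the sketch, not real estimates
baseline_deaths = 10_000                               # no program
effectiveness = rng.uniform(0.5, 0.95, size=100_000)   # wide uncertainty
coverage = 0.7                                         # assumed uptake

deaths_with_program = baseline_deaths * (1 - coverage * effectiveness)

# Despite the wide uncertainty in effectiveness, every simulated draw
# favors vaccinating: the uncertainty doesn't change the best decision.
frac = np.mean(deaths_with_program < baseline_deaths)
print(f"program better in {frac:.0%} of draws")
```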
Engineers, Holmes points out, are also very good at breaking problems down into critical pieces, applying carefully selected tools, and optimizing for solutions under constraints. With a team of engineers building a bridge, there is a specialist in cement and a specialist in steel, a wind engineer and a structural engineer. “All the different specialties work together,” she says.
For Zelner, the notion of epidemiology as an engineering discipline is something he picked up from his father, a mechanical engineer who started his own company designing health-care facilities. Drawing on a childhood full of building and fixing things, his engineering mindset involves tinkering—refining a transmission model, for instance, in response to a moving target.
“Often these problems require iterative solutions, where you’re making changes in response to what does or doesn’t work,” he says. “You continue to update what you’re doing as more data comes in and you see the successes and failures of your approach. To me, that’s very different—and better suited to the complex, non-stationary problems that define public health—than the kind of static one-and-done image a lot of people have of academic science, where you have a big idea, test it, and your result is preserved in amber for all time.”
Zelner and collaborators at the university spent many months building a covid mapping website for Michigan, and he was involved in creating data dashboards—useful tools for public consumption. But in the process, he saw a growing mismatch between the formal tools and what was needed to inform practical decision-making in a rapidly evolving crisis. “We knew a pandemic would happen one day, but I certainly had not given any thought to what my role would be, or could be,” he says. “We spent several agonizing months just inventing the thing—trying to do this thing we’d never done before and realizing that we had no expertise in doing it.”
He envisions research results that come not only with exhortations that “People should do this!” but also with accessible software allowing others to tinker with the tools. But for the most part, he says, epidemiologists do research, not development: “We write software, and it’s usually pretty bad, but it gets the job done. And then we write the paper, and then it’s up to somebody else—some imagined other person—to make it useful in the broader context. And then that never happens. We’ve seen these failures in the context of the pandemic.”
He imagines the equivalent of a national weather forecasting center for infectious disease. “There’s a world in which all the covid numbers go to one central place,” he says. “Where there is a model that is able to coherently combine that information, generate predictions accompanied by pretty accurate depictions of the uncertainty, and say something intelligible and relatively actionable in a fairly tight time line.”
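One simple way such a center might coherently combine information, used by some forecasting consortia, is a quantile ensemble: each model reports forecast quantiles, and the ensemble takes the median of each quantile across models. A minimal sketch with hypothetical models and invented numbers:

```python
import numpy as np

# Three hypothetical models each report forecast quantiles for next
# week's case count (numbers are illustrative only)
quantile_levels = [0.05, 0.50, 0.95]
model_forecasts = np.array([
    [800, 1200, 1900],   # model A
    [700, 1000, 1600],   # model B
    [900, 1300, 2200],   # model C
])

# Hub-style ensemble: median of each quantile across models
ensemble = np.median(model_forecasts, axis=0)
for q, v in zip(quantile_levels, ensemble):
    print(f"quantile {q:.2f}: {v:.0f} cases")
```

The spread between the low and high quantiles is the point: the forecast ships with a depiction of its own uncertainty rather than a single number.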
At the beginning of the pandemic, that infrastructure didn’t exist. But recently, there have been signs of progress.
Fast-moving public health science
Marc Lipsitch, an infectious disease epidemiologist at Harvard, is the director of science at the US Centers for Disease Control and Prevention’s new Center for Forecasting and Outbreak Analytics, which aims to improve decision-making and enable a coordinated, coherent response to a pandemic as it unfolds.
“We’re not very good at forecasting for infectious diseases right now. In fact, we are quite bad at it,” Lipsitch says. But we were also quite bad at weather forecasting when the modern effort began in the 1950s, he notes. “And then technology improved, methodology improved, measurement improved, computation improved. With investment of time and scientific effort, we can get better at things.”
Getting better at forecasting is part of the center’s vision for innovation. Another goal is the capability to do specific studies to answer specific questions that arise during a pandemic, and then to produce custom-designed analytics software to inform timely responses on the national and local levels.
These efforts are in sync with the notion of an engineering approach—although Lipsitch would call it simply “fast-moving public health science.”
“Good science is humble and capable of refining itself in the face of uncertainty,” he says. “Scientists, usually over a longer time scale—years or decades—are quite used to the idea of updating our picture of truth.” But during a crisis, the updating needs to happen fast. “Outside of pandemics, scientists are not used to vastly changing our picture of the world each week or month,” he says. “But in this pandemic especially, with the speed of new developments and new information, we are having to do so.”
The philosophy of the new center, Lipsitch says, “is to improve decision-making under uncertainty, by reducing that uncertainty with better analyses and better data, but also by acknowledging what is not known, and communicating that and its consequences clearly.”
And he notes, “We’re gonna need a lot of engineers to make this function—and the engineering approach, for sure.”