Winter Issue of The Bridge on Frontiers of Engineering
December 25, 2021 Volume 51 Issue 4
The NAE’s Frontiers of Engineering symposium series forged ahead despite the challenges of the pandemic, with virtual and hybrid events in 2021. This issue features selected papers from early-career engineers reporting on new developments in a variety of areas.

Fighting the Pandemic with Mathematical Modeling: A Case Study at Cornell University

Tuesday, January 4, 2022

Author: Peter I. Frazier

Modeling showed that opening for in-person instruction would actually be safer than moving fully online.

Government response to the covid-19 pandemic has been chaotic worldwide. Past experience was no longer a reliable guide as decision makers were faced with unfamiliar trade-offs and hard-to-quantify risks. Government policies on masking, testing, distancing, and vaccination varied substantially, perhaps as much due to the cognitive biases of decision makers and their constituents as to differences in circumstances. Consequently, history may show that the human toll, in terms of both mortality and economic losses, was significantly larger than it needed to be. I argue that, collectively, we can do better and that engineers and scientists can help.

In a microcosm of the broader world, colleges and universities in the United States were challenged in the summer of 2020: Would it be safe to invite students back for the fall semester? If so, what interventions would ensure safety? Would students comply with distancing and masking restrictions? Would arriving students import the virus and seed an epidemic that would leap to faculty, staff, and the larger community?

In this article I describe how science and engineering answered these questions at Cornell University, sharing experience that may help others appreciate science’s ability to guide policy.

Modeling to Determine In-Person or Online Instruction

Opinions in Ithaca, where Cornell is located, were divided on the best path forward, as they were in many communities. Many residents felt that reopening the campus would be profoundly dangerous; others felt that not reopening would harm students’ educational outcomes and mental health as well as residents’ ability to earn a living. These perspectives reflected similar divisions in public opinion around the country, pitting those who feared the health consequences of infection against those who feared the economic and social costs of pandemic interventions (Ferragina and Zola 2021; Vezzoni et al. 2020).

Campuswide Interdisciplinary Involvement

In the face of this division, a group of engineers, scientists, public health professionals, healthcare providers, and university administrators came together at Cornell. Within this group, several other faculty members, students, and I formed the Cornell COVID-19 Mathematical Modeling Team (hereafter, the modeling team) to contribute mathematical modeling and data science.[1] Together, we hoped to use data and science to help the university chart a path forward.

We understood that simply asking students to wear masks and maintain social distancing might not prevent an outbreak (Yamey and Walensky 2020). Early analysis and work by other academics suggested that testing students and employees regularly and isolating positive individuals might catch asymptomatic spreaders and protect against viral spread (Frazier et al. 2020; Gollier and Gossner 2020).

Simultaneously, leaders in Cornell’s College of Veterinary Medicine realized that a substantial capacity to conduct polymerase chain reaction (PCR) tests (ordinarily used to test dairy cows and other animals for virus) could be used to test for SARS-CoV-2 in people. Building on past knowledge of pooled testing and current work exploring its potential for covid-19 (Cleary et al. 2020; Gollier and Gossner 2020), university leaders hypothesized that the local testing capability could be significantly expanded through pooled testing, the application of a single PCR test to samples from multiple individuals. If the pooled test came back negative, all samples would be deemed negative; otherwise, follow-up tests would be conducted on the individual samples.
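
To see why pooling stretches testing capacity, consider the expected number of tests per person under simple two-stage pooling: one test per pool, plus individual follow-up tests only when the pool is positive. The short sketch below illustrates this arithmetic; the prevalence and pool sizes are assumptions chosen for illustration, not the values used by the Cornell laboratory.

```python
# A minimal sketch of the arithmetic behind two-stage (Dorfman-style) pooled testing.
# The prevalence and pool sizes are illustrative assumptions, not the Cornell lab's values.

def expected_tests_per_person(prevalence: float, pool_size: int) -> float:
    """One PCR test per pool, plus pool_size individual follow-ups when the pool is positive."""
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

prevalence = 0.005  # assume 0.5% of samples are positive
print("individual testing: 1.000 tests per person")
for pool_size in (5, 10, 20):
    e = expected_tests_per_person(prevalence, pool_size)
    print(f"pool size {pool_size:2d}: {e:.3f} tests per person "
          f"(~{1 / e:.1f}x the throughput of individual testing)")
```

At low prevalence, larger pools are rarely positive, so most samples are cleared with a small fraction of a test each.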

Would the Reopening Plan Work?

There were many ways the plan could fail. Perhaps student parties would lead to overwhelming infection rates, tests would be too inaccurate, too many students would be unwilling to mask or isolate, testing equipment would break under the large test volumes, the number of students needing quarantine or isolation would exceed the rooms available to safely house them, or the virus would simply be too infectious. We also worried that personnel would not be able to keep up with large laboratory test volumes and surges in positive students needing isolation and contact tracing (extremely long workdays were already frequent in several units on campus and in the local health department, raising concerns about burnout and staff attrition); that testing frequency would need to be increased because of elevated social contact, but the lab would not be able to provide the added capacity; or that delays between sample collection and contacting positives and their contacts would prevent prompt isolation and quarantine. We were not sure how many tests we could reliably perform, or how many positives we could isolate and trace, each day, and we were depending on teams to hold up under heavy workloads.

As the modeling team contemplated this question in the summer of 2020, we found that the accuracy of modeling predictions was fundamentally limited by significant uncertainty about parameters. Nevertheless, modeling showed that, even under pessimistic assumptions about parameters, regular testing could protect the population against a large outbreak (figure 1).


FIGURE 1 Contour plot showing the predicted cumulative number of infections in the Cornell University population vs. two key parameters: the fraction of the population tested per day (y-axis) and the number of contacts students would have per day once instruction started (x-axis). Contours (colored lines) connect parameter combinations with the same number of predicted infections. Predicted infections rise as the daily testing fraction falls or contacts per day rise. We estimated that students would have roughly 8 contacts per day (black star) and that we would see roughly 1000 cases when testing 2x/week. If students were more social than estimated, increased testing frequency (to 3x/week or more) was predicted to keep cases low: the 3x/week testing level stays above the 1000-infection contour up to 13 contacts per day, indicating that predicted infections remain below 1000 with 3x/week testing as long as students have fewer than 13 contacts per day.
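
The kind of two-parameter sweep summarized in figure 1 can be sketched with a very simple model. The code below is a stylized illustration only: a discrete-time SIR-style recursion in which screening removes infectious people at a rate equal to the daily testing fraction. Every parameter value (transmission probability, infectious period, population size) is an assumption chosen for illustration, not one of the modeling team's calibrated inputs.

```python
# A stylized sketch of sweeping the two axes in figure 1: daily testing fraction vs.
# student contacts per day. All parameter values are illustrative assumptions,
# not the modeling team's calibrated inputs.

def cumulative_infections(test_fraction, contacts_per_day, n=30_000,
                          p_transmit=0.05, infectious_days=5,
                          initial_infected=20, days=100):
    """Discrete-time SIR-style recursion in which screening removes infectious
    people at a rate equal to the fraction of the population tested per day."""
    removal_rate = 1 / infectious_days + test_fraction  # recovery + detection/isolation
    s, i, total = n - initial_infected, initial_infected, initial_infected
    for _ in range(days):
        new_infections = min(p_transmit * contacts_per_day * i * s / n, s)
        s -= new_infections
        i += new_infections - removal_rate * i
        total += new_infections
    return total

# Sweep testing frequency (rows) against contacts per day (columns).
for label, test_fraction in (("1x/2wk", 1 / 14), ("2x/wk", 2 / 7), ("3x/wk", 3 / 7)):
    row = ", ".join(f"{c} contacts: {cumulative_infections(test_fraction, c):7.0f}"
                    for c in (4, 8, 13))
    print(f"{label:>6}: {row}")
```

Even this toy version reproduces the qualitative message of the contour plot: more frequent testing keeps cumulative infections low across a wider range of contact rates.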

Moreover, surveys indicated that, even if instruction were fully online, several thousand students were likely to return to the area. Regular covid testing, social distancing, and masking would have been difficult to mandate for these students (Modeling Team 2020a). Analysis showed that the students would thus be at significant risk of an outbreak and that opening for in-person instruction would actually be safer than moving fully online (figure 2; Modeling Team 2020a, 2020b).


FIGURE 2 For each of 200 likely parameter sets, the number of simulated infections in the Cornell University population under a return to in-person instruction (x-axis) and fully virtual instruction (y-axis). Black dots (blue squares) show parameter sets where virtual instruction results in more (fewer) infections. The proliferation of black dots shows that there are many parameter sets where infections are much higher under virtual instruction.
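
A toy version of this scenario comparison is sketched below, reusing the cumulative_infections() function from the sketch after figure 1. The parameter ranges and the scenario assumptions (mandated twice-weekly testing under residential instruction, no mandated testing for students returning under virtual instruction) are illustrative stand-ins, not the team's calibrated priors.

```python
# A toy comparison in the spirit of figure 2, reusing cumulative_infections() from the
# sketch after figure 1. Parameter ranges and scenario assumptions are for illustration only.
import random

random.seed(0)
n_draws, worse_under_virtual = 200, 0
for _ in range(n_draws):
    p_transmit = random.uniform(0.02, 0.08)       # uncertain per-contact transmission probability
    contacts_residential = random.uniform(6, 12)  # contacts/day under the behavioral compact
    contacts_virtual = random.uniform(4, 10)      # contacts/day for returning students, untested
    residential = cumulative_infections(2 / 7, contacts_residential, p_transmit=p_transmit)
    virtual = cumulative_infections(0.0, contacts_virtual, p_transmit=p_transmit)
    if virtual > residential:
        worse_under_virtual += 1

print(f"virtual instruction yields more infections in {worse_under_virtual} of {n_draws} draws")
```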

The Decision to Reopen

Supported by this analysis, the university decided to reopen for residential instruction with a behavioral compact for returning students, as articulated by the university’s president and provost in an editorial in the Wall Street Journal (Kotlikoff and Pollack 2020). Modeling and data supported additional decisions, such as the testing frequencies for student and employee groups and the amount of quarantine and isolation housing to reserve (Modeling Team 2020c).

Reopening was successful. Fewer than 300 cases in a population of 30,000 were reported over the fall 2020 semester, significantly fewer than at many comparably sized universities that remained open, and similar to or fewer than at several comparably sized universities that moved fully online.[2] Illustrating the danger of fully online instruction for a college town, Michigan State University in East Lansing experienced an outbreak in fall 2020 among undergraduate students living locally: although their classes were fully online, 640 cases were reported in a single week (Nadworny 2020; Stanley 2020).[3]

Cornell’s success is broadly consistent with the success of regular asymptomatic screening, also deployed at a handful of other universities in the fall 2020 semester (Candanosa 2020; Denny et al. 2020), as well as concurrent academic research (Chang et al. 2021; Paltiel et al. 2020). The favorable outcome led to the broader adoption of this strategy through the spring and fall 2021 semesters at other universities.

Four Elements of Cornell’s Successful Mathematical Modeling Effort

Four key aspects of our mathematical modeling effort were critical for success: stakeholder engagement, respect for uncertainty, interpretable models, and observable systems. I share these with the hope that they may be useful for others in similar modeling efforts.

Stakeholder Engagement and Communication

First, we prioritized engagement and communication with the university administration and the public. Engagement with those in the administration who were ultimately responsible for the reopening decision started with our listening to their questions and concerns, with an ear toward understanding their approach to making trade-offs and their operational constraints. This understanding informed our analysis. We also prioritized responsiveness, providing analysis quickly in response to questions.

Engagement with the public included releasing the details of our analysis (Modeling Team 2020a, 2020b, 2020c, 2021), town hall and faculty senate meetings, interviews with media, a website where we solicited and responded to questions from the community (Modeling Team 2020e), and reports to our representative in the state legislature (Modeling Team 2020d).

We heard most prominently from stakeholders who were afraid or angry; they were the most motivated to be heard. Patience served us well: when fearful stakeholders lashed out, sometimes with personal accusations, we stayed calm, acknowledged their concerns, and rearticulated the scientific basis for our analysis. We were buoyed by the insight that such attacks arise naturally from fear and are not accurate reflections on the person attacked.

While some people remained skeptical, transparency helped many understand that the administration’s decisions came from good intentions and a basis in evidence. This eased fears while moving conversations toward civility.

Respect for Uncertainty

Our analysis acknowledged that uncertainty was substantial and identified decisions that were robust to this uncertainty. Whether an infection creates a large cluster depends critically on input parameters, several of which were unknown at the time when a reopening decision needed to be made, even after we “mined” the most up-to-date data and literature. Thus, we prioritized decisions that would work well across a broad range of the most likely parameter values.

Later analysis also leveraged Bayesian methods, quantifying the effect of uncertainty by sampling from prior probability distributions over uncertain parameters.
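
In the same spirit, the sketch below (again reusing the cumulative_infections() toy model, with invented priors rather than the team's) draws uncertain parameters from prior distributions, pushes each draw through the model, and summarizes the resulting distribution of outcomes.

```python
# A minimal illustration of propagating parameter uncertainty: sample parameters from
# prior distributions, run the model for each draw, and summarize the outcomes.
# The priors are invented for illustration; reuses cumulative_infections() from the earlier sketch.
import numpy as np

rng = np.random.default_rng(1)
outcomes = []
for _ in range(1000):
    p_transmit = rng.beta(2, 40)     # prior on per-contact transmission probability (mean ~0.05)
    contacts = rng.gamma(4.0, 2.0)   # prior on student contacts per day (mean ~8)
    outcomes.append(cumulative_infections(2 / 7, contacts, p_transmit=p_transmit))

outcomes = np.array(outcomes)
print(f"median: {np.median(outcomes):.0f} infections, "
      f"90% interval: [{np.percentile(outcomes, 5):.0f}, {np.percentile(outcomes, 95):.0f}]")
```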

Interpretable Models

We strove for interpretability. Alongside a detailed compartmental simulation implemented in Python, we used simpler mathematical models like the classical susceptible-infected-recovered model (Allen 2008). We implemented them in spreadsheets and even on calculators and napkins. They were fast enough to run during meetings with stakeholders, built intuition and confidence in our results, and helped us understand which aspects of reality were most important to build into our more complex simulation models.
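
For reference, a discrete-time version of that classic model is small enough to fit in a spreadsheet or a few lines of code; the sketch below uses illustrative parameters, not Cornell's.

```python
# A back-of-the-envelope discrete-time SIR recursion; parameters are illustrative,
# not Cornell's. Each entry of `history` is one day's (susceptible, infected, recovered).

def sir(beta=0.4, gamma=0.2, n=30_000, initial_infected=20, days=120):
    s, i, r = n - initial_infected, initial_infected, 0
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = min(beta * s * i / n, s)  # contacts per day x transmission probability
        new_recoveries = gamma * i                 # 1 / infectious period
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = sir()
peak = max(i for _, i, _ in history)
print(f"peak simultaneous infections: {peak:.0f}; total infected: {30_000 - history[-1][0]:.0f}")
```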

Observable Systems

We collected data from asymptomatic screening and other sources in a HIPAA-compliant database. This enabled fast analysis of the evolving situation and supported decisions that significantly enhanced safety: testing higher-transmission student groups more often (Modeling Team 2021), planning to ensure adequate staffing for contact tracing and housing capacity for quarantine and isolation, and interventions to improve test compliance.

Conclusion

I hope that this experience inspires others. Engineers and scientists are uniquely capable of bringing science to bear on complex policy challenges.

The benefits we observed from pursuing close stakeholder engagement, respecting uncertainty, using interpretable models, and building observable systems may be useful to other engineers who tackle policy questions. I encourage others contemplating such an effort: while advocating for a meaningful policy change is challenging, the world needs your help. You are up to this challenge.

Acknowledgments

I am deeply grateful to the Cornell COVID-19 Modeling Team and to everyone at Cornell University who fought the pandemic.

References

Allen LJS. 2008. An introduction to stochastic epidemic models. In: Mathematical Epidemiology, eds Brauer F, van den Driessche P, Wu J. Berlin: Springer.

Candanosa RM. 2020. Here’s why Northeastern is testing everyone on the Boston campus for the Coronavirus. News@Northeastern, Aug 19.

Chang JT, Crawford FW, Kaplan EH. 2021. Repeat SARS-CoV-2 testing models for residential college populations. Health Care Management Science 24:305–18.

Cleary B, Hay JA, Blumenstiel B, Harden M, Cipicchio M, Bezney J, Simonton B, Hong D, Senghore M, Sesay AK, and 3 others. 2020. Using viral load and epidemic dynamics to optimize pooled testing in resource-constrained settings. medRxiv PMC 7273255.

Denny TN, Andrews L, Bonsignori M, Cavanaugh K, Datto MB, Deckard A, DeMarco CT, DeNaeyer N, Epling CA, Gurley T, and 13 others. 2020. Implementation of a pooled surveillance testing program for asymptomatic SARS-CoV-2 infections on a college campus—Duke University, Durham, North Carolina. Morbidity and Mortality Weekly Report 69(46):1743–47.

Ferragina E, Zola A. 2021. The end of austerity as common sense?: An experimental analysis of public opinion shifts and class dynamics during the Covid-19 crisis. New Political Economy, https://doi.org/10.1080/13563467.2021.1952560.

Frazier P, Zhang Y, Cashore M. 2020. Feasibility of COVID-19 Screening for the US Population with Group Testing. Available at https://docs.google.com/document/d/1hw5K5V7XOug_r6CQ0UYt25szQxXFPmZmFhK15ZpH5U0/edit.

Gollier C, Gossner O. 2020. Group testing against COVID-19. EconPol Policy Brief 24.

Kotlikoff MI, Pollack ME. 2020. Why Cornell will reopen in the fall. Wall Street Journal, Jun 30.

Modeling Team [Cornell COVID-19 Mathematical Modeling Team]. 2020a. Addendum: COVID-19 mathematical modeling for Cornell’s fall semester. Ithaca NY. Available at https://covid.cornell.edu/_assets/files/covid_19_modeling_addendum.pdf.

Modeling Team. 2020b. COVID-19 mathematical modeling for Cornell’s fall semester. Ithaca NY. Available at https://covid.cornell.edu/_assets/files/covid_19_modeling_main_report.pdf.

Modeling Team. 2020c. Gateway testing and quarantine capacity. Ithaca NY. Available at https://covid.cornell.edu/_assets/files/modeling_update_for_gateway_testing.pdf.

Modeling Team. 2020d. Update for Assemblywoman Lifton. Ithaca NY. Available at https://covid.cornell.edu/_assets/files/update_for_assemblywoman_lifton.pdf.

Modeling Team. 2020e. Updates from the modeling team. Ithaca NY. Available at https://theuniversityfaculty.cornell.edu/faculty-senate/archives-and-actions/archived-agenda-and-minutes/online-senate-meeting-august-5/frazier-modeling/.

Modeling Team. 2021. Mathematical modeling for Cornell’s spring semester. Ithaca NY. Available at https://covid.cornell.edu/_assets/files/general-audience-spring-modeling-20210216.pdf.

Nadworny E. 2020. 2 Michigan colleges face coronavirus outbreaks in the 1st week of school. NPR, Sep 15.

Paltiel AD, Zheng A, Walensky RP. 2020. Assessment of SARS-CoV-2 screening strategies to permit the safe reopening of college campuses in the United States. JAMA Network Open 3(7):e2016818.

Stanley SL Jr. 2020. Aug. 18: Fall semester plans change. East Lansing: Michigan State University. Available at https://president.msu.edu/communications/messages-statements/2020_community_letters/2020-08-18-plans-change.html.

Vezzoni C, Ladini R, Molteni F, Sani GMD, Biolcati F, Chiesi AM, Guglielmi S, Maraffi M, Pedrazzani A, Segatti P. 2020. Investigating the social, economic and political consequences of the COVID-19 pandemic: A rolling cross-section approach. Survey Research Methods 14(2):187–94.

Yamey G, Walensky RP. 2020. Covid-19: Re-opening universities is high risk. BMJ 370.


[1]  The modeling team was just one of many Cornell groups responding to the pandemic at the university. Others included a newly formed Committee for Teaching Reactivation Options, the Animal Health Diagnostic Center (from which the Cornell COVID-19 Testing Laboratory was created), Cornell Health, Student and Campus Life, Human Resources, Cornell Information Technologies, the Office of General Counsel, University Relations, Institutional Research and Planning, the Cornell Center for Data Science, and the Master of Public Health program, as well as several other groups and many other individuals. Cornell also forged strong partnerships with the local hospital system, Cayuga Health System, and the Tompkins County Health Department.

[2]  College Crisis Initiative, COVID-19 Data Dashboard (https://collegecrisis.shinyapps.io/dashboard/); and New York Times Survey of US Colleges and Universities, GitHub repository (https://github.com/nytimes/covid-19-data/tree/master/colleges).

[3]  Michigan State University Testing and Reporting (https://msu.edu/together-we-will/testing-reporting/, accessed Oct 2020).

About the Author: Peter Frazier is the Eleanor and Howard Morgan Professor in the School of Operations Research and Information Engineering at Cornell University.