After months of issuing reports modeling the future behavior of COVID-19, Gov. Kate Brown and Oregon health officials last week issued a tacit admission that the pandemic had blindsided them.
On Friday, Nov. 13, Brown announced a two-week freeze on public activity throughout Oregon, citing an "alarming spike" in COVID-19 cases and a modeling report bearing dire projections.
It was a huge change. Just two weeks before, the state's modeling showed Oregon as having the coronavirus at a standstill — neither gaining nor losing ground. The new projection was far worse than the previous worst-case scenario, and Brown went from confidently describing plans to reopen schools on Oct. 30 to stressing on Friday that "it's a very dangerous situation."
Throughout the pandemic, Brown's announcements about the virus often have coincided with the projections' twice-monthly release. At times she's cited the report to portray her decisions as backed by scientific certitude.
Emails released under Oregon's public records law, however, show the job of running models for the Oregon Health Authority has been like piloting a high-speed luge through an obstacle course — often colliding with technical difficulties, unexpected results, and numbers that don't quite add up.
Oregon's projections of expected viral trends have swung dramatically at times, but never more starkly than last week. The latest report showed the spread of disease — the reproduction rate of the coronavirus in Oregon — jumping nearly 50% in just two weeks' time.
For the first time since the start of the pandemic, the state used different modeling software for last week's projections. Publicly framed as a matter of efficiency, the change was driven by months of stress and long hours aggravated by the challenges of the state's previous modeling software, emails obtained by the Portland Tribune show.
The newly adopted software used last week painted a significantly more alarming picture of the disease's spread in Oregon than other models estimated. But officials say the newly deployed software is more sensitive, and its findings are backed by what doctors are seeing. Experts for months have been predicting a seasonal surge similar to the cold and flu, and in the last two weeks, hospital intensive care units in the Portland area have become clogged with severe COVID-19 cases — a trend likely to be followed by an increase in deaths.
State health officer Dr. Dean Sidelinger told the Tribune that he and other officials are confident in the modeling, but are aware of its limitations, which are detailed in public records.
"Models are only as good as the assumptions and the data that goes into them," Sidelinger said. "The modeling is definitely science based, but then there's an art to how do we interpret it. … We use it for planning purposes, and I think for that it's definitely helping us."
In the beginning: An information gap
On March 11, Brown wanted information, and her top epidemiologist, Sidelinger, was under pressure to provide it.
Days before, Washington Gov. Jay Inslee had estimated that 1,000 people were infected with the coronavirus in his state. For a week, Oregon reporters had asked for an equivalent number in this state.
Sidelinger "urgently" wanted an Oregon estimate before he met with the governor that afternoon, wrote modeling manager Julie Maher in an email to the source of Inslee's numbers, the Bellevue-based Institute for Disease Modeling, or IDM. Billionaire philanthropists Bill and Melinda Gates fund the nonprofit.
IDM couldn't meet that deadline, its top coronavirus researcher responded, but it would take the job. Mike Famulare stressed the urgency for Oregon to lock down.
"The only problem that is doubling in size every week is COVID," he wrote, adding that it will "exceed the burdens of any other social impacts if left unchecked."
On March 12, Brown declared a number of restrictions, including a ban on large gatherings. Four days later, she closed K-12 schools statewide. One week after that, she announced a new batch of restrictions titled "Stay home, save lives."
A partnership was born: IDM agreed to work unpaid for Oregon. The state would plug in numbers of confirmed infections, deaths and COVID-19 hospitalizations each day. IDM's epidemiological modeling software would spit out scenarios of what the disease likely was doing.
The projections had a public-health purpose beyond just providing information: to prompt people to change their behavior for the common good.
"Epidemiology is a science of possibilities and persuasion, not of certainties or hard proof," journalist Charles Duhigg wrote in April in The New Yorker magazine.
On March 17, IDM produced its first preliminary report for Oregon. But state officials wanted them weekly, and Maher and research analyst Erik Everson soon took over the job. Outside Washington, no other state worked as closely with the nonprofit.
Modeling: Educated guesswork, assumptions
Maher and Everson used IDM software called Covasim, plugging in data, then essentially tweaking assumptions so that the trend lines fit the data — a process called calibration. They then extended those lines into the future and offered different possible trends — while stressing that they were offering projections, not predictions.
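The calibration process the analysts used can be sketched in miniature. The toy model below is purely illustrative (the numbers and the single growth-rate parameter are assumptions, not Oregon's data); Covasim itself tunes many more parameters over detailed contact networks, but the basic loop is the same: adjust an assumption until the model's curve fits the observed counts, then extend the curve forward.

```python
# Illustrative sketch of calibration: grid-search a growth-rate assumption
# so a toy exponential model fits observed daily case counts, then project
# the fitted curve into the future. Not Oregon's actual pipeline.

def project(initial, rate, days):
    """Project case counts forward under a constant daily growth rate."""
    return [initial * (1 + rate) ** d for d in range(days)]

def calibrate(observed, rates):
    """Pick the growth rate that minimizes squared error against the data."""
    best_rate, best_err = None, float("inf")
    for r in rates:
        fitted = project(observed[0], r, len(observed))
        err = sum((f - o) ** 2 for f, o in zip(fitted, observed))
        if err < best_err:
            best_rate, best_err = r, err
    return best_rate

observed = [100, 112, 125, 140, 157, 176]           # hypothetical daily cases
rate = calibrate(observed, [i / 1000 for i in range(0, 301)])
future = project(observed[-1], rate, 8)             # extend the curve a week out
```

The fragility the emails describe follows from this structure: when the software changes or new data arrives, the previously calibrated assumptions may no longer fit, and the whole tuning exercise starts over.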
Each release of the reports has been eagerly awaited, with members of the public, reporters, lawmakers and local and state health officials poring through them for clues to the future.
But with a new and evolving virus and unproven tests, the models have relied on data that sometimes is limited or of poor quality, as well as assumptions that have changed, sometimes dramatically, since March.
Records show that producing the report entailed long hours gathering data, reading the latest research, and battling discrepancies and scenarios that changed dramatically from week to week. The two analysts sometimes struggled to fit the software model to real-life data about outbreaks, hospitalizations and COVID-19 test results.
Some of the issues:
• On March 31, Maher asked IDM why numbers in IDM's modeling calibrations didn't match the total infections being reported by the model.
• On April 6, Sidelinger said the modeling software "severely undercounts" the number of hospitalizations in Oregon.
• On May 1, Everson and Maher noted that updated software was producing "volatility" including "very different calibration results" based on the same data.
• On May 6, Maher and Everson wrote to IDM, warning that projections showed diagnoses of infections catching up to the projected actual number of infections, an impossibility given that many cases of the coronavirus go undiagnosed. They noted a "big downward shift" in projected numbers "between last week and this week's scenarios (which) seemed rather surprising."
• On July 7, Maher wrote "It looks like we (were) overpredicting hospitalizations for a while (mid-April to mid-June)."
• On Aug. 5, Maher informed Sidelinger of some erroneous percentages in a report published the week before, asking if the report posted online could be replaced.
• On Sept. 14, Everson asked IDM for help, noting a "considerable discrepancy" in the software's daily tally of cumulative infections.
Expecting precision was misguided, IDM senior research scientist Assaf Oron responded to one of Maher's questions. The modeling "forecasts account for many sources of uncertainty ... but as a simplified model for a complex reality they cannot account for all such sources."
This, he wrote, is why IDM claims a confidence level of only 80% — meaning a wide range of possible outcomes. Since future behavior influences results, he added, it didn't "seem right to pretend to know 'the truth with 95% confidence.'"
As for the bugs in the software, IDM frequently updated it, forcing the state's analysts to adapt. Emails show they frequently nipped and tucked assumptions to make the curves fit the latest data — routinely changing them as new reports were published.
Daniel Klein, a senior research manager for IDM, echoed Maher and Everson in telling the Portland Tribune the emails reflect the routine work of modeling and adapting to new software upgrades.
"This is totally normal stuff that happens all the time," he said. "A lot of this back and forth is to get the model to reflect what's happening in the data."
Staff struggles to keep up
The stress in dealing with coronavirus modeling took its toll.
In a May 13 email to Sidelinger, Maher asked for a week off from modeling work, noting the challenge of the modeling's "bugs," upgrades and making the calibration curves fit the data. The emails are full of similar comments. The software was designed for sophisticated, detailed analysis, making it challenging to use, even though Oregon relied on a simplified version for less ambitious purposes.
"The ongoing updates to fix bugs & improve the code has made calibration time consuming & comparing from week to week difficult," Maher wrote. "We could really use some time to regroup, better understand the increased complexity of the Covasim code.… I also could benefit from a couple of days off to rest my brain."
The state soon switched to twice-monthly projection reports rather than weekly reports.
In a July 14 email, Everson called the work "exhausting."
Sidelinger told the Tribune the work is emotionally draining because of the stakes involved and the impact on people's lives. As of Nov. 17, the state has reported 778 deaths associated with COVID-19, many survivors report debilitating effects, and unemployment has skyrocketed.
One of the "major limitations" of the state's modeling is its reliance on week-old data, used to ensure the numbers are complete, Sidelinger said. That means the reports can be out of date as soon as they are published. Increased transmission can rapidly balloon into exponential growth after the data cutoff, or not, as outbreaks are controlled and behavior changes.
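The arithmetic behind that limitation is simple to illustrate. The figures below are assumed for the sake of the example: if cases are doubling weekly, a count that is a week old already understates the present by a factor of two by the time a report appears.

```python
# Illustrative arithmetic (assumed numbers): under exponential growth, a
# reporting lag means the last "complete" data point understates today's
# cases by a predictable factor.

def cases_after_lag(last_known, doubling_days, lag_days):
    """Scale the last complete count forward under a fixed doubling time."""
    return last_known * 2 ** (lag_days / doubling_days)

# If cases double every 7 days, a 7-day-old count of 500 implies
# roughly 1,000 cases by publication day.
current_estimate = cases_after_lag(500, doubling_days=7, lag_days=7)
```

The "or not" in Sidelinger's caveat is the catch: the doubling time itself changes as outbreaks are contained and behavior shifts, which is exactly what the fixed-assumption arithmetic cannot anticipate.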
Of course, Brown hasn't always heeded the modelers. Her flurry of moves in May to reopen the state came immediately following a projections report noting the major risks of doing so. Documents later showed Sidelinger and his staff wanted reopening delayed until the virus was under better control.
The modeling has helped Maher and Everson flag significant developments using real-life data, guiding and improving health officials' response.
"We're seeing changes in how the virus is spreading among different age groups, communities and localities in Oregon," Everson wrote in a June 30 email to IDM, asking to discuss potential modeling improvements.
They also learned about the limitations of the modeling, even as medical professionals have complained that coronavirus tests often produce false negatives.
In an email to IDM, the Oregon analysts noted research finding a sensitivity, or accuracy, rate for detecting the virus of only 70% to 80%.
IDM, however, responded that its model assumed 100% sensitivity.
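Why that assumption matters can be shown with back-of-envelope arithmetic (this is an illustration with assumed figures, not IDM's correction method): if a test catches only 70% to 80% of true infections, dividing reported positives by the sensitivity gives a rough sense of how many infections the raw count misses.

```python
# Back-of-envelope sketch (assumed numbers, not IDM's method): adjusting a
# reported positive count for imperfect test sensitivity.

def correct_for_sensitivity(positives, sensitivity):
    """Estimate true infections given a test's sensitivity (true-positive rate)."""
    if not 0 < sensitivity <= 1:
        raise ValueError("sensitivity must be in (0, 1]")
    return positives / sensitivity

reported = 1000
low = correct_for_sensitivity(reported, 0.8)   # ~1,250 at 80% sensitivity
high = correct_for_sensitivity(reported, 0.7)  # ~1,429 at 70% sensitivity

# A model assuming 100% sensitivity would treat all 1,000 positives as the
# full count, understating the plausible range by hundreds of infections.
```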
One IDM supervisor in August conceded that the "rush" to provide modeling for use in the pandemic had led to "temporarily abandoning" more scientific approaches used in the past. "We do our best to make reasonable assumptions ... but it remains a work in progress."
After repeatedly exploring whether the state could stop issuing its own reports using Covasim and instead rely on other sources, emails show, Maher in August began asking about software called Rainier. The much simpler program, also produced by IDM, estimates the rate of disease spread.
On Sept. 17, three days after Everson sent an email notifying IDM that the numbers again were not adding up in Covasim, the nonprofit sent an email giving the state access to Rainier.
In the future the state will issue projections every three weeks, and they won't be nearly as specific, allowing "for a more timely turnaround," said the state's latest report.
The software is already used in Washington, but how it performs in Oregon remains to be seen. Sidelinger said that considering the surge of cases in the last week, "that modeling report is probably on the optimistic side of things."