A new analysis by a nonprofit group finds troubling gaps and inconsistencies in the information states are sharing with the public, and offers recommendations for how that reporting can be improved.
Most state governments are failing to report data on the coronavirus pandemic that is crucial for guiding response efforts and keeping the public informed about the risks from the disease, according to a new analysis by a public health group.
Resolve to Save Lives, a nonprofit initiative led by a former top official at the U.S. Centers for Disease Control and Prevention, examined coronavirus data dashboards in each of the 50 states, the District of Columbia and Puerto Rico to see how well they conform to a set of best practices the group outlines in its report.
These practices include things like prioritizing key metrics, clear organization of information and reporting rates rather than counts.
The report acknowledges that the data dashboards were set up during a historic public health emergency, but says that with the nation now months into the coronavirus crisis, the data that states are sharing remains inconsistent, incomplete and in many cases inaccessible. One in five of the data portals also suffered from time lags, with states not posting same-day data by 5 p.m.
“The lack of common standards, definitions, and accountability reflects the absence of national strategy, plan, leadership, communication, or organization and results in a cacophony of confusing data,” said Tom Frieden, president and CEO of Resolve to Save Lives, who directed the CDC from 2009 to 2017 under President Obama.
The report identifies 15 “essential indicators” that states should track, nine of which it says should be reported immediately and six as soon as possible if data isn’t currently available.
Cyrus Shahpar, the director of Prevent Epidemics at Resolve to Save Lives, said that the “media and others are paying entirely too much attention to less important data” and that consistent reporting of the essential indicators could help shift the focus onto more important metrics as the nation battles the virus. “It is about knowing our risk and improving our response,” Shahpar said.
Some examples of the indicators that the group recommends tracking and reporting include: new confirmed and probable cases, with per capita rates and by date; screening and testing rates by date; the percentage of screening and diagnostic tests that are positive by date; and daily hospitalization rates. The report calls for including seven-day averages and per capita rates for measures where appropriate.
Across the 52 dashboards that the analysis looked at, the 15 indicators add up to 780 pieces of information, which the report recommends breaking down further by variables like age, gender and race. Overall, the report says states reported just 2% of the essential indicators exactly as recommended, while 38% were reported in some way but had limitations or lacked sufficient detail, and 60% were not reported at all.
“These indicators provide public health and medical practitioners, policy makers and community leaders evidence-based targets to aid decision making,” Georges Benjamin, executive director of the American Public Health Association, said in a statement.
The association, along with the Johns Hopkins Center for Health Security, Trust for America’s Health, the Association of Schools and Programs of Public Health and the Big Cities Health Coalition, has reviewed and endorsed the list of essential indicators in the report.
Much of the missing data, the analysis found, is related to testing or to contact tracing, the process that public health authorities use to identify and get in touch with people who have been in close contact with someone known to have the highly contagious virus.
More than 90% of states report some information on trends with the daily number of tests for active coronavirus infections, the analysis found. But how they report this information varies.
For instance, some states don’t make clear whether they’re reporting the number of tests conducted, or the number of people tested. They also sometimes don’t distinguish if they’re conducting tests for active infections or for antibodies that indicate past exposure to the virus.
Data about how long it takes to get test results back is also lacking, the report says. Testing turnaround time matters, because the more time that passes after someone is tested before they get the results, the greater the odds that a person won’t go into isolation if they are in fact infected. This raises the chances that they’ll infect other people they come in contact with.
In recent weeks, as coronavirus cases have surged in some states and authorities urged residents to get tested, people across the country have complained about long wait times to get back results, sometimes longer than a week or two.
The contact tracing data is important because it sheds light on the effectiveness of public health efforts to control the spread of the disease. The report explains that there are common indicators for assessing contact tracing programs, used in countries ranging from Uganda to Singapore. Here in the U.S., the report describes reporting on contact tracing as “abysmal.”
It’s not only data on testing and contact tracing where states are falling short, though, according to the research. Even reporting on basic information, like new cases by date, was “surprisingly inconsistent” across the state dashboards, the analysis found. Data on infections and deaths in congregate living facilities, like nursing homes and prisons, are hampered by inconsistencies as well.
More broadly, the report says that many of the data dashboards were complex to navigate and that even experienced health researchers had trouble finding key information.
A full copy of the Resolve to Save Lives report can be found here.
Bill Lucia is a senior reporter for Route Fifty and is based in Olympia, Washington.