Many things about this college semester have been new. For one, in a typical year students would not regularly check a webpage to track the spread of disease on their campuses.
But while COVID-19 dashboards are now ubiquitous, they are not created equal.
That’s why Howard Forman and Cary Gross, both professors at the Yale School of Medicine, started We Rate COVID Dashboards, a project to evaluate how well higher ed is communicating about spread on campus. The project, which is both a website and a Twitter account, assigns a letter grade to dashboards based on a rubric of nine criteria. Are the cases separated into students and staff? Is it updated every weekday?
“Around the summer I wanted to start tracking colleges because I knew they were opening and found enormous variation in college reporting,” said Forman. “Importantly my own university had a terrible dashboard when I first noticed it.” (Yale later improved and earned itself an A.)
Having a COVID dashboard allows universities to communicate with their students, employees, parents and the public. But if a dashboard provides incomplete information, it can be difficult for those stakeholders to make decisions, and difficult for the public to compare across institutions.
We Rate COVID Dashboards grades colleges and universities on how easy a dashboard is to read, how often it is updated, what data are reported and other criteria.
So far, the dashboards have been all over the map. While some have excelled and some have failed (often for having no dashboard at all), the distribution of the ratings slightly resembles a normal curve, with a plurality of institutions achieving a B.
Forman and Ayotomiwa Ojo, a dual M.D./M.P.P. student at Harvard University and the project’s chief of operations, say they haven’t found any trends so far in which institutions have good dashboards and which have poor ones. Notably, colleges and universities that are well resourced have not excelled in the rankings as much as might be thought.
“I can’t even say the Ivy League schools, which we would think are more well resourced, that they’re definitely doing a lot better than everyone else,” said Ojo.
There are so far five institutions that have achieved an A-plus or above. Wagner College, Tulane University, George Mason University and Ohio State University all achieved an A-plus. Amherst College is the only institution so far to have achieved an A-plus-plus.
All 21 of the colleges and universities that were given failing grades by the project, including Spelman College, Georgetown University and University of California, Santa Barbara, earned them by having no dashboard at all.
But many of those institutions, including Spelman and Georgetown, did not open their campuses to students this semester. (Some of the ratings also appear to be out-of-date. Georgetown, for example, does have a COVID dashboard.)
Forman and Ojo say it’s important that institutions that do not have students on campus still have dashboards to provide transparency to faculty, staff and anyone else who needs to enter campus. Dashboards can also indicate the level of testing that is available for those people. Colleges that want to open in the spring or next fall would be wise to build their dashboard infrastructure now, they said.
But, Forman and Ojo said, over all, colleges and universities are improving.
“We’re very impressed with the fact that many universities have taken this seriously. We’ve noticed many universities going from having no dashboard to quickly developing a dashboard, even going so far as to improve it over time,” Forman said. “What we wanted to see is that universities are moving in the right direction, and they absolutely are.”
“That growth is why we’re doing this. Because we want to help schools think about or create a framework for the information they want to display,” Ojo said.
Some colleges have asked to be rerated by the project after improving their dashboards.
Officials at Amherst College, the only institution to receive an A-plus-plus, said seeing their rating from the project helped them make improvements.
“Early on we knew from the beginning of the semester it would be important to be very transparent both with our own internal Amherst College community but also with the local community and our neighbors,” said Matt Hart, director of emergency management at Amherst. “When we learned that we were given an A, we discovered that there were only a couple of minor tweaks that we would have to make to get to that A-plus-plus, and we figured it couldn’t hurt, particularly if it could encourage other institutions to follow suit.”
As the project has grown, so has the team behind it. Hannah Todd, a medical student at Baylor College of Medicine; Sarah Pitalfi, an undergraduate at Yale; and Eric Newberry, a data analyst at Ohio State, were all brought on to help in the effort.
Feedback for the project has mostly been positive, but Forman said that sometimes people expect the ratings to be something they are not.
“Some of the most aggressive comments we saw were when we gave a high rating to a dashboard and people on that campus or maybe parents of kids were like, ‘This may seem good, but it’s horrible there,’” he said. “I think people are misunderstanding -- we are not rating how good the program is. We are not rating whether they are doing good public health practices or whether their screening or testing program is better or worse than others. We are only rating them on transparency, on how much data they are sharing with the public.”
“If colleges say they are isolating and quarantining people, we can’t ascertain whether that’s really happening,” Ojo said.
Over all, the project team said, they just want to see colleges exercise transparency.
“I think that the public needs to hold institutions accountable, whether they are public or private institutions,” said Forman.
“We are doing this with the hopes that everyone will get an A-plus-plus,” said Ojo. “That is our No. 1 goal.”