The pace of scientific publishing has accelerated dramatically in response to the COVID-19 pandemic. Journals have sped up the time from submission to publication, and scientists have uploaded thousands of papers to open-access preprint servers without first going through the normal peer-review process. As the volume and speed of scientific publishing have increased, it’s perhaps inevitable that mistakes will slip through -- mistakes that can have serious consequences in the context of a highly politicized pandemic.
“Good science is hard to do, and in an outbreak there’s this tendency to say some science is better than none,” said Alex John London, the Clara L. West Professor of Ethics and Philosophy at Carnegie Mellon University, who coauthored a May 1 article in Science, “Against pandemic research exceptionalism,” which argued that scientists need to collaborate on more rigorous study designs. “That is not always true. Some science can be worse than no science. Some data, some evidence can be worse than no evidence or no data if it’s misleading in important ways.”
The publication Retraction Watch has been tracking notices of retractions and expressions of concern about COVID-19-related research. At least three major medical journals have retracted published studies in the last week alone.
Two studies published in two of the world's most prestigious medical journals, The New England Journal of Medicine and The Lancet, were retracted June 4 after scientists raised concerns about the reliability and provenance of the proprietary database of patient records the studies relied upon, and after efforts to independently audit the data were unsuccessful. The database is owned by a health-care data analytics company called Surgisphere. Using Surgisphere data, the retracted NEJM article found no increased risk of in-hospital death among COVID-19 patients taking certain common blood pressure medications. The retracted Lancet article found that treating patients with hydroxychloroquine or chloroquine yielded no benefits and that use of the drugs was associated with an increased risk of ventricular arrhythmias.
The World Health Organization temporarily halted clinical trials using hydroxychloroquine after the Lancet article was originally published, a move that reflected the influence a single study can have in the fast-changing area of coronavirus research.
The lead author on both papers, Mandeep R. Mehra, a professor of medicine at Harvard Medical School, said an entity contracted to conduct an independent review of the Surgisphere data, the Medical Technology & Practice Patterns Institute, was unable to do so.
"On June 3, MTPPI informed us that Surgisphere would not be able to transfer the data required to conduct this audit 'because of agreements with its clients and the fact that the documents contain confidential information.' Since we do not have the ability to verify the primary data or primary data source, I no longer have confidence in the origination and veracity of the data, nor the findings they have led to," Mehra wrote in a statement about the decision to retract the articles.
A total of 174 clinicians, medical researchers, statisticians and ethicists from around the world had signed on to an open letter flagging apparent inconsistencies between the Surgisphere data and publicly available government data. An investigation by The Guardian subsequently raised questions about the company, reporting that a public search revealed that several of Surgisphere's listed employees appear to have little or no scientific or data background -- one employee appeared to be a science fiction author, and another an "adult-content" model -- along with other apparent oddities. "Until Monday, the 'get in touch' link on Surgisphere’s homepage redirected to a WordPress template for a cryptocurrency website, raising questions about how hospitals could easily contact the company to join its database," The Guardian reported.
Surgisphere defended its studies in a statement on its website that has since been removed. In the statement, the company claimed to have a "real-time database of over 240 million anonymized patient encounters from over 1,200 healthcare organizations in 45 countries." The company did not reply to a request for comment. Its CEO, Sapan Desai, is listed as an author on both studies.
Mehra apologized for relying on the data provided by Surgisphere and Desai. "I have always performed my research in accordance with the highest ethical and professional guidelines," he said. "However, we can never forget the responsibility we have as researchers to scrupulously ensure that we rely on data sources that adhere to our high standards. It is now clear to me that in my hope to contribute this research during a time of great need, I did not do enough to ensure that the data source was appropriate for this use."
In another case of a high-profile retraction -- almost drowned out in the drama over Surgisphere -- Annals of Internal Medicine retracted an article on June 2 that, based on a sample of just four patients, assessed the effectiveness of surgical and cotton masks in blocking the coronavirus. Before it was retracted, the article was cited by dozens of news articles, nearly 10,000 Twitter users and the World Health Organization, according to Retraction Watch.
James Heathers, a research scientist at Northeastern University and cohost of the Everything Hertz podcast on methodology and scientific life, said the increased speed of publication and the increased scrutiny of what’s being published mean that more published articles are going to come under question.
“It’s to be expected in a way when you have this kind of gold rush for attention right now, which is completely unavoidable when people are trying to solve a problem and everyone wants a solution as soon as possible, but it does mean there is this component of hastiness and a lot of people are being tripped over,” Heathers said.
Heathers added that there is a broader consequence every time a headline reads "Major Study, Serious Questions" or "Massive Retraction," given the huge interest these studies are generating among the general public.
"With everyone that’s invested in it, the continual backtracking is a terrible look even if some of it is unavoidable," Heathers continued. "We have to walk that line between fast and accurate, or fast and definitive." He added that while it's normal in science for one study to undercut another, it's "also very, very good at undercutting the collective narrative that we know what we’re doing."
Serge Horbach, a postdoctoral researcher and sociologist of science at Radboud University, in the Netherlands, found that medical journals halved their time to publication for COVID-19 articles.
"While the acceleration of journals’ publication process is laudable from the perspective of quick information dissemination, it also raises concerns relating to the quality of the peer review process and the quality of the resulting publications," Horbach wrote.
"The mere fact that you see articles now appearing just a few days after they’ve been submitted, they’re going through the entire process of finding reviewers, doing the reviews, communicating it back to authors, making changes to the manuscript according to the reviews, and copyediting -- all of this just in a few days for some journals -- that really makes one wonder whether we can still hold the same quality criteria as we usually do," he said.
“What we’re really seeing is an acceleration of all the processes,” from peer review to calls for corrections or retractions, said Ivan Oransky, the co-founder of Retraction Watch and a distinguished writer in residence at New York University’s Carter Journalism Institute. “The problem is that journals haven’t -- at least publicly -- been willing to say that speed could lead to issues. In fact, they’ve been going the opposite direction and really encouraging speed. While you can understand why they would do that, I think we should be honest and up front that speed could lead to these issues.”
A separate but related issue has to do with the broad use of preprint servers to disseminate research quickly prior to peer review. Scientists have been uploading papers to preprint servers in unprecedented numbers: a group of researchers found that in the first four months of the pandemic, scientists had published more than 16,000 articles on COVID-19, at least 6,000 of which were hosted on preprint servers (these data are, fittingly, reported in a preprint). The researchers found that preprints have been widely circulated on social media, Twitter especially, and that news outlets have extensively reported on them, representing a "marked change in journalistic practice."
Advocates of preprints argue they are a valuable way to disseminate information quickly and to receive substantive feedback on a draft from a broad range of scientists. But some scholars and researchers have also raised concerns about unvetted, early science published on preprint servers spreading disinformation and being used to hijack public debate.
Two journalism professors at Northeastern University, Aleszu Bajak and Jeff Howe, argued in a May 14 op-ed in The New York Times that a preprint by Stanford University medical professors that estimated a COVID-19 mortality rate of 0.12 to 0.2 percent -- a far lower rate than has been generally accepted -- was quickly "weaponized" by right-wing commentators and activists who used it to protest against lockdowns.
The preprint -- which was subsequently replaced with a second, updated version -- was widely criticized, with scholars and journalists raising concerns about the statistical methodology, the approach to recruiting study participants via Facebook and the accuracy (or lack thereof) of an antibody test the researchers used to establish COVID-19 infection rates. In addition to criticizing the methodology and study design, articles published in The Mercury News and BuzzFeed outlined questions about the study’s funding and about whether the authors hyped their preliminary results to promote an antilockdown political agenda.
A spokeswoman for the Stanford School of Medicine said Stanford is aware of “serious concerns” related to the study and that the “matter is being reviewed by the appropriate oversight mechanisms at Stanford.” The corresponding author and a professor in Stanford's medical school, Eran Bendavid, declined an interview request, citing the ongoing review and the need for confidentiality to preserve the integrity of the review process.
"To be clear, there has been no finding of wrongdoing by the university as a result of this study, which I believe was done in accordance with good research practices, university policies, and university approval," Bendavid said via email. "I am confident that this will be shown to be true and that my academic and professional reputation will be upheld."
London, the ethicist at Carnegie Mellon, said scholarly publishing is grappling with new time pressures and new pressures from the general public.
"The idea that you run your study, you gather the data, then you take time writing it up, you send it to the journal, the journal sends it to a couple of referees, they all take their time, and six months later the paper gets published -- that is not an attractive possibility in the context of an evolving pandemic," he said.
"That's a sort of extreme," London continued. "Is the best alternative a different extreme, where scientists upload their work to a preprint server, without significant peer review? They can make really striking claims about what their work shows, the media then report on those really striking claims so the information sort of now leaves the confines of the scientific community and goes out into the larger world of patients and policy makers and pundits and takes on a life of its own." Even if later corrected, he said, "it can be very difficult to have the corrected version grab the same headlines as the initial striking claim."
Oransky, of Retraction Watch, argues in light of the recent journal retractions that the problem is not preprints per se. "A lot of people in publishing would like this to somehow be a mandate or a referendum on preprints and how dangerous they are," he said. "But I don’t know. I’m seeing a lot of really problematic stuff in prestigious peer-reviewed journals."