
Scholars who seek to publish in open-access journals may be open to new forms of peer review, but that doesn’t mean they all see eye to eye -- or know what to expect. As one sting operation shows, many such journals fail to reject obviously flawed submissions even as they promise thorough review processes. Meanwhile, other journals are criticized for being too much like the traditional publishing they aim to reform.

Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara, recently had an article about the medical properties of a chemical extracted from a lichen accepted for publication -- by more than half of the 304 open-access journals he submitted it to. Of course, Cobange is not real, and neither is the Wassee Institute. They are both inventions of John Bohannon, the Harvard University biologist and writer who documented the study in this week’s edition of Science.

“Acceptance was the norm, not the exception,” Bohannon wrote. Not only did the Journal of Natural Pharmaceuticals deem the article fit for publication, but so did journals “hosted by industry titans ... prestigious academic institutions ... [and] journals for which the paper’s topic was utterly inappropriate.”

The culprit: a lack of rigorous peer review. Bohannon estimates that 60 percent of the accepted submissions showed “no sign of peer review,” and that even among the journals that did review the article, 70 percent accepted it anyway.

“Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately,” Bohannon wrote. “Its experiments are so hopelessly flawed that the results are meaningless.”

With so many journals failing to notice the hoax, Bohannon argues the results “reveal the contours of an emerging Wild West in academic publishing.” 

'It's Tough'

Early this summer, the editors of the Journal of Digital Humanities took interest in a project outside the journal’s usual content feeds: an online summer course on the postcolonial digital humanities run by Roopika Risam, assistant professor of world literature and English at Salem State University, and Adeline Koh, assistant professor of postcolonial literature at Richard Stockton College of New Jersey. When the course concluded, Koh and Risam submitted a set of blog posts about its reading discussions, believing, as Risam said, that “the review process stopped with us.” They were later told the posts needed to be subjected to an outside review.

“What wound up causing the misunderstanding was that we weren’t aware this was an experiment,” Risam said.

The Journal of Digital Humanities gets most of its content from the aggregation project Digital Humanities Now, but its editors will request revisions or expanded versions where needed. About one-third of the journal’s content undergoes “extensive revisions.” Koh and Risam, meanwhile, were under the impression that the journal used a post-publication review model, and thought their work was being singled out.

Editor Joan Fragaszy Troyano said she could not comment on specific submissions without permission from the contributors, but pointed to a statement the journal released after Koh wrote about the issue on her personal website.

Even though other scholars in the field suggested the editors were motivated by racism or sexism, both parties now consider the matter resolved. Still, the incident shows that as editors of open-access journals search for a review process that ensures both diversity and quality, contributors are sometimes left scratching their heads.

Lucas Waltzer is a member of the collective of administrative staff, faculty members and graduate students that edits The Journal of Interactive Technology & Pedagogy. He described the three-issue-old publication's review process in two words: “It’s tough.”

The collective handles editing duties on a rotating basis. Its more than 20 members, spread across the City University of New York system, are all expected to serve as a sounding board for ideas, though a handful are picked to edit each issue. Waltzer, assistant director for educational technology at the Bernard L. Schwartz Communication Institute at Baruch College, and Mikhail Gershovich, the institute’s director, edited the journal’s most recent issue.

“We’re a young journal, so we haven’t really solidified our processes yet,” Waltzer said. To avoid misunderstandings, he said, editors connect contributors with their reviewers and provide regular updates on where each submission stands on the path to publication. For each issue, the journal produces a “behind the seams” feature that follows a specific article’s journey through the editorial process.

“We want to encourage authors who want to push scholarship in new and exciting ways, and we realize there aren’t a lot of avenues to do that, and they need to feel supported after they put themselves out there,” Waltzer said. “We do put these pieces through a rigorous editorial process. I know we feel that our authors are appreciative of that.”

Even with these guiding principles in place, miscommunications still occur -- such as when one contributor’s article was passed from one set of issue editors to the next, resulting in contradictory feedback.

“We immediately were apologetic,” Waltzer said. “The way we did that is just by responding as quickly as possible to author inquiries. Also, being honest about where we were in the process.”

MediaCommons and NYU Press in 2011 received a grant from the Andrew W. Mellon Foundation to study best practices in open, online peer-to-peer review in the humanities. (“The draft report was put, appropriately, through an open review process,” Kathleen Fitzpatrick, director of scholarly communication for the Modern Language Association, wrote in an e-mail.) The anti-conclusion: “[O]pen review is essential for modeling a conversational, collaborative discourse,” but “more experimentation is needed before any conclusions might be drawn.”

Koh and Risam’s contribution is called DHThis, a site that replaces the formal review process with a crowdsourced alternative. The site is not intended to take the place of journals that peer review submissions; rather, it is an experiment to see what kind of content would be promoted if readers themselves decided.

“What I think that ideally would happen ... is that publication, criticism and discussion could all take place in the same space,” Risam said. “Online allows that to happen at a quicker pace.”

DHThis shares two features with sites like the social aggregator Reddit: Its users are free to “upvote” or “downvote” and comment on submissions, and one of the submissions on its front page is a cat GIF.

“It’s bringing together a community of anyone who wants to participate to submit articles that have been published,” Risam said. “More so than being a form of peer review, it’s a way for the crowd to curate digital humanities scholarship. What I’m already seeing is connections that are being made between scholars who have been on the site and discovered work that relates to theirs. To me, if that’s what’s happening, then the site is doing what it’s supposed to do.”
