Refusing to Be Evaluated by a Formula

Rutgers faculty members, citing philosophical concerns and errors, are pushing back against the use of Academic Analytics to evaluate their productivity.

December 11, 2015

With the advent of Google Scholar and other metrics for faculty productivity, advancing one’s career as a professor is much more of a numbers game than it used to be. Still, the traditional system of peer review in hiring, tenure and promotion decisions has retained a good deal of nuance. Scholars in the same field as those they’re evaluating know that while one project may not be as prestigious as another, for example, a good degree of academic innovation might be worth a little professional risk.

But is that system under threat? Full-time faculty members at Rutgers University at New Brunswick say that it may be, in light of the university’s contract with a faculty productivity monitoring company called Academic Analytics.

Rutgers professors say they don't need the system, which is based on a patented algorithm for measuring faculty productivity, and that what little data they’ve been able to obtain so far include some serious errors. On Monday, the faculty of the School of Arts and Sciences will vote on a faculty union-backed resolution asking the university not to use Academic Analytics data in personnel and curricular decisions, and to give faculty members access to data collected by the company.

“I think everybody in academia has an interest in collective and individual measures of productivity, but these are very blunt forms of measurement,” said David Hughes, a professor of anthropology and president of Rutgers’ American Association of University Professors and American Federation of Teachers-affiliated faculty union. “Most universities already have a very robust system of assessing faculty output, with [numerous] categories for your scholarship, so why would we want to dilute that for these indexes?”

Moreover, Hughes said, “Why is this worth half a million dollars?”

Calls for accountability of state dollars at public universities and demands for easier quality comparisons among institutions make easy faculty productivity metrics increasingly attractive for some administrators. And they’re willing to pay for them. In 2013, Rutgers signed a $492,500, four-year contract with Academic Analytics, a New York-based company founded by Lawrence Martin, former dean of the Graduate School at the State University of New York at Stony Brook, and Anthony Olejniczak, a fellow anthropologist. Their premise was that colleges and universities needed a more dynamic set of data, updated on an annual basis, than is included in the National Research Council’s periodic rankings of graduate programs.

Academic Analytics’ first database was released in 2005, and client universities received paper reports on program and departmental productivity. Within a year, the company had 10 institutional customers, and the buzz was particularly strong among up-and-coming research universities that wanted to highlight the work happening on their campuses and boost their reputations. Now the company has 385 institutional customers in the U.S. and abroad, representing about 270,000 faculty members in 9,000 Ph.D. programs and 10,000 departments.

Data now available online include numbers of scholarly books and journal articles, citations, research funding by federal agencies, and awards earned by faculty members. Comparisons can be made between disciplines and institutions over all. Of course, individual departments have long looked at just these types of measures. But faculty members say that when they do the reviewing, they know in a way no formula can which journal article really made a difference to a field, which grant was particularly influential and which research helped a local community (even if it didn't win a big grant). This type of information, they say, is why they don't need a formula.

Early on, critics of the program pointed to the fact that there was no consideration of teaching or service in Academic Analytics’ formula. Faculty members also wanted to know just how the data was being used, since many institutions signed contracts without consulting professors.

That’s what professors say happened at Rutgers. Various faculty members say they informally heard about the agreement starting in 2013, but that it remains unclear to them exactly how administrators have been using Academic Analytics over the past two years. Others say they’ve asked for access to their personal data, but haven’t received it. But it's clear Rutgers is using the information, to some degree. A 2014 job ad for an associate dean of research in the School of Social Work, for example, says that a "recent analysis by Academic Analytics found the Rutgers social work Ph.D. faculty to be the third most productive in publications, citations, and research among social work faculty" in the U.S.

Those who support Monday’s resolution say they didn’t have time to address the issue immediately in 2013, as the school's faculty members only meet as a whole once a semester and they were busy with more immediate concerns -- such as fending off the university’s deal with Pearson to provide online education programs. But now it’s time to establish limits on how Academic Analytics can be used, they say.

The resolution includes three parts. First, Rutgers can’t use Academic Analytics data in tenure and promotion decisions affecting faculty of the School of Arts and Sciences. Hughes said it’s already included in the union contract that a professor’s entire portfolio, except for external letters, is open to him or her, so any use of the data already would be a violation of the contract. But the faculty wants assurances, he said.

The second faculty request is that Rutgers won’t use data from Academic Analytics in decisions affecting the composition of the faculty, graduate or undergraduate curricula, or grant writing in the School of Arts and Sciences. That’s essentially to protect programs that, by the data alone, may appear unproductive and therefore become potential targets for funding cuts. Last, Rutgers must distribute personal data collected by Academic Analytics to each faculty member by March 1.

The resolution touches on faculty members’ more philosophical qualms with the company, saying that the “entirely quantitative methods and variables employed by Academic Analytics -- a corporation intruding upon academic freedom, peer evaluation and shared governance -- hardly capture the range and quality of scholarly inquiry, while utterly ignoring the teaching, service and civic engagement that faculty perform.” But it also notes more practical concerns, such as that “taken on their own terms, the measures of books, articles, awards, grants and citations within the Academic Analytics database frequently undercount, overcount or otherwise misrepresent the achievements of individual scholars.”

Indeed, several scholars who were able to informally obtain their personal data, such as through a department chair, said it was incorrect. Hughes said he’d been credited for three journal articles in a given period when he’d only written one, for example, and undercredited on other kinds of publications.

Yolanda Martínez-San Miguel, a professor of Latino and Hispanic Caribbean studies, also observed irregularities in her profile. She said she wasn’t particularly concerned about how Academic Analytics might impact her career prospects, since she’s already a full professor and well-known in her field. But she said she worried about the service’s impact on the research choices of younger scholars who are still trying to make their mark.

Martínez-San Miguel said that at least some of the errors in her profile were the result of an algorithm that doesn’t value many of the interdisciplinary and Spanish-language journals she’s published in, and wondered how that might affect the next generation of scholars. “Will they only publish in journals that are ranked, and does that preclude taking intellectual risks?" she asked.

Hughes said he had similar concerns, saying he wondered how fellow anthropologists who produced films instead of articles, for example, would be evaluated by Academic Analytics. On paper, he said, there doesn’t appear to be a metric for alternative publications. That’s a potential problem in numerous other fields, such as history, which has made inroads in supporting digital projects.

Still, getting full access to one's profile may be harder than expected. Rutgers’ contract with Academic Analytics, obtained by the union through a public open records request, says that the portal may be accessed only by those who hold “a position that involves [strategic] decision making and evaluation of productivity,” as approved by the company. The contract also limits what data may be distributed or shared.

Academic Analytics declined to comment Thursday. It’s unclear when the company began making individual data available; in 2006, Martin, the co-founder, told Inside Higher Ed that the company was not selling that kind of data to administrators “for the moment,” for fear they “would hand out pink slips without thinking.”

Hughes said he was confident the resolution would pass Monday. A university spokesman didn’t offer details Thursday about how Academic Analytics is being used at Rutgers, but he said that the university will review the resolution if it’s passed by the faculty.

Julia Carpenter-Hubin, the assistant vice president of institutional research and planning at Ohio State University, sits on Academic Analytics’ advisory board. She said via email that Ohio State’s subscription level gives it access to data extending to the scholarly productivity of its Ph.D. graduates, as well as to information about the faculty with whom its faculty members collaborate and the ability to search for faculty members based on academic expertise. Department chairs are given access to the database as their units go through program review, and upon request. For most departments, she said, “the data then inform discussions about areas of excellence and areas for improvement. …Because every discipline has its own idiosyncrasies, scholars in the disciplines are in the best position to interpret the data for their area and to adjudicate the quality of the information.”

Carpenter-Hubin said a college or university could use Academic Analytics to feed a promotion and tenure system so that data automatically populated a faculty member’s dossier, but only if the faculty member could then review and edit the data. That's because institutional data over all could be 99 percent correct, but only 50 percent correct regarding a particular faculty member. In the automatic population scenario, she added, “the purpose of using Academic Analytics data would be to reduce the burden for faculty of creating a [promotion and tenure] portfolio — not providing a more authoritative source.” 

Overall, she said, Academic Analytics’ data “are not perfect, but over the years, the datasets have become more complete, both in terms of faculty who are included and with regard to the scholarly work of those faculty. It's a big job to do the matching of articles, books, grants and other scholarly work to individual faculty, but they have not only gotten more accurate over the years, they are also able to produce their datasets in a more timely way.”