U.S. Decline or a Flawed Measure?
New version of British rankings of universities worldwide suggests that American dominance is eroding, but is the methodology meaningful? Is Berkeley really No. 39?
Most higher education leaders say that institutional rankings are highly questionable, given the many intangibles in what makes a college or university “best” for a given person or course of study. But what about national trends? Can international rankings of universities provide a picture of the relative rise and fall of nations’ universities?
The Times Higher Education/QS rankings, out today, suggest that there are national patterns that can be discerned – and the picture is one of decline for American institutions. Since narratives about American decline always attract attention, these rankings are likely to cause a stir.
Some of the patterns are striking, and there is abundant evidence that the rise of universities in other countries will inevitably broaden the global leadership. But some experts on rankings say that this study shouldn’t be taken too seriously because of its reliance (even more than the rankings of U.S. News & World Report) on reputational surveys. And even a top editor at the Times Higher acknowledged in an interview that some of the measures used favor institutions in Europe and Asia over those of the United States.
Here’s what this year's Times Higher rankings found:
- The United States and Britain continue to dominate the very top ranks with one university in Cambridge, Mass., leading the rankings and one in the original Cambridge in second place.
- The number of North American universities in the top 100 fell to 36 from 42 in just a year.
- The list saw increases in universities from Europe (39, up from 36) and Asia (16, up from 14 last year).
In ranking universities, Times Higher uses this formula:
- 20 percent is based on a per capita analysis of citations of research conducted by faculty members at each university. This provides an indication of “the density of research excellence on a campus,” Times Higher says.
- 20 percent is based on faculty-student ratio, to provide “a sense as to whether an institution has enough teaching staff to give students the attention they require.”
- 5 percent is based on the percentage of international faculty members.
- 5 percent is based on the percentage of international students.
- 40 percent is based on a worldwide survey of academics, who are asked to name the 30 institutions they consider the best in the world.
- 10 percent is based on another international survey – this one of employers of graduates.
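The weighted formula above amounts to a simple weighted sum of six component scores. As a minimal sketch (the component names and the sample scores here are hypothetical illustrations, not actual Times Higher data):

```python
# Sketch of the Times Higher/QS weighting scheme described above.
# Component scores (on a 0-100 scale) are hypothetical placeholders.

WEIGHTS = {
    "citations_per_faculty": 0.20,   # per capita research citations
    "faculty_student_ratio": 0.20,   # teaching-staff density
    "international_faculty": 0.05,   # share of international faculty
    "international_students": 0.05,  # share of international students
    "academic_survey": 0.40,         # worldwide survey of academics
    "employer_survey": 0.10,         # survey of employers of graduates
}

def composite_score(scores: dict) -> float:
    """Weighted sum of the six normalized component scores."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Illustrative (made-up) component scores for one institution:
example = {
    "citations_per_faculty": 70.0,
    "faculty_student_ratio": 80.0,
    "international_faculty": 60.0,
    "international_students": 50.0,
    "academic_survey": 90.0,
    "employer_survey": 85.0,
}
print(composite_score(example))  # weighted total on the same 0-100 scale
```

Note that the two reputational surveys together carry half the weight, which is the point of contention discussed below.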
The 50 percent of the formula based on reputation exceeds even the much-criticized percentage used by U.S. News (25 percent).
And that’s part of why rankings experts question the methodology. The Institute for Higher Education Policy has conducted extensive research both on rankings and on the evolution of a global higher ed infrastructure in which the U.S. is not as dominant as it once was. Alisa F. Cunningham, vice president of research for the institute, said that the Times Higher’s rankings are of “limited value” and that all the much discussed flaws of reputation surveys (voting based on old information, voting to favor your own institution, voting on criteria that aren’t those being used, etc.) are only accentuated in international surveys.
“You’ve got entirely different contexts in different parts of the world, and you don’t know what those contexts are,” she said.
Reputational surveys are “the least reliable way to do these comparisons,” she added.
Another reason to be wary of these rankings, Cunningham said, is their volatility (which is of course what gets them more attention). Cunningham said that the great universities of the world – whether in the United States or elsewhere – change gradually, not radically, from year to year. So any methodology that suggests that universities that are centuries old are notably better or worse from year to year is questionable, she said. “They don’t change that way,” she said.
Phil Baty, Deputy Editor of the Times Higher, said in an e-mail interview that some of the measures do favor certain regions. For example, he noted that the citations index favors institutions where most faculty members are in medicine or hard sciences, while putting at a disadvantage institutions where much of the faculty scholarship is in the humanities or social sciences (a characteristic that applies to most American universities). Likewise, he noted that European and Asian universities are more likely than others to have large percentages of foreign faculty members.
But as to the criticism about relying on surveys, Baty said that was a strength of the Times Higher rankings.
“When the rankings were conceived six years ago, a guiding principle was that academics know best when it comes to identifying the world’s best universities. So we were happy to include a heavy element of opinion in the rankings formula," Baty said. "In some ways, giving a strong weighting to the academic opinion survey helps meet some of the biggest criticisms of the university rankings in general – that you can’t reduce all the wonderful and less tangible things that a university does into a simple scientific formula. Universities are always about more than the sum of their parts."
Robert M. Berdahl, president of the Association of American Universities, said that at his association (which includes research universities in the United States and Canada), "we don’t generally place a great deal of stock in the public rankings of universities, but we don’t ignore them either. They are important to the extent that they shape public perceptions of the qualitative hierarchy of institutions, but they all have flaws and biases."
Berdahl said that a "heavy reliance on reputational surveys, for example, is not terribly reliable, in part because it depends so heavily on who is surveyed."
The best way to do international comparisons, he said, is "program by program, using the most objective criteria possible."
The issue raised by the Times Higher about an erosion of U.S. dominance is an important one, Berdahl said, even if he doesn't agree with the findings about specific universities or the methodology.
"The United States has to be concerned about this. We know that other nations are investing substantial amounts in building research universities, while the U.S. has been disinvesting," he said. "If we cease to be the nation of choice for the best and brightest international students, or even the best American students, we will quickly cease to have the universities that are the choice for the best faculty and we will be caught in a downward spiral."
But Berdahl, a former chancellor at the University of California at Berkeley, said he just can't buy the numbers in the Times Higher's survey. "While I think that there has been some relative slippage as a result of a decline in funding in the U.S. and the investment elsewhere, the rankings indicated by the Times seem to me to be wildly off the mark," he said. "No one I know would rank Berkeley anywhere near as low as 39th in the world. I admit I’m biased; but this is too far from the mark to be taken terribly seriously."