Rankings Go Global

Kevin Carey sees flaws in the increasingly international business of rating colleges -- but also sees legitimate reasons for the comparisons.

May 6, 2008

Despite a century of proud history, the state university had never received the recognition its leaders felt it deserved. So when a prominent news outlet published a new set of college rankings in 2005 putting the university in the first tier, the entire local community celebrated. The vice chancellor went so far as to order signs touting the achievement placed around the city, including a billboard on the edge of campus meant to catch the eye of visitors arriving on the main highway from the airport.

Then, disaster. New rankings from the same publication arrived a year later, and the university's standing plummeted by 80 places. The institution itself hadn't changed -- the magazine had simply corrected a mistake made in calculating the previous year's score, while also changing the rankings formula. In the court of public opinion, however, such technicalities didn't matter. State officials were aghast, the university's reputation was besmirched and the vice chancellor was quickly sacked.

Another case of the U.S. News & World Report rankings run amok? Actually, no -- the university was the University of Malaya in Kuala Lumpur, and the news outlet was The Times Higher Education Supplement, in the U.K. Once a purely American innovation -- or problem, depending on how you look at it -- lists of "best colleges" are everywhere. Even as the Times Higher is competing to establish the definitive worldwide college rankings, scores of nations from Kazakhstan to Peru are fast developing new systems to evaluate and publicly rank their institutions of higher education. College rankings have gone global.

To observers of international higher education, this should come as no surprise. Widely cited statistics from the Organization for Economic Cooperation and Development (OECD) suggest that many nations are closing in on the United States' decades-old lead in college attainment. Forty-six countries in Europe and elsewhere are in the midst of a massive effort, through the "Bologna Process," to create what amounts to a common market for higher education. Rankings reflect a desire among individual countries to see how they stack up against the world's best, and eventually join them.

That benchmarking instinct created the biggest competitor to the Times Higher: Shanghai Jiao Tong University. In 2003, professors and graduate students in the university's higher education policy unit developed a new method of comparing research institutions across national borders, using measures like the number of faculty and graduates awarded the Fields Medal or the Nobel Prize, and the number of articles and citations in prestigious scholarly journals. Their goal was simply to determine how China's relatively modest but rapidly growing university system stacked up against its global competitors. It is purely a research ranking, using no measures relating to the education of students. But the list was quickly seized upon by other nations as a way of benchmarking themselves against their rivals and setting goals for improvement. Magazines like The Economist republished the list, and soon ministries of education began talking of having a certain number of institutions in the "Global Top 100," however defined.

The Times Higher, meanwhile, seems to have adopted something of a "rank first, ask questions later" approach, revising its methods on the fly and leaving people like the unlucky Malaysian vice chancellor to suffer the consequences. Half of each institution's score is based on a non-scientific email survey of academics and employers, and the results reflect a response rate of less than one percent. There's arguably a bias toward universities located in the United Kingdom and its former colonies (which represent three of the top five and nine of the top 25), and individual institutions have seen huge variations in their placement from year to year. Nonetheless, the Times Higher rankings are widely cited in the foreign media. Much like FIFA world football rankings, they matter everywhere except in the United States.

Both the Shanghai and Times Higher lists reflect traditional American strength -- Shanghai Jiao Tong puts 17 American universities in the top 20 and 54 in the top 100. But the fact that so many other countries are trying to move onto the list suggests that they're no longer willing to accept American dominance of higher education. Part of their strategy is to use internal rankings as a catalyst for replicating the American competitive dynamic, and they're not leaving rankings up to the whims of for-profit newsmagazines.

For example, Kazakhstan -- known to most Americans chiefly as the home of "Borat" -- recently embarked on an ambitious effort to rank its 160 institutions of higher education. In 2006, Kazakhstan's National Accreditation Center at the Ministry of Education and Science published new rankings based on more than 40 indicators of student, faculty, and research quality. Unlike the U.S. News rankings, which primarily use input measures like funding levels and admissions selectivity, the Kazakhstan rankings incorporate government-sponsored surveys of nearly 10,000 students, faculty, and employers. And Kazakhstan isn't an aberration. From cradles of higher education like Germany and the U.K. to aspirant nations like Slovakia, Ukraine, Thailand, Tunisia, Nigeria, and Peru, a rapidly growing number of countries are playing the college rankings game. Atta-ur-Rahman, chairman of the Pakistan Higher Education Commission, summed up the impetus behind the new government-sponsored Pakistani rankings simply: they will "enable us to improve the standard of institutions."

Like Kazakhstan, many countries are adding new wrinkles that go beyond the traditional U.S. News measures. The Center for Higher Education Development in Germany gives students the ability to create customized rankings to match their individual preferences and academic priorities, keyed to data from surveys of hundreds of thousands of students and professors. Students choose five criteria from a list of more than 25 elements, such as research reputation, library facilities, student-teacher contact, local housing costs, and course availability. The Canadian magazine Maclean's, which publishes that nation's version of the U.S. News rankings, also lets students shape rankings to their preferences.
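The mechanics of such a personalized ranking are straightforward: score each institution only on the criteria a given student selects. The sketch below is purely illustrative -- the university names, criteria, and scores are invented, and this is not CHE's or Maclean's actual data or methodology:

```python
# Hypothetical sketch of preference-based ranking: a student picks a
# handful of criteria, and universities are ordered by their average
# score on just those criteria. All names and scores are invented.

def personalized_ranking(universities, chosen_criteria):
    """Rank universities by mean score over the student's chosen criteria."""
    def score(u):
        return sum(u["scores"][c] for c in chosen_criteria) / len(chosen_criteria)
    return sorted(universities, key=score, reverse=True)

universities = [
    {"name": "Uni A", "scores": {"research": 9, "housing_cost": 3, "library": 7}},
    {"name": "Uni B", "scores": {"research": 5, "housing_cost": 9, "library": 6}},
    {"name": "Uni C", "scores": {"research": 7, "housing_cost": 6, "library": 8}},
]

# A research-minded student and a budget-minded student get different lists.
print([u["name"] for u in personalized_ranking(universities, ["research", "library"])])
print([u["name"] for u in personalized_ranking(universities, ["housing_cost"])])
```

The point of the design is that there is no single "best" list: changing the chosen criteria reorders the results, which is exactly what makes such tools more useful to individual students than a one-size-fits-all ranking.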

Many countries are experimenting with "bibliometrics," the burgeoning science of using vast publication databases to measure how often scholarly papers are cited by other scholars. (Similar measures are used by the Times Higher.) Scholars in Madrid are among those developing "Webometrics" rankings of world universities, based on inbound hyperlinks, the volume and richness of Web-based material, and results from Google Scholar.
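At its simplest, a bibliometric ranking is a tally: attribute each paper in a publication database to an institution and sum the citations. The toy example below assumes such a database exists; the papers, institutions, and citation counts are invented for illustration and real systems involve far messier problems of name disambiguation and field normalization:

```python
from collections import Counter

# Toy bibliometric tally over an invented paper database. Each record
# maps a paper to its authors' institution and its citation count.
papers = [
    {"institution": "Uni A", "citations": 120},
    {"institution": "Uni A", "citations": 30},
    {"institution": "Uni B", "citations": 200},
    {"institution": "Uni C", "citations": 15},
]

def citation_totals(papers):
    """Total citations received by each institution's papers."""
    totals = Counter()
    for p in papers:
        totals[p["institution"]] += p["citations"]
    return totals

# most_common() orders institutions from most-cited to least-cited.
print(citation_totals(papers).most_common())
```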

Yet none of these new measures are widely used in the country that pioneered college rankings -- the United States. Ironically, our unusually competitive higher education sector suffers from a near-monopoly in the rankings realm. While U.S. News has recently signaled a willingness to consider new measures, the magazine is very unlikely to fundamentally alter the list in a way that diverges from the current focus on wealth, fame, and exclusivity. The genius of U.S. News was to rationalize the existing status hierarchy in higher education, to create a mathematical way of calculating degrees of difference from Harvard. As much as the academy likes to complain about the rankings, it cherishes the values that put Ivy League universities on top of the list.

The rise of global college rankings should put to rest any idea that higher education can somehow boycott its way back to the halcyon days before rankings ruined everything. International competition in higher education is becoming more intense, and in the information age, the number of measures and methods available to compare colleges and universities will only grow. There's already talk of a higher education version of the Program for International Student Assessment, one more potential measure to add to the growing list. As long as companies can publish magazines and students can choose colleges, someone will create college rankings that people will read and care about.

And in the long run, that's a good thing. While many critiques of specific rankings methods are legitimate, the more generalized anti-rankings sentiment in higher education is not, reflecting an aversion to competition and accountability that ill serves students and the public at large. It's perfectly reasonable to compare one university to another and make judgments about which is best. If colleges and universities don't want the terms of their success dictated by U.S. News, the Times Higher, or universities on the other side of the world, the only alternative is to step forward and stand behind something better. Until then, the influence of college rankings, from within our borders and without, will only grow.


Kevin Carey is research and policy manager at Education Sector.
