You Think We're Rankings-Obsessed?

"We want to become one of the top 10 universities in the world."

"The government wants a first class university for international prestige."

"What do we need to achieve by 2013? Two universities ranked in the top 20 worldwide."

February 1, 2010

Statements like those will sound familiar to anyone who's been listening to leaders in American higher education and state governments talk for the last few years, as public research universities, particularly, strive to rise in published rankings of institutions.

But the comments above -- from, respectively, the president of Lithuania, a Korean university chief, and political leaders in Ireland -- are but one manifestation of an obsession with rankings that is arguably stronger worldwide than it is in the United States, more directly affecting government policy and creating a broad array of effects, positive and negative, an Irish researcher asserted last week at the Council for Higher Education Accreditation's international forum in Washington.

Ellen Hazelkorn, vice president for research and enterprise and dean of the Graduate Research School at Dublin Institute of Technology and head of its Higher Education Policy Research Unit, takes a scholar's view of the rankings, approaching them neither as an advocate nor a critic and pointing out their positives as well as their problems.

But perhaps because she appeared on a panel at CHEA with Robert Morse -- who, though well-regarded by researchers, is first and foremost the public face of U.S. News & World Report's preeminent college rankings -- Hazelkorn's primary argument that the rankings do matter left an overpowering impression that they do more harm than good to the worldwide ecology of higher education.

"The pursuit of 'world class excellence' has become a mantra," she said. "But the history of rankings shows us that measuring the wrong things can produce distortions and perverse actions by governments."

The rise of rankings is hardly surprising, Hazelkorn said, given how globalization of the economy has intensified competition among countries and how national leaders the world over have embraced the view that higher education is the engine of their economies. "If you believe that's the case, then issues about the productivity, status, and quality of higher education and university-based research become indicators of [a country's] competitiveness," she said. "How many institutions you have in the top 10, or 50, or 100" of international rankings of universities is used to gauge your national competitiveness.

Countries' focus on where their institutions fare in international higher education rankings has some significant benefits for colleges and universities and their constituents. Nations that seek to improve their institutions sometimes pour enormous sums of money into them, and efforts to drive institutional improvement are often accompanied by "enhanced public accountability and transparency," which can benefit students and taxpayers, Hazelkorn said. Morse, of U.S. News, also characterized rankings like his magazine's as a tool for increased accountability.

But the emphasis on rankings has wrought a set of problems, too, Hazelkorn argued. By focusing attention on as few as 100 universities among more than 17,000 postsecondary institutions worldwide, she said, the rankings may exacerbate the divide between the haves and have-nots, prompting governments to funnel money disproportionately to the institutions striving for world-class prominence (usually research universities that are already wealthy and well-established) rather than colleges that may need the money more or serve needier students.

Colleges that are chasing rankings prestige often focus their attention on recruiting high-achieving students, since rankings like U.S. News and its international equivalents typically reward institutions based on the standardized test scores and other credentials of their enrolling students.

With their narrow interpretation of research quality and the bibliometrics they use to measure it, international rankings like the Times Higher Education/QS World University Rankings and the one produced by Shanghai Jiao Tong University greatly privilege the biological sciences over other fields, since bioscience publications are much likelier to have multiple authors, which gives them added weight in citation counts, for example. (Phil Baty, editor of the Times Higher Education World University Rankings, which will henceforth be produced with Thomson Reuters, says that the new rankings will "end the previous clear bias in favour of hard sciences and against the arts, humanities and social sciences, when we measure research quality. From 2010 and beyond, we will normalise our citations data to take into account the very different citations volumes in different disciplines.")

For all their flaws, rankings are unlikely to fade in importance, Hazelkorn argued, as long as countries are concerned about how they stack up against others. One possible shift, though, would be to focus more expansively on countries' educational systems rather than individual institutions, as the Lisbon Council and QS SAFE rankings have done. Such an approach, Hazelkorn said, defines quality more broadly, to include access and teaching quality and other things that the single-institution rankings often slight.

