The European Union-sponsored U-Multirank releases its second annual ranking of global universities today.
With 31 indicators across five dimensions -- teaching and learning, research, knowledge transfer, international orientation, and regional engagement -- the U-Multirank project aims to rank a more diverse set of institutions on a wider range of measures than the traditional global rankings do. Those rankings rank research universities exclusively and rely heavily on publication data as well as, in the case of two of the three major world rankings, reputational surveys.
A total of 683 institutions actively participated in U-Multirank data collection, an increase from 518 last year. An additional 527 universities, for a total of 1,210, did not actively participate but are included in the newest ranking based on publicly available publications and patent data. This is the case for many of the world’s best-known universities, such as Harvard and Yale Universities and the Universities of Cambridge and Oxford.
U-Multirank includes data from universities in 83 countries, though participation in the initiative outside Europe remains spotty. France, Germany and Spain have the largest numbers of institutions actively providing data, with more than 50 each. Twelve British universities are actively participating, and -- outside Europe -- 11 from Australia, four from Canada, three from China, 16 from Japan and 14 from the U.S. (up from nine last year). Participating U.S. universities are Boston and Dartmouth Colleges; Carnegie Mellon, Colorado State, Fairfield, James Madison, Oregon State, Texas A&M and Tufts Universities; Olin College of Engineering; the State Universities of New York at Stony Brook and Buffalo; and the Universities of Connecticut and Hawaii at Manoa. An additional 155 U.S. universities are ranked only on those measures based on publicly available patent and publications databases.
Twelve of the 31 indicators reflect information derived from those databases, while the other 19 are based on institutionally submitted data (U-Multirank publishes a full list of the indicators). U-Multirank also relies on a third source of information -- student surveys -- for its separate field-based rankings, which cover business, electrical engineering, mechanical engineering, physics and, new this year, computer science, medicine and psychology. U-Multirank partners with the participating universities to administer the surveys, which ask students specific questions about teaching, research orientation and the integration of work experience into their programs, among other topics. (The field-based rankings include only those universities actively participating in data collection.)
U-Multirank does not weight indicators to assign universities a single composite score; instead, it groups universities into one of five categories, from A (very good) to E (weak), based on their performance on each individual indicator. Harvard University has the highest absolute number of research publications and the Massachusetts Institute of Technology the largest number of awarded patents, but on other dimensions new, less familiar leaders emerge: for example, Reutlingen University of Applied Sciences, in Germany, excels in terms of co-publications with industrial partners.
Top performers in some of the dimensions based on university-submitted data include many specialized institutions: Moscow Aviation Institute leads the way in bachelor’s graduation rates; Siberian Institute of Business in income from continuing professional development; France’s IÉSEG School of Management, Lille, in student mobility; Germany’s University of Magdeburg in art-related outputs; and the Polytechnic of Namibia in master’s graduates working in the region.
Yet as Ellen Hazelkorn, author of Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence (Palgrave 2015), pointed out, the U-Multirank results are only as good as the universities that choose to participate. “For example, last year they came up with ‘the best business schools,’ and there were three business schools in Poland [in the top 20]. You kind of scratch your head and you think, is that true? No, that’s what’s in the database.”
Hazelkorn, the director of the Higher Education Policy Research Unit at Dublin Institute of Technology, said that, in theory, U-Multirank has the right idea -- that universities should be judged on a wider array of indicators and that there are multiple ways of measuring excellence. But she is skeptical about the meaningfulness of some of the U-Multirank indicators. And she ultimately comes back to the point that no matter how good a ranking is, its value depends on whether it covers a large enough population of universities for the comparisons it generates to be useful. “My view on it is they should stick to Europe,” she said. “Get it to work for Europe. Forget conquering the world.”
Within Europe, opinions are mixed. All but two of the 21 members of the elite League of European Research Universities -- institutions that have generally done quite well in the traditional rankings -- have opted against supplying data to U-Multirank (the exceptions are the Universities of Barcelona and Zurich).
A recent European University Association survey of its members’ experiences with U-Multirank found that the initiative struggles with the same data-comparability challenges as the traditional rankings: “Whether a university took part in [U-Multirank] or not, all expressed major concerns regarding the interpretation of the [U-Multirank] indicators across different institutions and countries and thus the validity of the data provided,” the E.U.A.’s survey report concluded. The survey also noted that collecting data for the ranking required “considerable resources,” with 73 percent of institutions saying it took more time and effort than they anticipated. Furthermore, the benefits for participating universities were unclear: four out of 10 had no concrete plans to use the results.
Still, more than 85 percent of the E.U.A. member institutions already collecting data for U-Multirank said they planned to continue participating. The top reasons universities gave for participating were to increase their institutions’ visibility and international profile and to benchmark themselves against international peers. The third most common reason was a desire to be part of a project they perceive as important for higher education: "We feel that there is a need for a balanced assessment of universities in terms of different activities (education, research, innovation, etc.), which e.g. Q.S. and T.H.E. cannot give," one Swedish university reported (referring to the Q.S. and Times Higher Education world rankings). "U-Multirank has the potential to fill this need."
Frans van Vught, one of the U-Multirank project leaders and an honorary professor and former president of the University of Twente, in the Netherlands, said the growth in the number of institutions included in this year’s ranking exceeded expectations. “This whole process is being sponsored by the European Commission and we had funding for an increase of about 150 extra institutions; we’ve gone way beyond that already. We are not too eager to grow too fast. It’s a lot of work to do all these analyses and data manipulation.” (On the issues of comparability and validity, for example, van Vught emphasized that the rankers employ a number of statistical checks on institutionally reported data, including comparisons against figures in national databases where available.)
“There are lots of universities in the world,” van Vught said. “I think there are about 20,000. We have a little more than 1,200 -- which is the largest ranking in the world already.”