The World View

    A blog from the Center for International Higher Education


Playing the Rankings Game in Pakistan

Rankings have progressively lost meaning, particularly in countries lacking a strong academic tradition or where academic corruption is rarely punished.

October 16, 2016

Students and parents in Pakistan and in numerous other developing countries have become increasingly puzzled. They see mediocre universities ranked at the top while those generally considered much better are relegated to the middle. Since local university websites often misrepresent data, people turn to international ranking organizations, believing them to be neutral. But they soon discover that the rankings produced by those very organizations are quite bizarre.

Something is clearly fishy. Today more than a dozen commercial organizations based in the United States, Europe, and China purport to rank universities by academic quality. From some distant office, a handful of employees pass judgment on hundreds – perhaps thousands – of universities they have never visited. Instead, they fire off forms hoping that university officials will fill them out truthfully.

The websites of these businesses say nothing about how their own expenses and salaries are met. Unless confronted, they do not reveal that they often have service contracts with some of the universities they evaluate. Yet their published rankings are still taken seriously by prospective students and their parents, as well as by private and government agencies deciding policies and grant distributions.

The very idea of ranking universities is questionable. UNESCO’s 2010 report states: “Global university rankings fail to capture either the meaning or diverse qualities of a university or the characteristics of universities in a way that values and respects their educational and social purposes, missions and goals. At present, these rankings are of dubious value, are underpinned by questionable social science, arbitrarily privilege particular indicators, and use shallow proxies as correlates of quality.”

I agree. Rankings have progressively lost meaning, particularly in countries lacking a strong academic tradition or where academic corruption is rarely punished. There are countless examples of individuals and institutions who manage to turn reality on its head by manipulating numbers. Here’s an example.


Bogus Productivity

A recently released report by Thomson Reuters, a Canada-based multinational media firm, says: “In the last decade, Pakistan’s scientific research productivity has increased by more than 4 times, from approximately 2000 articles per year in 2006 to more than 9000 articles in 2015. During this time, the number of Highly Cited Papers (HCPs) featuring Pakistan based authors increased tenfold from 9 articles in 2006 to 98 in 2015.”

This puts Pakistan well ahead of Brazil, Russia, India, and China in terms of HCPs. As the reader surely knows, every citation is an acknowledgement by other researchers of important research or useful new findings. The more citations a researcher earns, the more impact he/she is supposed to have had upon that field. Research evaluations, through multiple pathways, count for 50-70 percent of a university’s ranking (if not more).

If Thomson Reuters has it right, then Pakistanis should be overjoyed. India has been beaten hollow. Better still, two of the world’s supposedly most advanced countries – Russia and China – are way behind. This steroid-propelled growth means Pakistan will overtake America in just a decade or two.

But just a little analysis shows something is amiss. Surely a four-fold increase in scientific productivity must have some obvious manifestations. Does one see science laboratories in Pakistani universities four times busier? Are there four times as many seminars presenting new results? Does one hear animated discussions on scientific topics four times more frequently?

Nothing is visible. Academic activity on Pakistani campuses might be unchanged, or perhaps even lower today, but it is certainly not higher than ten years ago. So where – and why – are the authors of the HCPs hiding? Could it be that these hugely prolific researchers are too bashful to present their results in departmental seminars or public lectures? The answer is not too difficult to guess.


The Paper King

Pakistan’s hyper-productivity owes to a new breed of university teacher – the Paper King. This internet-age phenomenon is the direct consequence of an incentive structure created in recent years by higher education authorities. The system places a near-total emphasis on numbers. Cash prizes, academic promotions, foreign trips, research grants, and national prizes have become increasingly linked to the number of publications and citation points earned by a faculty member.

The “king” rose to his throne because of these incentives, not because of greater creativity or mastery of his discipline. He can generate countless research papers without doing real, hard research. This requires mastery of several techniques: selective cutting and pasting, choosing research topics of low relevance and visibility, trivially changing parameters, inventing data, or plagiarizing ideas. Some chop up one piece of actual research into many publishable bits. Careful selection or manipulation of a journal makes publication a cinch; refereeing exists only in name. Some kings start their own bogus online journals; others become editors.

Surrounded by junior teachers, PhD students, and assorted flunkeys, the king enjoys a special place in the university. University administrations woo him because they know he knows how to get Thomson Reuters’s attention. Most importantly, higher education authorities also use Thomson Reuters data to rate and rank.

From Paper King to HCA (Highly Cited Author) is then a short journey. The king in Islamabad reaches out to friendly kingdoms everywhere from Jakarta to Shanghai, Teheran to Toronto. He cites their papers and they duly return his favor, a win-win situation. The king’s citation count rockets up, Thomson Reuters announces a “path-breaking” gain, the king collects his national awards and cash prizes, and his university rises in the international rankings.

Going through the list of HCAs cited by Thomson Reuters, I see several individuals from Pakistani universities who are known to the national community of scientists for their prodigious rate of publication. One boasts of four mathematics papers every month. The papers are vacuous, but the prolific authors are eagerly sought by Saudi Arabian universities. With citation counts thus enhanced, some Saudi universities have ended up ranked higher than German ones, a travesty that shakes one’s faith in university evaluations, if it does not destroy it completely.


Quality and Numbers

No one doubts that publishing research articles in good journals and counting citations is important in assessing individual and institutional academic achievement. Yes, numbers matter and having PhD students certainly helps generate a culture of research. Without research, a teaching institution atrophies.

But, as experience is now showing, associating research quality with numbers is creating more problems than it solves. Social scientists call it Campbell’s Law: “The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.” A corollary: robust systems may suffer some distortion but weaker ones can be willfully deformed and massively manipulated by prevailing local interests.

Of course, a university is about more than research – there’s also teaching. So why not judge by teaching quality? Unfortunately, even with intimate knowledge, judging teaching within a single department of a single university is notoriously difficult. It is plagued by subjectivity and by the very limited direct personal experiences of those called upon to assess teaching performance.

To conclude: the desire to quantify university performance in research and teaching has turned out to be seriously misleading. Until someone devises sufficiently robust metrics and procedures, keep your ear to the ground and listen carefully to intelligent students and teachers from within a university. They can tell you much more than the so-called ranking organizations.



Pervez Hoodbhoy received his undergraduate degrees in mathematics, physics, and electrical engineering and his PhD in nuclear physics from MIT. He has been teaching physics and mathematics in Pakistan for 43 years.

