Rising Up Against Rankings

Think it's impossible for educators to fight a popular magazine's irresponsible rating system? Look north, and you'll see it can be done, writes Indira Samarasekera.

April 2, 2007

Canadian universities are listening with great interest as the call to boycott the U.S. News & World Report rankings grows louder among our colleagues to the south. Many of our American colleagues say that they would like to resist the rankings but fear it can't be done, especially if only a few institutions act. I write to let you know that institutions can take on the rankings. About a year ago, a growing number of Canadian institutions began to raise the same alarm, ultimately resulting in 25 of our 90+ institutions -- including many of our leading universities -- banding together to take just such a stand against the fall rankings issue of Maclean's, our Canadian equivalent.

Why we did it:

It’s time to question these third-party rankings, which are in fact marketing driven: designed to sell particular issues of a publication, with the content then repurposed into even higher-volume special editions with year-long shelf life.

While postsecondary education has always liked grades and ranks -- they’re the trophies in our competitive arena -- presidents and other top administrators at our institutions also have an obligation to do what's right for our institutions in terms of championing our values and investing our resources.

Currently, many American colleges and universities have new presidents -- as was the case here in Canada a year ago. It is the role and obligation of a new president to question the status quo, especially long-standing practices that may have started a decade or two ago and have since evolved into a much larger administrative burden, with less advantage or validity than they appeared to have at their inception.

Setting the stage:

For years Maclean's collected various sets of data for its fall undergraduate institution rankings issue – some objective, some subjective, some pertinent, some irrelevant – and turned them into aggregated averages to arrive at one overall score for each institution. These aggregated scores are listed in "league tables," supplemented with some editorial coverage on our universities (and advertising by many of our institutions) to create the rankings issue. Sound familiar?

This annually annoying process begins with a request that each institution assist the magazine by collecting and reporting data in the format Maclean’s desires -- typically not the format we use in institutional research -- thus requiring a special effort and investment of time and resources.

Assistance is also requested in administering a student survey for the fall undergraduate rankings issue and a graduate survey to our alumni for the spring graduate school rankings, a product-line extension added in 2004 that doubled the burden. If we decline to conduct the surveys ourselves, they ask us to provide e-mail addresses to the magazine instead.

The showdown:

The new presidents’ examination of this process was triggered by the request for data and survey assistance for the spring 2006 graduate school rankings. Our uprising started when my colleagues at the University of Calgary, the University of Lethbridge and I -- presidents of the three largest universities in Alberta -- wrote a letter to Maclean's and met with the rankings editor and the publisher in January 2006 to express our concerns about the methodology of their undergraduate and graduate surveys and rankings.

Along with raising technical issues regarding methodology, we pointed out that a vastly different educational and grading system in Alberta – one of the highest performing K-12 systems in the world – makes comparisons of the grades of our incoming undergraduate students with the grades of incoming students in other provinces inappropriate. Our high schools employ a different grading system – believed to be more rigorous – and a student’s final achievement level is defined by a graduation exam not used in other provinces. In the case of the graduate survey, we argued that surveying alumni reflects an institution's past, not its present, particularly in a province such as Alberta, where the government has poured billions of dollars into postsecondary education in the last few years.

In our letter and meeting we offered to deploy the expertise at our institutions, from statistics to education evaluation, to improve the methodology. We also advised the editor that we would not participate further if the methodology remained unchanged. We got no reply.

In the meantime, we enlisted the support of David Naylor, who had recently assumed the role of president at the University of Toronto, a major research university that has historically landed at the top of the overall rankings. He weighed in, supporting our Alberta perspective from a national vantage point, affirming that institutions have different strengths and that aggregated rankings diminish those differences. Having this support was crucial. Rankings czars love to pretend that the only reason to criticize their work is that you didn’t come out on top, so our movement gained credibility with Toronto’s backing.

As President Naylor wrote in a newspaper op-ed last spring: "As academics, we devote our careers to ensuring people make important decisions on the basis of good data, analyzed with discipline. But Canadian universities have been complicit, en masse, in supporting a ranking system that has little scientific merit because it reduces everything to a meaningless, average score."

Equally important to our concerns about methodology were our growing concerns, as public universities, about using our resources to respond to the increasing number of data requests for rankings as more and more magazines, newspapers and associations are jumping into the entrepreneurial game of rankings. Using taxpayer money to feed sales-generating exercises by for-profit organizations does not align with our values or our responsibility to be accountable to the public -- no matter how much it is alleged the public loves the rankings.

As the deadline for the spring graduate school issue approached with no response on addressing the methodology, the presidents of the Universities of Alberta, Toronto and Calgary were joined by McMaster University, and together we officially declined to participate in the graduate survey. Faced with a demand to supply data for rankings with dubious methodology, we could no longer assist in misleading the public and our prospective students.

Into the fray:

We did not go public with our decision; Maclean's itself started a buzz about our boycott – a preemptive strike – knowing that controversy sells issues. At this point, we all still anticipated participating in the fall undergraduate rankings and continued trying to obtain a response from Maclean's staff on fixing the methodology for the fall issue. Months wore on as we attempted to work with the magazine; many phone calls went unanswered, and in the end the staff essentially dismissed our concerns, asserting that the magazine staff certainly knew more about statistical analysis than some academics.

Faced with this unwillingness to consider the requests of the universities, punctuated by the annual request for a sizeable amount of data for the fall issue, we four once again opted out of that rankings issue. But another buzz was growing among the universities. We were quickly joined by seven other presidents who asserted to Maclean’s that they, too, would withdraw if the methodology didn't change. Solidarity mounted and, in the end, 25 colleges and universities refused to participate in the fall issue.

Truth is, most of us already had much of the requested data on our Web sites, though not always in easy-to-locate places or formats, since it is posted as institutional research. The "boycott schools" countered by organizing themselves to post their data – albeit not reworked into identical form or the way Maclean’s requested it – and to make it easier to find on our sites. (The University of Alberta's information can be found here and also here; for comparison, the University of Toronto data are here.)

Just before their fall deadline, Maclean's filed a freedom of information request, but it was too late for us to respond. Most of us had already posted the data online, and we directed Maclean’s staff to our Web sites. In instances where the magazine staff couldn't find data on our Web sites, they chose to use the previous year's data.

Did it work?

We think that it did and continue to hope that collaboration with Maclean’s to improve the methodology and arrive at rankings we all find valid and useful lies in our future. Yet, while many allege that the rankings influence student and parent decisions significantly, particularly international students, at the University of Alberta we have seen no indication of that in our applications. In fact, our international applications are up 36 percent over last year.  

We feel that if we have succeeded in advancing our objective (it’s still early and time will tell) it is because:

  • Institutions of all types were involved, from the leading research institutions to small liberal arts colleges. None of us could have done this alone.
  • All the presidents involved had a joint communications strategy with a unified message, and all stayed on message. We stood united. None caved at the last moment to his or her own advantage.
  • Students at all 25 institutions were on our side.
  • Governing boards, faculty and staff came on board.
  • We contacted school counselors early on, explaining our position and supplying them with information on where to find institutional data on our Web sites.
  • We stood united to the end: we did not react after the issue came out, and all agreed not to use Maclean's rankings to promote our institutions.

Our coalition of the fed up continues to work together. Our goal: to adopt a common format for institutional data reporting on the Web, so all those in the ranking business can take what they want and leave us to our business of research, teaching and service.

Stay tuned to Canada for Part 2 as we've just learned that Maclean's is introducing an issue ranking professional schools and graduate programs. Sound familiar?

So Canadian academics are listening, watching and wishing our U.S. colleagues well. Remember to stay united, don't put anything in writing you don't want FOIAed, involve your stakeholders – students, faculty, governing boards – and make your data public and easily accessible for any who wish to find it. Good luck.


Indira Samarasekera is president of the University of Alberta.