Unlike novelists and playwrights, journalists tend to lead with the best material we have. So here is mine: in the controversy between U.S. News & World Report and Sarah Lawrence College, a college alleges that a national magazine is making up data so that the magazine can continue conducting its university rankings. In Canada, by contrast, one of the country’s leading universities is making up data in an attempt to prevent a national magazine from conducting university rankings. Last week in Inside Higher Ed, Indira Samarasekera, the president of that university, the University of Alberta, urged American colleges to follow the Canadian lead. American university presidents might want to learn a bit more about what is actually going on in Canada before taking that advice.

The example set by a number of Canadian universities in 2006 involves a level of information suppression that I suspect neither the American public nor students, to say nothing of university administrators, would countenance. It is one thing to decline to turn over SAT scores to U.S. News when your college no longer gathers SAT scores to evaluate students. It is quite another to refuse, as some Canadian universities did, to make public your university’s average entering grades, its graduation rate, the number of international students, or its faculty count. Yet that is exactly what happened at many Canadian universities in the fall of 2006.

Rankings were intended for students, to put the focus on students and on quality. Unfortunately, it is administrators, in both Canada and the United States, who have become the most engaged, even obsessive, audience. That has not been good for either rankings or universities. At Maclean’s, we have good reason to believe that students use our rankings as intended: as one of many sources of information about universities, and as one source – an important source, but not necessarily the most important – to help them and their families make their university decision. Students give our rankings the appropriate weight. And university administrators? Perhaps we should be flattered, but they often seem to accord our rankings a weight that can only be described as disproportionate.

Maclean’s, Canada’s main national newsmagazine, has been conducting a ranking of Canadian universities since 1991. The goal is to provide a broad evaluation of the quality of undergraduate education at 47 ranked universities, all of them public institutions. Maclean’s constructed its original methodology in cooperation with Canada’s universities, and has modified and improved it over the years in consultation with them. For example, in 2003, at the request of many universities, Maclean’s added retention rate as a new rankings measure. In 2002, the magazine modified its class-size measure and introduced a grade distribution chart, both steps taken in light of feedback from universities. In her article, Dr. Samarasekera says that Alberta high school grades cannot easily be compared to those from other provinces, in part because Alberta has a grading system in which a “student’s final achievement level is defined by a graduation exam not used in other provinces.” However, the Maclean’s ranking has for a number of years attempted to take account of that difference. We ask Alberta universities to provide their first-year students’ average high school leaving grades. We do not request provincial exam results, even though those exam results are used in Alberta university admission decisions. That change was made at the behest of Alberta universities.

Dr. Samarasekera writes that she has “just learned” about Maclean’s introducing a new issue devoted to research, graduate schools and professional schools. In fact, I wrote Dr. Samarasekera and other university presidents to announce the issue last August. It is a response to complaints from many large research universities that the rankings, which evaluate undergraduate education, leave readers with the mistaken impression that undergraduate education is all a university does. In that letter to presidents, Maclean’s also introduced its new online Personalized University Ranking Tool. Dr. Samarasekera and others had argued that, since every student is unique, each should be able to build a unique assessment of universities, based on her own preferred criteria. We agreed. Since last November, students and parents have been able to go online and instantly create a personalized ranking of Canadian universities, picking and choosing from Maclean’s data to build an assessment of what matters most to them.

In the rankings, published each November, we group universities into three categories – medical-doctoral, comprehensive, and primarily undergraduate. Universities are evaluated on 22 to 24 indicators, depending on the category. And though the rankings give some weight to the university’s reputation, as judged by a survey of knowledgeable observers and peers, it is considerably less than the weight in the U.S. News survey: 16 percent vs. 25 percent.
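For readers who want to see the arithmetic such a weighted ranking implies, here is a minimal sketch in Python. The indicator names and all weights other than the 16 percent reputation figure cited above are illustrative assumptions, not Maclean’s actual formula; the second set of weights shows the idea behind the personalized tool described earlier, with a reader supplying her own priorities.

    # A minimal, illustrative weighted-composite ranking. The indicators and
    # most weights are hypothetical; only the 16 percent reputation weight
    # comes from the article. This is not Maclean's actual formula.

    def composite_score(indicators, weights):
        """Weighted average of normalized indicator scores (each 0-100)."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
        return sum(weights[name] * indicators[name] for name in weights)

    # Hypothetical normalized scores for one university (0-100 scale).
    university = {
        "reputation": 78.0,
        "entering_grades": 85.0,
        "graduation_rate": 72.0,
        "class_size": 64.0,
    }

    # Reputation at 16 percent, per the article; the other weights are assumed.
    magazine_weights = {
        "reputation": 0.16,
        "entering_grades": 0.30,
        "graduation_rate": 0.30,
        "class_size": 0.24,
    }

    # A personalized ranking simply swaps in the reader's own weights.
    student_weights = {
        "reputation": 0.05,
        "entering_grades": 0.15,
        "graduation_rate": 0.40,
        "class_size": 0.40,
    }

    print(composite_score(university, magazine_weights))  # approx. 74.94
    print(composite_score(university, student_weights))   # approx. 71.05

The 0-100 normalization is a simplification; the point is only that the weights, not the underlying data, embody the editorial judgment, which is why shifting weight away from reputation changes the result.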

If universities were like governments or publicly traded corporations, the data we seek (and so much more) would all be public. Journalists would focus on the job of covering universities, rather than the task of trying to pry loose some snippets of basic comparative data. The task of the editor of the Fortune 500 (and I used to be the editor of a magazine that publishes the Canadian equivalent) is relatively simple, at least in terms of information-gathering. Performance measures for public corporations are easily available to anyone with a computer and a few minutes to spare. You don’t have to ask companies for permission to publish their top-secret earnings-per-share numbers. The information isn’t top secret.

Some of the data points that Maclean’s uses in its evaluation can be gathered by Maclean’s itself: the aforementioned reputation survey, along with the number of national awards won by students and faculty at each university, from Rhodes Scholarships and Fulbright Awards to 3M Teaching Fellowships and Guggenheim Fellowships. Some of the information is available from third-party sources: four measures of library size and expenditure; each university’s operating budget per student; the percentage of budget spent on student services and on scholarships and bursaries; and the number and value of research grants obtained from the three national granting councils covering medical research, science and engineering, and the social sciences and humanities.

And then there is the information that we have always had to request directly from universities. This information is largely quite basic, and I suspect that many readers will be stunned to discover that a number of Canadian universities in 2006 simply refused to make it public. The key information that Maclean’s must ask universities for directly, and which the boycotting universities refused to provide, includes:

  • The number of first-year students.
  • The percentage of first-year students from outside the province.
  • The percentage of first-year students who are international students.
  • The percentage of graduate students who are international students.
  • The average high school grade of first-year students, on a scale of 100.
  • The proportion of first-year students with a high school leaving grade of 75 per cent or higher.
  • Graduation rate (percentage of full-time, second-year students who go on to graduate within the expected time frame).
  • Retention rate.
  • Alumni giving, as the percentage of alumni who give.
  • Number of faculty.
  • The percentage of students enrolled in each of six class-size ranges (1 to 25, 26 to 50, 51 to 100, 101 to 250, 251 to 500 and over 500).
  • The percentage of first-year classes taught by tenured or tenure-track professors.
  • The percentage of faculty members with Ph.D.’s.

Universities that refused to make this information public invoked a variety of justifications, and since we don’t have the space to consider them all, allow me to examine some of those cited here.

According to Dr. Samarasekera, comparing university entering averages across Canada is unfair and inaccurate, because Alberta high schools “employ a different grading system -- believed to be more rigorous” than those in other provinces. This is something that Alberta universities have long asserted. What Dr. Samarasekera failed to mention is that for students commencing studies in the fall of 2004 -- the most recent year for which University of Alberta entering grades are available -- the University of Alberta, a very large university with more than 35,000 students, had one of Canada’s highest entering grade averages. Alberta’s 2004 entering grades, as displayed in the 2006 Maclean’s rankings, were in fact higher than those of 15 of the 17 ranked Ontario universities, including the University of Toronto, a research powerhouse that until last year was the perennial number-one finisher in the medical-doctoral category. And if one compares the 2004 Alberta entering average to the 2005 entering averages of Ontario universities, one finds that the University of Alberta has a higher entering average than any Ontario university. Given these facts, it is hard to argue that Alberta is suffering because of grade inflation elsewhere.

Readers may be wondering: Why am I citing the University of Alberta’s entering grades for students admitted in the fall of 2004? That’s a long time ago; those students are now juniors. Surely more up-to-date information is available? No, it isn’t. The University of Alberta has released an average entering grade for the first-year class admitted in the fall of 2005, but it is a made-up number. The data is an invention. The University of Alberta knows what the real entering average of the 2005 freshman class is, but it has instead chosen to publish a fabricated number. You can view it on the university’s Web site.

According to the university, the entering average in 2005 was 88 per cent. However, it puts a big asterisk next to the 88 per cent, noting that this is not the actual average high school grade admitted students received, but a modified grade, which has been “adjusted for Canadian inter-provincial variables: grading practices, age group participation rates, student preparedness (international PISA survey by the Organization for Economic Cooperation and Development).” So what is the average high school grade of Alberta students? The real, pre-adjusted grade? How has it been adjusted? We don’t know.

Last year I met twice with Dr. Samarasekera, and I met at other times with her provost and other senior officials. At these meetings, the university demanded that Alberta grades be adjusted. Maclean’s asked them to propose a formula by which this could be done. Don’t just make a demand, we said; make a proposal. Let’s get a national dialogue going. The response: nothing. But the university has, apparently, come up with a formula. They just won’t tell anyone what it is. We have asked the University of Alberta for the formula since last November, and have received no answer. Maclean’s isn’t making up information, but a taxpayer-funded university is. Is this a model U.S. universities wish to follow?

Dr. Samarasekera also complained about a Maclean’s survey of recent graduates, which we conducted in the spring of 2006. She says that this graduate survey is one of the reasons the university is unwilling to provide data for the Maclean’s ranking: “in the case of the graduate survey, we argued that surveying alumni reflects an institution’s past, not its present, particularly in a province such as Alberta, where the government has poured billions of dollars into postsecondary education in the last few years.” Just to be clear, the graduate survey was not part of the fall ranking; it was included in an issue published in the spring. The existence of a graduate survey conducted six months before the rankings may seem like a strange reason to pull out of them. It will seem even more implausible when you learn that, when it comes to student surveys vs. alumni surveys, I listened to Dr. Samarasekera and we have largely followed her advice.

In 2006, in lieu of participating in our graduate survey, Maclean’s asked the University of Alberta and other universities to make public the results of two national student surveys, the Canadian Undergraduate Survey Consortium (CUSC) and the National Survey of Student Engagement (NSSE), which has become popular in both Canada and the U.S. Both are surveys of current students. In 2007, we did not conduct a survey of graduates, but instead asked all universities to make public their most recent CUSC and NSSE results. In other words, we published surveys of current students, commissioned by the universities themselves, imposing no additional costs in time or money on the universities, and providing potential students with information previously accessible only to university heads. Alberta participates in both CUSC and NSSE, and its results, along with those of nearly every other university in Canada, are featured in the University Student Issue, an issue of Maclean’s now on newsstands.

Many universities were unhappy with Maclean’s publishing these student surveys. Some universities do not have the most fantastic results. But as I repeatedly explained to every member of the press who talked to me about the data, the fact that the University of Alberta does relatively poorly on both CUSC and NSSE does not make it a bad university. It does not mean that you should not enroll there. But surely it is a piece of information that students might want to know, and have a right to know. The same goes for this fascinating finding from NSSE: every one of the 28 Canadian universities in the survey has an NSSE benchmark score for “student-faculty interaction” that is lower than the average of American colleges. Does this make these institutions ones to avoid? No. But it does raise interesting questions.

On my blog, you’ll find a more detailed critique of Dr. Samarasekera’s article and arguments. But I want to end on a hopeful note. I believe there is reason to be hopeful. Out of the battles of 2006, even those Canadian universities that were most opposed to the Maclean’s rankings have come to realize that their way of solving this “problem,” namely the suppression of basic information, is no solution at all. Many universities, including the University of Alberta, are taking part in discussions to create what is being called the Common University Dataset-Canada. If this is done right, and that remains a big if, it would mean that the public – journalists included – would have access to useful statistics on each university’s operations and performance, measured on a common basis. Discussions are still in the early stages, but the ideal is a sound one, and it is achievable.

As Dr. Samarasekera wrote earlier this week, in her final bit of advice to American universities, “remember to stay united, don’t put anything in writing you don’t want FOIAed, involve your stakeholders – students, faculty, governing boards – and make your data public and easily accessible for any who wish to find it.” If Canadian universities would merely follow the last item on that list, as many of them did not in 2006, everything else would be unnecessary. Journalists like me could then spend most of our time covering higher education, instead of engaging in an endless paper chase. We could focus on our jobs. So could university presidents.
