Accountability, Canadian Style

Government seeks to measure and reward quality. Professors grumble a bit. Ring a bell?
May 10, 2006

"It is essential to balance the government's desire for accountability with the need to respect the dual cornerstones of the university: institutional autonomy and academic freedom."

Sound familiar? It might to anyone who has been tracking the work of the U.S. Education Department commission studying American higher education, whose members' oft-expressed desire to hold colleges more accountable for their performance in educating students effectively and efficiently has elicited occasional howls of protest from associations of colleges or faculty groups. But much as the above statement sounds like some of the things they've uttered, it came instead from a report released Monday by a faculty group in Ontario, about the Canadian province's own fledgling accountability system.

Because all 18 colleges and universities in Ontario are largely financed by the provincial government, the agency that oversees higher education -- known as the Ministry of Training, Colleges and Universities -- has, like some state governments in the United States, sought to directly tie some funds for colleges to their performance. (There is no meaningful talk by the U.S. commission -- at least yet -- of tying federal funds to accountability measures.) A small fraction of the overall funds the government distributes to colleges has for several years been tied to three "key performance indicators": seven-year graduation rates, and job placement rates for graduates who are six months and two years out of college. A fourth measure, on student default rates, is also factored in.

When a new Liberal Party government assumed power in Ontario in 2003, it promised to significantly step up its spending on higher education, pouring $6.2 billion (Canadian) into colleges and financial aid over five years -- with the caveat, says Sheamus Murphy, senior communications adviser for the minister of training, colleges and universities, Chris Bentley, that “there was a desire to get some results for all this new funding.”

As part of the government’s plan, a sizable portion of the new funds would be distributed through a system in which the ministry would negotiate “multiyear funding agreements” aimed at both “providing some funding stability for a number of years,” Murphy says, and “setting out a certain number of indicators, a mix of common, province-wide and specific, local measures, that will lead to improved quality.”

But as the deliberations by the U.S. Secretary of Education’s Commission on the Future of Higher Education have shown so far, agreeing that quality matters and reaching consensus on how to responsibly and intelligently measure that quality are two different things. And the Ontario experience so far reveals much the same thing, says Henry Mandelbaum, executive director of the Ontario Confederation of Faculty Associations, a coalition of campus-based faculty groups that produced the report.

The previous government’s use of “key performance indicators” like graduation and placement rates was seriously flawed, the faculty confederation says in its report, “The Measured Academic: Quality Controls in Ontario Universities.” Graduation rates measure the quantity of degrees awarded without providing any sense of the quality of the education, and the job placement rates focus on “simple market outputs that are more contingent on general economic conditions than they are on what happens during a student’s time in university,” the report says.

“Most people recognize that these are extremely rough proxies for what would be an indicator of institutional or program quality,” says Barbara Hauser, secretary to the council and senior policy adviser for the Council of Ontario Universities, which represents the central administrations of the province’s 18 universities. “This has never been really a popular measure, but that's what the government is doing.”

The report released by the faculty organization Monday seeks to ensure that as the new government institutes its own accountability system, it avoids the pitfalls of the existing system, says Mandelbaum.

“We’re concerned that the government is going down that same road now – seeking to find things that are easy to measure, rather than finding things that look at the whole issue of quality,” he says.

The group’s recommendations include:

  • Dismantling the previous government’s performance fund and ending its use of graduation and placement rates to distribute money to Ontario’s colleges.
  • Adopting local rather than systemwide indicators of quality, which should be developed in consultation with faculty members and students. “Quality indicators at a provincial level would encourage a deadening uniformity across all universities,” the report says, and undermine individual institutions’ efforts to best serve their local areas.
  • Using quality indicators in ways that reward rather than punish institutions. Punitive schemes tend to hurt young and smaller institutions more than established ones, and more generally produce “unintended” results, the group argues.

Although the report by the Ontario Confederation of Faculty Associations points out numerous potential pitfalls in the provincial government's accountability effort, professors do not oppose attempts to ensure quality, says Mandelbaum. "We're very much in favor of quality -- all of our institutions are taking a significant number of measures themselves to promote quality, and it's not as if the provincial government does not have some kind of right and expectation as to what the postsecondary system would ultimately deliver," he says. "Our concern is that some governments, when they approach the question, do so out of their own political priorities rather than out of actual concern for quality."

The ultimate issue in Canada -- as in the United States, Mandelbaum suggests -- is that "the government wants to control, and institutions and faculty want to retain autonomy."

