Twenty-one scholars at a range of business schools are about to publish a joint call to change the way business schools are ranked. They argue that traditional methods -- which reduce business schools to ordinal rankings -- are deceptive and biased toward certain kinds of M.B.A. and other business school programs.
The article -- soon to appear in the journal Decision Sciences, abstract available here -- does not suggest that all rankings be abolished, and the authors see value in the release of the data on which many rankings are currently based. But the article argues for creating tools so that different types of students can weigh different characteristics of business schools.
"Many of us continue to acquiesce to methods of comparison we know to be fundamentally misleading," the article says. "We continue to allow the strategic initiatives we establish to be undermined by assessment approaches that not only restrict views of these initiatives, but which also can entirely skew holistic considerations of programs and institutional strength."
The article does not focus on any one particular ranking, but rather on issues embedded in the methodologies of numerous prominent rankings. (Many of the issues discussed in the context of business schools arguably apply to other rankings in higher education.)
The authors, many of whom are also administrators at business schools, are clear that they believe current rankings systems are influential with students and with business schools.
"Looking back on the last several decades, it is clear that these rankings, no matter how flawed, have had a remarkable impact on a wide range of decisions bearing on the ways that many business schools function," the article says. "In particular, in the absence of more customized and easily navigated approaches to reveal student-school matches, the available lists have led to substantial decision anchoring amongst prospective students. In turn, whatever data and aggregation approaches underlie these rankings have become de facto drivers of key administrative decision-making approaches at a large number of schools."
The article adds, "This is the case because these criteria can both impact demand as well as the means to attract new students. As a result, many business school leaders have expressed concern that rankings are in many respects little more than a 'game' that must be played to achieve status, generate visibility, garner alumni donations, recruit students and employers, and ultimately to play on the big stage. Unfortunately, the game being played by these institutions does not necessarily serve the best interests of key stakeholders impacted by the consequences of this approach to establish or maintain institutional legitimacy."
As an example of how common ranking methodologies run counter to educational values, the article cites figures on average salaries after graduation.
"It seems that at some point, these rankings implicitly imply that the best outcome for all students, across society, should be employment in the financial sector, specifically in careers on Wall Street. Institutions with the greatest numbers of such placements tend to rank the highest in third-party evaluations," the article says. "Thus, in the interests of 'winning the game' administrators interested in pursuing stronger rankings would do well to adopt every means possible to promote their finance programs and their finance faculties."
The article calls for business school leaders and organizations to work together to come up with a system for sharing data that could help prospective students and others evaluate the quality of programs.
Whether the criticism will have an impact remains to be seen. The paper is generating considerable buzz even before official publication, but so have some past efforts that have not succeeded in changing the rankings.
The entities that rank business schools, using the kinds of methodologies criticized in the article, defend their efforts as providing students with consumer information.
Francesca Levy, an editor at Bloomberg, told The Wall Street Journal that "there's big value in holding schools to the same standard and measuring them against the same, transparent criteria so students can make a better-informed decision."
And some who rank business schools are mocking the paper.
Dealbreaker, a blog on finance and business that does rankings, used this headline to discuss the paper: "Third-Tier B-Schools Don’t Like Rankings Placing Them in the Third Tier."
The authors of the paper are indeed not from the Wharton/Sloan/Chicago crowd -- they are from business schools at places like Ohio State University, the University of North Carolina at Chapel Hill, the University of Iowa and the University of Washington (along with some private institutions). The public institutions generally have strong local or regional reputations, often rooted in close ties to local or regional business needs. And many of these business schools have individual departments or programs that receive top rankings. But the authors aren't from the institutions that appear at the very top of national B-school institutional rankings.
Via email, one of the authors -- Elliot Bendoly of Ohio State -- stressed that the effort is not about "escaping" rankings, as some have suggested, but about "replacing" them. It is possible to share data without promoting the negative aspects of the current system, he said. "Unfortunately, those invested tend to jump to black-and-white conclusions without reading the entire piece."