Institutional research

Planning Trumps Rankings

Americans are infatuated with rankings. Or at least they seem to love arguing about them. Whether it’s the Bowl Championship Series or the 10 best-dressed list, debate rages. Mostly, this is harmless fun (not counting the Texas Congressman who called the Bowl Championship Series “communism”). But trying to rank colleges and universities in the same way we do football teams has the potential to seriously confuse the public about issues of real importance to our society.

The recent suggestions that Clemson manipulates data to improve its standing in the U.S. News & World Report (U.S. News) rankings are a case in point. I don’t believe any of the charges. First, according to the reports of journalists who were there, the accusations incorrectly describe one of the most important elements of the University’s reporting to the magazine (whether benefits are included in salary calculations) -- if that one is wrong, it’s hard to see the others as credible. Second, my organization (the South Carolina Commission on Higher Education) has experienced nothing but the highest integrity from Clemson on data and on all other issues.

The controversy not only reveals a distressing misunderstanding of the key facts but also illustrates how the rankings can fail to represent what universities really do. To explain this, I need to describe Clemson’s strategic plan.

I first encountered Clemson’s strategic planning some seven years ago, when I was working in another state and visited South Carolina to review proposals for its endowed chairs program. At that time, Clemson had the best planning process I’d seen. I’ve not read the planning documents of every university in the country, of course, but I’m confident most academic leaders would agree Clemson’s is one of the best.

What makes the plan so good?

First, instead of the abstract rhetoric that characterizes the planning of too many of the nation’s research universities, Clemson’s plan is specific and pragmatic, with clear and measurable goals.

More important, Clemson does a wonderful job of focusing its goals on tangible benefits to students and the state. One of many examples is the Clemson University International Center for Automotive Research (CU-ICAR), which ties a goal of increasing research funding to a strategically selected emphasis area focused on automotive and transportation technology. CU-ICAR advances the state’s economic development and leverages faculty expertise to increase the quality of vehicles and associated products while preparing students for jobs.

Plans are good but implementation is essential, and Clemson’s annual “report card” shows dramatic progress -- and also demonstrates that the University isn’t focused on magazine rankings. Of the 27 items in the report card, only eight reference U.S. News -- things such as high graduation rates that are priorities of all universities. Put another way, I’m confident Clemson’s strategic plan would be largely the same if the magazine and its rankings didn’t exist.

Why does Clemson reference the U.S. News rankings at all, then? One likely reason is that the public expects it. Another is that the magazine’s metrics conform relatively well to Clemson’s category -- the selective research university.

But that brings up the other problem with the rankings. One of the strengths of U.S. higher education -- one of the few areas where our nation consistently leads the world -- is the breadth of options available to students. We have institutions that serve a wide variety of needs and support a great range of public purposes.

If you did exceptional work in high school and want to be in a research-focused college environment, we have great options, including Clemson and the University of South Carolina in my state. Excellent students can also choose from smaller universities focused on the liberal arts, or from less expensive research institutions. If you’re a late bloomer and need help reaching your fullest potential, we have wonderful choices there also. Ditto if you need to commute.

And then there are internal focus areas such as great books, military studies, community service, and many more. Trying to rank all these variations would be as silly as the Bowl Championship Series attempting a single ranking across all sports, not just football. It would also be pointless.

To illustrate the rankings’ narrow focus: U.S. News allocates 15 percent of its points to the quality of the incoming student body, 10 percent to financial resources, and most of the rest to factors that are strongly related to funding.

What if we rated universities on how they do with less-prepared students and shoestring funding? U.S. News wouldn’t be much help here. It does give 5 percent to a “value added” graduation rate, but financial resources alone carry twice that weight. If we used those criteria, I think all of our universities in South Carolina would excel (also our technical colleges, but that category isn’t rated by U.S. News). And, of course, my state isn’t alone in this.
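
To make the weighting arithmetic concrete, here is a minimal sketch in Python. Only the three category weights cited above (15, 10, and 5 percent) come from the rankings; everything else -- the two hypothetical schools, their scores, and the lumping of the remaining 70 percent into a single funding-related bucket -- is invented purely for illustration.

    # Purely illustrative: invented schools and invented scores.
    # The 0.70 "other" bucket is an assumption standing in for
    # "most of the rest ... strongly related to funding."
    WEIGHTS = {
        "incoming_quality": 0.15,
        "financial_resources": 0.10,
        "value_added_grad_rate": 0.05,
        "other_funding_related": 0.70,
    }

    # Two hypothetical schools, scored 0-100 per category. "Shoestring U"
    # excels at value added with less-prepared students; "Flagship U" is
    # simply better funded.
    schools = {
        "Shoestring U": {
            "incoming_quality": 50,
            "financial_resources": 40,
            "value_added_grad_rate": 95,
            "other_funding_related": 55,
        },
        "Flagship U": {
            "incoming_quality": 85,
            "financial_resources": 90,
            "value_added_grad_rate": 60,
            "other_funding_related": 80,
        },
    }

    # Composite score is the weighted sum across categories.
    for name, scores in schools.items():
        composite = sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
        print(f"{name}: {composite:.1f}")

Under these made-up numbers, Shoestring U’s 35-point advantage in value added is worth only 1.75 composite points at a 5 percent weight, so Flagship U finishes far ahead (80.8 to 54.8) on funding-related strength alone.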

If ranking is so problematic, why are we facing a blizzard of new comparative measures in higher education? The data-centric emphasis appears to stem from the thinking of an emerging group of national experts who regularly describe higher education as being in crisis, and therefore in desperate need of reform. More numbers, they suggest, will guide us to a solution.

I have two concerns about this approach. First, from my historian’s perspective, the whole reform agenda seems askew. American higher education is what it always has been -- overall, very successful at what it does. The core problem is that the importance of higher education to society has changed: colleges and universities are being asked to serve larger numbers of students, many of whom are poorly prepared or lack the stable financial background and family belief in the value of education that support persistence.

My second concern is with the idea that more data and more rankings and new formulas will lead to substantive change. I’m all in favor of more efficiency (Lumina’s Making Opportunity Affordable is a great initiative), and clearly we need to do a better job of holding ourselves accountable for student learning. I also agree that there are areas where additional data are needed. But I’m afraid that huge efforts at measuring and ranking will lead, at best, to marginal improvements in the things that matter -- notably student learning and success to graduation.

It’s also the case that the emphasis on data -- and the inevitable controversy -- has the potential to distract us from some core issues. For example, funding really does matter (U.S. News certainly thinks so). The reality is that public colleges and universities have been doing more with less for a very long time. But you can’t do that forever, and now we’re getting less for less. Using our meager resources to compile more data just in case they might be useful, and spending scarce time tinkering with funding formulas and creating ever more elaborate rankings, won’t solve the underlying fiscal crisis.

So what is the solution if higher education is to meet the nation’s new needs? My list has just three categories.

First, continuous improvement in student learning and in operational efficiency -- with a recognition that since campuses are largely maxed out on their own, greater efficiency will likely require additional multi-institutional approaches.

Second, more excellent planning of the focused and pragmatic type that Clemson has implemented (fortunately, I think this is already well under way with the current generation of presidents).

And, finally, there must be public recognition that, because higher education is essential to our economic development and quality of life, we can’t afford to have it keep sinking as a state funding priority. This last is one area where I think ranking does matter.

Garrison Walters is executive director of the South Carolina Commission on Higher Education.

The Case for More College Grads

Leading economist argues that U.S. has produced too few graduates for decades -- and that failure to reverse trend imperils economy.

'Institutional Mythology' vs. Facts

At a time that community colleges are under growing pressure to collect and analyze data to improve what they do, their capabilities in institutional research are far behind where they should be, according to a new report.

The Yearly Report Card

Education Dept. highlights increasing number of higher-level courses taken in high school, and reliance on federal grants for college.

Anonymity (Almost) Guaranteed

A request for records from Ohio University's online fraud reporting hotline causes officials to suspend the system, fearing violations of privacy.

Are 4-Day Workweeks the Future?

After experiments this summer, some colleges may go year-round with the option, which saved money for commuting students, employees and colleges.

Making Engagement Data Meaningful

Gallaudet U. tries to use annual student survey to measure the effectiveness of significant revisions in its undergraduate curriculum and institutional mission.

Taking Aim at the Supply Side

Study suggests the rise of community colleges has contributed to a plummeting bachelor's degree attainment rate.

Middle States Divided

Accrediting agencies are facing significant outside pressure over their independence and performance, raising questions in some quarters about the viability of education's system of institutional peer review. But one of the country's six regional accreditors of colleges is facing a threat from within, in the form of a nasty internal battle with its parent organization.
