Why should we even talk about the U.S. News & World Report higher education rankings? Everybody hates them. Nobody that I know thinks they’ve done anything good for students, professors or schools.

The answer is that I just can’t help myself. Wishing that the U.S. News college rankings would cease to exist will do nothing to mitigate their pernicious influence on our industry. We need to engage with this thing head-on.

Still, I’m going to try to pick my battles. Rather than simply adding one more voice to the chorus of criticism about the validity of the overall U.S. News rankings, I’m going to focus on just one of the lists that they publish: the Best Undergraduate Teaching ranking, which U.S. News publishes for both national universities and national liberal arts colleges.

Last week in Inside Digital Learning, I discussed the methodology U.S. News uses to construct these rankings. Basically, they are based on reputation; that is, on what college presidents, provosts and admissions deans think of the schools.

In last week’s piece, I argued that any alternative methodology for ranking schools on their commitment to undergraduate teaching should be measurable, input-based and actionable.

Basing rankings on inputs rather than outputs is certainly problematic. Ideally, we would measure learning progress. Assessments of learning progress, however, come with all sorts of challenges, such as what to measure and when to measure it. The last thing that higher education wants to do is follow K-12 into a system of high-stakes testing.

If we are going to use inputs to rank commitment to undergraduate teaching, we should at least be using the right inputs.

I’d like to suggest three (but I very much hope that you have some better ideas). Again, I’d like to emphasize that my suggestions are intended to make only marginal improvements to what is an inherently bad idea.

Ranking the commitment to teaching excellence among colleges and universities will never provide a nuanced or particularly useful snapshot of the reality of these institutions. Every school has its strengths and weaknesses, and it would be far better to invest energy in making those assets and deficits more visible, so that students can sort through the information and determine which school is the best match.

Still, if U.S. News is going to continue to publish this ranking, they could at least employ a more valid methodology.

Idea 1: Ask Different People

Are college presidents, provosts and admissions deans the right people to gauge peer institutions’ commitment to teaching? Maybe. But we should extend the survey to the people who are immersed each day in improving their schools’ capacities, resources and infrastructure around learning -- and who spend a great deal of time learning from colleagues at peer institutions about their teaching and learning practices.

I’d nominate people who are employed in these three roles. First, I’d survey directors of centers for teaching and learning. Next, I’d want to talk to directors of academic computing units. And finally, I’d include information from the senior person on campus in charge of learning innovation. All these learning professionals are strongly networked with the educational efforts and initiatives of peer institutions.

Idea 2: Include Student-Designer Ratio

Hiring instructional designers to collaborate with faculty on course design is among the strongest indicators of institutional commitment to teaching. Student-faculty ratios have long been used as indicators of teaching quality, but that measure benefits smaller schools. A student-to-instructional-designer ratio could be a more egalitarian measure: because most campuses employ so few instructional designers, hiring just a few can make a big difference at a school of any size.
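
To see why a few hires move the needle, here is a back-of-the-envelope sketch. The enrollment and staffing figures are invented purely for illustration and describe no real institution.

# Hypothetical figures, invented for illustration only -- not data on any real school.
schools = {
    "Small College":    {"students": 2_000,  "faculty": 180, "designers": 1},
    "Large University": {"students": 20_000, "faculty": 900, "designers": 4},
}

for name, s in schools.items():
    before = s["students"] / s["designers"]
    after = s["students"] / (s["designers"] + 3)   # hire three more instructional designers
    print(f"{name}: student-faculty {s['students'] / s['faculty']:.0f}:1, "
          f"student-designer {before:.0f}:1 -> {after:.0f}:1 after three hires")

Three hires barely change a student-faculty ratio, but they cut a student-designer ratio by a large fraction at schools of very different sizes, which is what makes the measure comparatively size-neutral.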

An instructional designer ratio also has the potential to include investments in online learning in measures of teaching commitment. We have witnessed the impact that online and low-residency programs can have on the quality of residential programs. Online programs bring new ways of thinking and new capabilities to traditional residential campuses.

Idea 3: Include Ratio of Active Learning to Traditional Classrooms

It should be relatively straightforward to apply the Educause Learning Space Rating System to evaluate the progress of schools in moving from traditional to active learning. Investments to align classroom design with the findings of learning science seem like a clear indication of institutional commitment to teaching and learning.

Ideas 2 and 3 have the advantage of moving away from subjective rankings -- ones based on reputation -- and toward more objective, quantitative measures of commitment to teaching and learning. They are measurable, and more importantly, they are actionable on the part of colleges and universities.
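
As a minimal sketch of how those two quantitative measures could be folded into a single score, consider the following. The school names, values, normalization and equal weighting are all my own assumptions for illustration, not anything U.S. News has proposed.

# A minimal sketch: combine the two proposed metrics into one score.
# All names, values and the equal weighting are assumptions for illustration.
schools = [
    {"name": "College A", "students_per_designer": 800,  "active_classroom_share": 0.40},
    {"name": "College B", "students_per_designer": 2500, "active_classroom_share": 0.15},
    {"name": "College C", "students_per_designer": 1200, "active_classroom_share": 0.30},
]

def normalize(values, higher_is_better=True):
    """Rescale a list of values to the 0-1 range across the schools being compared."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if higher_is_better else [1 - s for s in scaled]

designer_scores = normalize([s["students_per_designer"] for s in schools], higher_is_better=False)
classroom_scores = normalize([s["active_classroom_share"] for s in schools])

for school, d, c in zip(schools, designer_scores, classroom_scores):
    school["score"] = 0.5 * d + 0.5 * c   # equal weights, purely an assumption

for school in sorted(schools, key=lambda s: s["score"], reverse=True):
    print(f'{school["name"]}: {school["score"]:.2f}')

The point is not the particular weights but that every term in the calculation is something a school can observe and, with investment, change.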

Of course, none of these measures get at what is really important in teaching -- the faculty.

How could U.S. News -- or any other group trying to rank schools -- measure commitment to and support of teaching faculty? Ratios of full-time/tenure-track faculty to adjuncts might be one way. The problem is that many adjuncts are amazing and dedicated teachers, and that sort of ratio would be biased toward wealthier institutions.

There will never be a perfect -- or maybe even very good -- methodology to rank the commitment of schools to excellence in teaching and learning. But maybe we can do better than what U.S. News is doing now.

If you were employed at U.S. News, and were tasked to work on the teaching commitment rankings, what suggestions would you make?
