Evaluating Community College Rankings
The Washington Monthly has yet again irked some educators, as it did three years ago, by ranking what it calls “America’s Best Community Colleges” using openly available student engagement survey data.
Using benchmarking data from the Community College Survey of Student Engagement (CCSSE) and four-year federal graduation rates in an equation of its own making, the magazine attempts to rank the top 50 community colleges in the country in its latest issue. Though the periodical’s editors say they only hope to highlight “what works and what doesn’t” at these institutions by ranking them, CCSSE officials have denounced the use of their data in this way and argue it may do more harm than good.
“Community colleges are often underrecognized,” said Kevin Carey, author of the magazine’s community college rankings and policy director at Washington-based think tank Education Sector. “But there’s been a lot of attention paid to them, thanks to the president’s recent effort [with the American Graduation Initiative]. Since he supports investing and improving community colleges, we felt like it was a good time to ask, ‘What do good community colleges look like?’ If we’re going to spend a lot of money, let’s see what reflects best practices out there.”
Carey admitted that such a ranking of community colleges would not be possible without data from CCSSE, a survey run by the University of Texas at Austin that goes out to students at around 650 two-year institutions and uses the results to judge the colleges on broad categories such as “active and collaborative learning,” “student effort,” “academic challenge,” “student-faculty interaction” and “support for learners.” Though every participating college’s survey data are made public, institutional officials are encouraged to compare their benchmark scores only to national averages and those of large peer groups, such as institutions of similar size or in a similar geographic area.
Despite warnings from CCSSE officials that its data sets were never meant to be used to generate college rankings, Carey defended both the decision to do so and the decision to weight CCSSE data at 85 percent of a college's ranking.
“We always equate admissions selectivity with quality,” Carey said. “Well, all community colleges have the same admissions policy, but they aren’t always as good as one another. Part of this was to find a way to talk about excellence in the sector. We’re publicizing information about best practices. We’re talking about it here, and this is an interesting and long-overdue conversation that we need to have at the federal level.”
Carey noted that these rankings could encourage some community colleges to seek out the best practices of others, starting something of a domino effect of reform initiatives. He also added that the list could serve as something of a consumer tool for students looking for a community college.
“I think there are some people who can’t choose their community college, but some can,” Carey said. “For instance, take our top college, Saint Paul College. Well, there are other community colleges in metropolitan Minneapolis-Saint Paul that aren’t listed. If you’re a student and have no information about which community college is better, you’ll probably go wherever is most convenient. But, if you do have some information, you might drive an additional 20 to 30 minutes to get to another community college. It might be worth it.”
Those without much choice in the matter of where to attend a community college, given their location, may also consider taking online courses from those institutions ranked higher in the list, Carey added.
Carey repeated his stance that the best community colleges ought to be lauded for their good work, so that others might replicate their success through similar reform measures. He noted that he never considered listing the “50 worst” community colleges in the magazine or continuing his list beyond number 50. Some of the worst-performing community colleges, he mused, may not have participated in CCSSE at all, and singling out the lowest-scoring institution that did would be an unfair punishment. Still, he acknowledged that, conversely, some of the best-performing institutions might not have participated in CCSSE, though he considers this less likely.
Kay McClenney, CCSSE director, criticized Washington Monthly’s use of CCSSE data in creating a ranking of community colleges, calling it both “inappropriate” and “unauthorized.” She noted that she turned down the publication’s request for a more user-friendly version of the publicly available CCSSE data sets, adding that the magazine likely pulled the data from CCSSE’s website in what must have been a very tedious process.
CCSSE, McClenney argued, is a tool best used when its results are reviewed internally. She added that it does not make sense to compare one community college directly to another, given the significant differences in missions, socioeconomic status of students, budgets and other factors. She said that using broader benchmarks and peer groups is a better way to judge.
“Benchmarking is a process that is entirely different from rankings,” McClenney said. “Our major issue here is that ranking just oversimplifies what’s going on in these colleges. It doesn’t take into consideration major variables. And, from a statistical standpoint, there isn’t that much of a difference between, say, number one and number 15 on the list. It creates a false impression.”
Though she disagrees with Carey’s use of CCSSE data, McClenney did at least find value in his reasons for doing so.
“I grant that [Carey] has positive purposes here,” McClenney said. “He’s attempting to do something he believes is for the cause of goodness. I sympathize with the idea of institutions learning from one another. That’s something we promote in our work. We just disagree that ranking is the way to go about it.”