Whose Top 10%?
WASHINGTON — The Aspen Institute College Excellence Program released a list of what it considers the 120 “best” community colleges in the country Monday, kicking off a multistep, data-driven process for identifying a single institution to receive its $1 million award for “community college excellence.”
And though prize officials say they are simply trying to spotlight institutions that are successful in helping students earn college credentials, so that others can learn from their methods, critics argue that the selection process unfairly attempts to rank and compare community colleges using data systems that are inadequate to the task.
The Aspen Prize, which was introduced at last year’s White House Summit on Community Colleges, attempts “to recognize community colleges with outstanding academic and workforce outcomes in both absolute performance and improvements over time.” Prize officials hope that by “focusing on student success and lifting up models that work,” they will be able to “honor excellence, stimulate innovation and create benchmarks for measuring progress.”
Several key Obama administration officials, including Education Secretary Arne Duncan, were on hand for a rollout event Monday at the Newseum, mostly offering their support for the Aspen Prize and its goal of spotlighting top performers in the community college sector. Their support of this competitive prize led many in attendance to compare it to the administration's Race to the Top program, which pitted states against one another for federal funding of K-12 reform efforts. (The Washington Post's Valerie Strauss wrote a blog post after the event arguing that "what community colleges really need across the board are resources. And that requires smart state and federal policy, not another contest.")
Keith Bird, former chancellor of the Kentucky Community and Technical College System, co-chairs the prize’s data/metrics advisory panel. He discussed how the prize sponsors came to identify the 120 “best” community colleges, saying that prize officials started with information from the Integrated Postsecondary Education Data System (IPEDS), the primary federal database for colleges and universities. To narrow the nearly 1,200 community colleges in the country down to 120, Bird said, his group used an equation giving equal weight to “performance,” as measured in degrees completed and persistence from semester to semester; “improvement in recent years”; and “performance with minority and low-income students.” (More detailed information on the methodology and prize eligibility can be found here.)
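The equal-weight scheme Bird describes can be illustrated with a small sketch. This is a hypothetical reconstruction, not the Aspen Institute's actual formula: the metric names, the normalization to a 0–1 scale, and the sample figures below are all invented for illustration.

```python
# Hypothetical sketch of an equal-weight composite score like the one
# Bird describes. Metric names, 0-1 normalization, and sample data are
# assumptions for illustration; the actual Aspen formula is not shown here.

def composite_score(performance, improvement, equity):
    """Average three metrics (each assumed normalized to 0-1),
    giving each equal weight."""
    return (performance + improvement + equity) / 3.0

def top_n(colleges, n):
    """Rank colleges (list of metric dicts) by composite score
    and return the top n names."""
    ranked = sorted(
        colleges,
        key=lambda c: composite_score(
            c["performance"], c["improvement"], c["equity"]),
        reverse=True,
    )
    return [c["name"] for c in ranked[:n]]

colleges = [
    {"name": "College A", "performance": 0.8, "improvement": 0.5, "equity": 0.6},
    {"name": "College B", "performance": 0.6, "improvement": 0.9, "equity": 0.7},
    {"name": "College C", "performance": 0.4, "improvement": 0.3, "equity": 0.5},
]
print(top_n(colleges, 2))  # College B's average edges out College A's
```

Because the weights are equal, a college strong on improvement can outrank one with higher raw performance, which is consistent with the prize's stated aim of rewarding both absolute outcomes and gains over time.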
Bird said that, next, a selection committee will collect new data from the 120 eligible institutions to determine a top-10 list of institutions, which will be unveiled in September. He noted that this next committee will use three data points measuring “student success” to whittle this list down further: “completion outcomes,” “labor market outcomes” and “learning outcomes.” Though he acknowledged that there are not readily available and common measures of things such as “labor market outcomes,” he said he hoped participating institutions would share with the committee the ways in which they measure success in these areas. Eventually, he added, another prize jury would make site visits to the top 10 institutions and conduct further analyses before it announces a winner in December.
The current list of 120 institutions includes community colleges from 32 states. Large, well-known institutions such as Miami Dade College, which has more than 170,000 students, are included, as are small institutions such as Carver Career Center, in West Virginia, with only about 200 students.
The process has its critics, many of whom point to inconsistencies in data collection that make comparisons between community colleges difficult. Mark Schneider, a vice president at the American Institutes for Research, voiced some of his concerns during the question-and-answer period at the event. Though Schneider expressed misgivings about the use of IPEDS to measure completions — given its shortcomings in tracking successful transfers in and out of community colleges, for example — he was willing to concede that it is the best data source currently available by which to compare institutions. Still, he took issue with how the prize was judging institutions on their “learning outcomes” and “labor market outcomes.”
“I’m really curious, since I’ve been thinking about this for many, many years, [what] the measures [are] that you have for both of those other two categories,” said Schneider, former commissioner of the U.S. Department of Education's National Center for Education Statistics.
“The learning outcomes is a giant enchilada, if you will; I have no idea how you measure it. You have [the Community College Survey of Student Engagement], but that’s just process, that has nothing to do with actual learning. With the work outcomes, of course, you want to know if people are employed … but that’s a spotty process…. What are your data definitions? How are you verifying if what one college says we’ve done is the same as another college — because ultimately you are comparing different schools and the question is, are you really measuring on the same metrics?”
Jane Oates, assistant secretary for employment and training administration at the U.S. Department of Labor, defended the process from Schneider’s critique, arguing that community colleges can submit information such as employer surveys and coursework portfolios to meet these measures.
“The nice part about the metrics of this prize, and former NCES commissioners may not agree with this, but there’s validity in measures that show real results," Oates said. "If a local hospital is hiring your RN graduates over someone else’s graduates from a different sector, that says something, and that should be a data validator that we look at as we move forward, because most of the kids who go to college anywhere need a job — those without trust funds. But definitely we know those who go to a community college ... go forward to get that education to get a better job and get themselves more opportunity. We need to be open to what indicators ... may be different in Kentucky and Massachusetts and New Jersey and California, and we should be open to that.”
Joshua Wyner, executive director of the Aspen College Excellence Program, told Inside Higher Ed after the event that he stood by the list of 120 community colleges as including institutions that have “achieved greater excellence on IPEDS measures than others.” In the next step of narrowing down the list of 120 institutions to 10, he said it was the goal of the prize to “bring sense and comparability to nonstandard data systems.” He admitted that there would be some level of subjectivity to the process in terms of determining the reliability and validity of the data.
“I welcome criticism of this list, if people would have specific ways to suggest improving it,” Wyner said. “In fact, I want that kind of conversation. I want people to question what it means to be excellent in community colleges. What I’m not interested in or don’t think we can continue to do is to say, ‘Well, because we haven’t done X, there’s no valid way to measure these institutions.’ For too long we’ve wallowed in the diversity of community colleges and how different they are and how the non-credit side and credit side compare and the regionalism, and we’ve recognized very clear differences between community colleges and throw up our hands and say, ‘They’re not even comparable, so don’t even try.’ I think that’s really damaging to say.”
Wyner also defended the list, noting that the prize is looking at community college success at an institutional level. In other words, those well-known community colleges that have benefited from efforts to improve select programs but have not “moved the needle” more broadly are not on this list.
“I would rather use imperfect, but I would argue rational, data that are campus-wide to identify the best than the processes we’ve engaged in in the past,” Wyner said.
Officials from some colleges left off the initial list of 120 institutions expressed disappointment with the selection process.
“It’s disappointing that [neither] Montgomery College [nor] any other community college in Maryland would ... be considered in the 120 eligible institutions for the prize," said Elizabeth Homan, a spokeswoman for Montgomery College. “We noticed that they used IPEDS data, which doesn’t always demonstrate community college success…. For transfer institutions like Montgomery College, we have many students who transfer before they earn their degree. Certainly Montgomery College considers itself a top community college, and we’re proud of the success our students have achieved, whether they graduate or transfer or come to us for just a few classes to help them with their career.”
Other outside observers were dubious of any attempt to “rank” community colleges. For instance, The Washington Monthly ranked what it called “America’s Best Community Colleges” last year using CCSSE data, to the dismay of many community college advocates who argued the data were not meant to be used in such a way. And though Aspen officials avoided using the word "rank" when talking about their list of 120 institutions at Monday's event, the word features prominently in an Aspen press release describing the list.
“We really don’t think ranking community colleges is appropriate or particularly helpful for our institutions,” said David S. Baime, senior vice president of government relations and research at the American Association of Community Colleges. “I think one good thing about Aspen is that it places institutions on a multifaceted metric of performance as opposed to just one number…. Still, if people view this as a ranking, we think that’s beside the point.”
J. Noah Brown, president of the Association of Community College Trustees, concurred.
"Personally, I'm not a fan of ranking," Brown said, arguing instead that the Aspen prize is simply highlighting top performers in the sector to bring good ideas to other community colleges interested in improving their completion efforts.
The Aspen Institute answered the frequently asked question — whether the winner of the prize will truly be the "best community college in the country" — in the following manner:
"The Aspen Institute recognizes that community colleges in the U.S. serve a huge variety of functions for immensely diverse populations of students," the website reads. "The selection criteria for the Prize have been designed to be comprehensive and cognizant of that variety, while also defining institutional excellence in a standardized way. Specifically, the winner of the Prize will be an institution that has demonstrated excellence in both performance and improvement over time, equity in outcomes among all student populations, and a deliberate and sustained focus on using data to guide practice and policy to improve outcomes. In the end, the comprehensive nature of our three-round process will yield a list of finalist institutions and a winner that have achieved truly exceptional results for students."