Along with many other community college people, I’m glad to see that the National Center for Education Statistics is finally making some fundamental changes to its IPEDS numbers, which have painted an unduly negative picture of community colleges for decades.
IPEDS (the Integrated Postsecondary Education Data System) calculates a “headline” graduation rate based on the cohort of first-time, full-time, degree-seeking students who graduate from the institution at which they started within 150 percent of “normative” time. For a community college associate’s degree program, that means three years. Nationally, the average hovers in the low twenties, with some variation based on regional factors. (For example, regions with heavy concentrations of four-year colleges tend to have lower two-year college graduation rates, because more of the “honors” high school students take those other options. In states with sparser four-year options, the two-year colleges attract more of the high achievers.) Those low-sounding numbers give ammunition to a punitive “performance-based funding” movement, and serve to reinforce the stigma around open-admissions institutions.
Which might be okay if they reflected reality, but they don’t. The NCES also tells us that nearly half of bachelor’s degree graduates in the US have significant numbers of community college credits. Squaring that number with a graduation rate in the low twenties suggests that the graduation rate is missing quite a bit. And it is.
The time limit is a poor fit for many student populations with particular needs, such as English-language learners. They often take longer than three years to finish because they’re learning the English language as they go. That’s an impressive feat, and one worthy of support, but the new immigrant who graduates in four or five years, gets a job, pays taxes, and supports a family shows up in our numbers as a dropout.
The existing rate misses part-time students, students who are returning to college with credits from previous forays, and -- remarkably -- students who leave a community college prior to graduation but complete a bachelor’s degree “on time.” We have a lot of those. They come to community college to get some momentum, and possibly to prove to themselves or their parents that they’re academically capable and serious; once they have some momentum, they move on. I like to use theological language to describe them. They treat community college as purgatory, where they cleanse themselves of sin before moving on to the promised land. By any reasonable measure, both those students and the community colleges that serve them have succeeded; by the old IPEDS measure, though, they count as dropouts.
Part of that is due to an ambiguity in the term “degree-seeking.” “Degree-seeking” doesn’t necessarily mean “degree-seeking here.” For a student who starts at a community college to save money and/or overcome a shaky high school record, but who really doesn’t care about picking up an associate’s degree on the way to a bachelor’s, the IPEDS measure is a terrible fit. It generates far too many false negatives. The student who does a year at Brookdale before transferring to Rutgers, where she graduates on time, shows up in our numbers as a dropout. She lowers our graduation rate, despite successfully completing a bachelor’s in four years. Substantively, that’s absurd.
We’ve developed some homegrown measures that include early transfer, such as the Voluntary Framework of Accountability. But even there, we can’t always capture transfers across state lines. For smallish states in densely populated regions, that’s a major issue; we send plenty of students to New York City, Philadelphia, and other areas, but they show up as dropouts. I’d make a distinction between transferring early to Temple and dropping out, but the Feds don’t. The same applies to post-graduation wage data. Many of the jobs in NYC or Philly pay more than jobs locally, but only the local ones show up in our data, which drags our reported wage figures below reality.
Most of this would be inside baseball among sociologists if it didn’t impact our funding and public image. But it does. A measure that makes reasonable sense for a residential college that only admits academically strong students who can attend full-time is a lousy fit for us. The new-and-improved version still has some key gaps -- transfers across state lines still show up as dropouts, rendering false negatives for small states -- but it looks at more students. That’s a start.
In my perfect world, we’d have accurate data that we’d use for improvement, rather than punishment or stigma. We’d recognize that students stopping out to work for a while isn’t necessarily a sign of institutional failure; if anything, the ability to offer second chances is a feature, not a bug. Heck, while we’re at it, we’d have parity in per-student funding with four-year colleges and universities, and maybe some policies based on the students we actually have.
But for now, some recognition that the complaints we’ve been lodging for decades are largely correct is a welcome start. I’ll take it.