In which a veteran of cultural studies seminars in the 1990s moves into academic administration and finds himself a married suburban father of two. Foucault, plus lawn care.
We all know the old joke about the drunk who’s looking for his keys outside the bar at night. He’s looking near the streetlamp, even though he dropped them half a block away, because the light is better there. When we look where it’s easy instead of where it’s likely, the results are predictable.
Those of us in the community college world feel the same way about IPEDS data. It’s easily accessible, but it sheds light on the wrong places. It looks only at “first-time, full-time, degree-seeking” students, and allows only three years to complete a two-year degree. In other words, IPEDS captures only a minority of our students. Those with previous college experience, those who attend part-time, those who move between institutions, those who transfer after a year, those who don’t intend to graduate, and those whose timeframes are longer either don’t show up at all, or show up as failures, even if they accomplished exactly what they set out to accomplish.
Until recently, though, the choice has often been between IPEDS data, as misleading as it is, and a sort of moralistic Great Refusal of all data. So IPEDS often wins by default. Better to look under the streetlamp than not to look at all.
That’s why I was so heartened to discover that California -- California! -- has developed a much more intelligent and useful tool for getting a handle on how community colleges are doing.
(I’ve lodged my share of criticisms at the California system over the last few years, but credit where credit is due.)
The first order of business is getting the measures right. For example, many students attend a community college for the first year, then transfer to a four-year school without graduating from the cc. That’s not failure; it was the plan the entire time. (Sometimes it’s for financial reasons, sometimes for familial reasons, and sometimes to prove that a spotty high school record wasn’t a true reflection of ability.) To my mind, a student who finds his land legs at a community college and then transfers on for a bachelor’s isn’t a sign of institutional failure at all, but in the IPEDS data, that student is indistinguishable from a pure dropout.
The California scorecard looks six years out, and looks at completion of any post-secondary credential. Presumably, that combines credit-bearing certificates, associate degrees, and bachelor’s degrees. So the kid who did a year at the cc and then went on to finish a bachelor’s is counted, correctly, as a success.
It also disaggregates students by academic preparation level. You’d think this would be basic, but some very smart people forget to do this. If a college has, say, a 25 percent graduation rate, that doesn’t mean that you’d have a 25 percent chance of graduating if you enrolled there. (I think the technical term for that is the “fallacy of division,” which is the flip side of the fallacy of composition.) Some students complete at much higher rates than others, and an institutional graduation rate can reflect the composition of the population as much as institutional performance. The only way to tease out which is which is through disaggregating the data. If one college gets a “good” overall number because it has relatively few developmental students, and another does well even with a bunch of developmental students, then the latter is actually more impressive. Breaking out by, say, whether students placed into developmental math upon enrollment can reveal a lot. California’s scorecard does that.
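The trap described above is easy to see with made-up numbers. In this sketch (the colleges, enrollments, and rates are all hypothetical, chosen just to illustrate the point), College B actually completes both its developmental and its college-ready students at higher rates than College A, yet posts a worse overall number purely because it enrolls far more developmental students:

```python
# Hypothetical cohorts: (completion rate, headcount) by preparation level.
# College B beats College A within BOTH subgroups, but its heavier
# developmental mix drags its aggregate rate below A's.
cohorts = {
    "College A": {"developmental": (0.15, 200), "college-ready": (0.50, 800)},
    "College B": {"developmental": (0.20, 800), "college-ready": (0.55, 200)},
}

for college, groups in cohorts.items():
    completers = sum(rate * n for rate, n in groups.values())
    students = sum(n for _, n in groups.values())
    print(f"{college}: overall {completers / students:.0%}")
    for group, (rate, _) in groups.items():
        print(f"  {group}: {rate:.0%}")

# College A: overall 43% (developmental 15%, college-ready 50%)
# College B: overall 27% (developmental 20%, college-ready 55%)
```

Judged by the aggregate alone, College A looks better; judged subgroup by subgroup, College B is doing the stronger job. That reversal is exactly why a scorecard that breaks out developmental placement tells you more than a single institutional rate.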
It also recognizes milestones on the way to a degree, such as successful completion of 30 credits. And in some cases, it breaks out ESL students as a separate cohort. That’s helpful, since the ESL population is meaningfully different from the developmental group, even though both face a longer march to a degree.
The improved scorecard isn’t supposed to be used for inter-institutional comparisons, but it will be. It can’t not. And that’s not necessarily a bad thing, if the people doing the comparisons know what they’re seeing. Any scorecard will be reductionist by definition; the key is in getting the essentials right. This is far better than anything else I’ve seen so far. Well done, California. Now you just have to get the whole “tuition” thing right, and you’ll be on your way.