This piece in IHE drew no comments the day it was published, which, I’ll admit, surprised me. It was one of the most hopeful pieces I’ve read in a long time.
It’s about the Community College of the District of Columbia, a new institution growing out of the University of the District of Columbia. As many people know, the District of Columbia has some issues with poverty, crime, and public school performance. Just a few. Not like you’d notice. So a new community college there makes a world of sense.
But what’s especially heartening is that, as a new institution, it’s actually taking the (vanishingly rare) opportunity to build all of its systems from the ground up around a robust assessment program. Put differently, it’s building an experimental ethic into its design.
Go, CCDC!
Too often, assessment measures are appended as afterthoughts, which, in fact, is exactly what they are. Departments generally consider themselves the unquestionable local experts in their fields, above being told anything by anyone about what they teach. Accordingly, they don’t see much point in assessment, and they treat it as meddlesome busywork to be minimized when it can’t be entirely ignored. They assume they already know the answers, so they don’t like being asked questions.
CCDC is reversing the order. The burden of proof is not on the data; it’s on the programs. If the programs fall short, presumably, they’ll have to adjust. Expertise will be in the service of the mission.
It’s still early in the game, and I’d expect the ‘pushback’ in various areas to get gradually stronger over time. Some people will try, sincerely or not, to explain programmatic failures by reference to a lack of funding; if the college buys that, it will quickly fall into the same sinkhole that has swallowed much of the rest of higher education. The inevitable lag between when decisions need to be made and when the data actually comes in will create some awkward moments. There will be honest and real disagreements over the interpretation of some of the findings, some of which will probably become heated. To the extent that destination colleges base their transfer-credit decisions on other criteria, there may be gaps between what the data says and what’s politically possible. And some interpretations will probably be unfortunate, reductionist, or otherwise flawed.
But still. This is absolutely the right way to go. The community college exists to serve the students and the community, rather than the faculty. That means relocating ‘truth’ from ‘he who huffs and puffs the loudest’ to ‘what actually happens with the students.’ Allocating resources based on data is much easier if the data measures are built in from the beginning. The students are too important, and too vulnerable, to trust their fates to the way things are usually done. We know perfectly well how they’ll fare if that happens.
If anything might help, this might. Best wishes, CCDC. I’m rootin’ for ya!