On Tuesday I had the opportunity to talk briefly with someone at a college that has set up some competency-based degree programs about how they work. His college has been around for a long time, and it builds its CBE offerings on pre-existing, credit-hour-based programs. (For those keeping score at home, that’s not “direct assessment.”) Its CBE programs use a subscription model: students sign up for a block of time, and it’s “all you can eat” during that time. They have to complete a certain number of competencies to maintain “satisfactory academic progress” for financial aid purposes, but that’s a floor; the only ceiling is the completed degree.
As with College for America, he mentioned that they work closely with a lot of large employers to help upskill incumbent employees. (Yes, I just used “upskill” in a sentence. Admin-speak sneaks up on you…) He expressed pleasant surprise at how open-minded employers’ HR departments have been about a format that still strikes many academics as radical. If we hadn’t been pressed for time, I might have pointed out that non-credit workforce development programs have always been like that, but it didn’t seem urgent in the moment.
We discussed the campus ERP system challenges, the shift in the faculty role, the genesis of the competencies themselves, and the parallels between “new” competency-based education and “old” prior learning assessment. The real surprise, for me, was when he named their greatest challenge:
Procrastination is not unknown in traditional higher ed. In my undergrad days, back when history classes were short because the earth’s crust was still cooling and we still weren’t sure about the whole “written language” fad, the usually-quiet campus computer center really started hopping around the time that term papers were due. We all know the jokes about the correlation between grandparents’ deaths and final exams. Terms like “cramming” and “extension” and “all-nighter” emerged to describe behaviors around deadlines. As a writer, I can personally attest that in the absence of deadlines, absolutely nothing would get written.
And apparently, that’s the tragic flaw in a competency-based program. In the absence of deadlines, many students just never buckle down.
Procrastination happens in a traditional structure, too, but there, class meetings, semester boundaries, project deadlines, and exam dates provide motivation. They can provide the nudge that moves a particular task from “important, but not urgent” to “urgent and important.” In an unstructured, all-you-can-eat setting, students can coast.
I saw a little of that at Holyoke. The math department there, with my encouragement, developed a self-paced developmental math sequence. The idea was to allow students to move more quickly through the foundational stuff so they could get to the courses that “count.” In developmental math it seemed to make particular sense, since students’ preparation levels were markedly uneven; sometimes they’d already be solid with whole numbers, but hit the wall on fractions. Allowing them to blast through the stuff they already knew well would give them more time where they really needed it.
It worked as intended for a smallish group of students. It made little difference for a larger group. But for a plurality, self-pacing meant slowing down. That was not the intention, but it was what happened. I subsequently heard from colleagues at other schools that their results were similar. I wasn’t sure how much of that reflected procrastination and how much reflected engaged struggle. At the time, I assumed it was mostly the latter. Now, I’m less certain.
It’s possible to impose deadlines, of course, but the more you do that, the more you move from a competency-based format back toward a regular, if online, course.
Small sample sizes only tell you so much, of course, so I’m curious. Has anyone out there seen an elegant and effective solution to the procrastination problem in competency-based programs?