Where Analytics Go Wrong

Jeff Aird says until higher ed uses analytics in a self-aware and brutally honest way, it can’t fix the growing problems with student success and retention.

September 13, 2017

Colleges and universities have traditionally looked at completion and achievement gaps through an institutional lens. We naturally look outward to identify most problems. We see a K-12 system that produces depressingly low levels of college preparation. We see “at-risk” students with full-time jobs, family pressures and unrealistic expectations.

But what would happen if, instead, we looked at the institution through the lens of the at-risk student? What would they see as the problem? What could they teach us about our institutions?

Most student analytics initiatives follow an interventionist model. Students follow the traditional admissions and enrollment process until, at some critical point, they act in a way that suggests their likelihood of success has declined. Such behavior could include doing poorly on an early exam, missing a few class periods, dropping a course or not meeting with an adviser. Algorithms and statistical models identify these markers and signal the institution that intervention is needed. The hope is that this “just-in-time” extra support will meet students when they most need it and help them solve their problem.
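To make the mechanism concrete, here is a deliberately simplified sketch of the kind of rule-based early-alert logic the interventionist model relies on. The marker names and thresholds are hypothetical illustrations, not drawn from any real product or from the author's institution.

```python
# Toy illustration of an interventionist "early alert": per-student
# behavioral markers trigger a flag for intervention. All field names
# and thresholds below are hypothetical.
from dataclasses import dataclass

@dataclass
class StudentTerm:
    student_id: str
    early_exam_score: float   # 0-100 on a first major exam
    absences: int             # class periods missed so far
    dropped_course: bool
    met_with_adviser: bool

def risk_markers(s: StudentTerm) -> list:
    """Return the markers that would trigger a 'just-in-time' intervention."""
    markers = []
    if s.early_exam_score < 60:
        markers.append("low early exam score")
    if s.absences >= 3:
        markers.append("repeated absences")
    if s.dropped_course:
        markers.append("dropped a course")
    if not s.met_with_adviser:
        markers.append("no adviser meeting")
    return markers

# The institution receives a signal per flagged student.
alerts = {s.student_id: risk_markers(s)
          for s in [StudentTerm("A1", 72, 4, False, True),
                    StudentTerm("A2", 55, 0, True, False)]}
```

Note that every rule here is about the student's behavior; nothing in this model asks whether the exam, the schedule or the advising process itself is the problem, which is exactly the limitation the article goes on to describe.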

The trouble with this model is that it focuses on building new processes to fix the student’s problem. It largely ignores what we can do to keep students from becoming at risk in the first place.

The interventionist model, at least as practiced at most community colleges, assumes that students bring with them unique individual characteristics and backgrounds that, when played out in the higher education system, predispose them to failure. The belief is not only that the power of analytics can identify these students early enough, but also that these large, bulky and bureaucratic institutions can customize individual interventions to increase the likelihood of completion.

When a vendor of student analytics software recently explained to me that his tool could “identify which students are unlikely to return next semester,” I couldn’t help but let out a small chuckle. I responded by saying, “Yeah, it’s not that hard; just start pointing.” Community colleges interact with students in nonlinear ways and often experience more than 50 percent turnover of students every year. I didn’t need help finding the at-risk students. They were everywhere. I needed to better understand why the systems and processes in our college were not already helping them.

What if we could genuinely view our institutions through the lens of at-risk students? What do students think as they fill out the application? Why do they decide to take a certain course? How do they feel about the assignments they are given? We would begin to see that the choices, behaviors and actions we deem “at risk” are often explained as natural and even expected outcomes given the way we design their experience. We would discover that much of the student success problem resides not in at-risk behavior, but rather the business model, systems and processes that produce at-risk students and then try to fix them.

Perhaps we don’t need to intervene with students, but rather with ourselves?

Exposing Institutional Problems

We need a new business process: one that better matches the needs, desires and expectations of our students. Instead of trying to fix the student, we would put our efforts into exposing and solving the institution’s problems. We need to let our student success problem teach us how to fix ourselves.

Could we shift our efforts from implementing an interventionist analytics model to building a formative process model? A formative student analytics model looks at the institution through the student lens. It shifts the focus from creating new supplemental support systems to improving or eliminating the existing processes. Imagine having intricate, detailed and actionable information about all students as they move through our admissions, orientation, classes, food services, advising, midterms, registration and so on. We could learn from the students how to make those processes better, how to improve their learning and how to increase engagement.
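A minimal sketch of what that shift in perspective looks like in practice: instead of scoring individual students, a formative model aggregates across students to ask which institutional process is leaking. The funnel steps and log records below are hypothetical examples, not real data or a real system.

```python
# Toy illustration of a formative, process-centered view: aggregate
# hypothetical touchpoint logs to ask "which step loses students?"
# rather than "which student is at risk?" All data is invented.
from collections import Counter

# Steps of a hypothetical enrollment funnel, in order.
FUNNEL = ["application", "orientation", "advising", "registration", "first_class"]

# Each record: (student_id, last step the student completed).
logs = [("s1", "registration"), ("s2", "orientation"), ("s3", "orientation"),
        ("s4", "first_class"), ("s5", "advising")]

def dropoff_by_step(records):
    """Count how many students stalled at each process step.

    Students who reached the final step are not counted as stalled.
    """
    stalled = Counter(step for _, step in records if step != FUNNEL[-1])
    # Return in funnel order so staff can see where the process leaks first.
    return {step: stalled.get(step, 0) for step in FUNNEL[:-1]}
```

The output points at a process (say, orientation) rather than a person, which is the kind of information front-line staff and managers could actually redesign around.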

What if this information was readily available to both front-line staff and managers to learn from, respond to and improve by? What if we could build a system so student-centric that few students actually ever became at risk? What if we built a system that works for at-risk students rather than trying to help them through a system that creates them?

New Rules Required

This will require what Mark Bonchek calls “unlearning” in his November 2016 Harvard Business Review article “Why the Problem With Learning Is Unlearning.” Bonchek is not an educator, but his message rings true to those who serve community colleges. While community colleges are admittedly a more recent development (at least compared to their university peers), they were built upon a nearly ancient educational model. This model certainly is not obsolete, but, to borrow from Bonchek, “it is decidedly incomplete.”

The single greatest challenge faced by community colleges is unlearning the assumptions of structure, design and organization that are nearly ubiquitous. The sector is permeated with obsolete mental models. We see a K-12 system that produces depressingly low college preparation. We see at-risk students with full-time jobs, family pressures and unrealistic expectations. While these factors are largely true, making them the focus of our interventions blocks the true self-reflection needed to improve organizational design.

Too many of us see a perfect educational model with broken students. If we could unlearn our current model, we would discover that much of the student success problem resides not in at-risk behavior, but rather the organizational model itself. Think of it -- we have a model in which nearly four in five students are at risk. We intentionally funnel students into a system we know isn’t built for them and then we try to “intervene” around the edges to plug the holes. Instead of trying to fix the students, we need to put our efforts into unlearning the model.

We treat students as passive consumers even when they would rather be co-creators of a meaningful educational experience. We continue to operate a linear, transactional model even though students don’t interact with us in a linear fashion. It is admittedly scary and unclear how we embrace the nonlinearity of the learning journey, but as we shift our focus to building continuous, learning-centered relationships with students, we will more fully meet our purpose as community colleges.

Until we use analytics in a self-aware and brutally honest way, we will only be working around the margins.


Jeff Aird is vice president for institutional effectiveness at Salt Lake Community College.
