The Value of the Unpredictable

Can education be the practice of freedom if algorithms shape the curriculum?

January 26, 2016

railway tracks (photo by StooMathiesen)

When I was entering the tenth grade, my family moved from Wisconsin to Kentucky. For the first time, I was in classrooms with black students. It turned out to be an administrative mistake, and it was fixed. There were no black students in my courses in my junior and senior years. That's because my classmates and I were on track to go to college. Black students (and many working-class white students) were expected either to drop out or to learn a trade. One of the five buildings on the high school campus was for vocational training. It was some distance from the other buildings and, instead of the red brick and classical white columns of the main classroom building, it was painted an industrial green and looked like a factory or a prison. I never had a reason to set foot in that building. It belonged to another world. We were tracked, and my track led elsewhere.

It struck me as I read two articles this morning – John Warner’s “Learning is Liberation” here at IHE and a Guardian article by Harriet Swain, “In the Library, In the Gym, Big Brother is Coming to Universities” – that the current interest in learning analytics isn’t so different from old-school tracking. There was nothing malicious about my high school’s practices. They thought they were doing right by all of their students. Why ask a kid to suffer through (and quite possibly fail) advanced algebra or third-year French when it wasn’t going to be relevant to their life after graduation? When we harness a variety of data points to characterize a student’s capacity to learn, when we use predictive analytics to guide the kind of learning experiences they will be provided, we’re basing their placement in the system on what we think we already know about students. Some of what we already know is likely to be wrong because it’s shaped by biases we don’t even perceive. Some of it will be wrong because we’re making decisions about what comes next before a student has a chance to find their feet. (That’s partly why so many people were dismayed by the bunny-drowning kerfuffle. It’s not just the unfortunate metaphor, it’s the institutional unwillingness to prioritize the work of helping struggling students, expelling them quickly instead so they won't count against the college's success rate.) How will students ever have the kind of life-changing moment that John Warner describes if all the data fails to predict it and guides them instead toward activities better suited to their past record?

I’ve heard the argument that it would be unethical to ignore the value of learning analytics (generally made when privacy advocates question the ethics of industrial-scale surveillance of students). Yet there are so many instances when the power of big data is undermined by its flaws. It’s incomplete. It’s full of errors. It’s unaware of its own blind spots that reinforce inequality. It’s often proprietary, a black box you’re not allowed to examine. It’s not always clear whose interests are being served. It’s reliant on assumptions that haven’t been tested and tends to treat correlation as causation, as when libraries assert their value by correlating library use with high GPAs, ignoring a host of factors that influence library use and success in the classroom that have nothing to do with the library’s efforts. It can be gamed, as when students learn they will be considered more engaged if they enter certain buildings on campus, so they swipe their cards repeatedly to build up engagement credit.

Data can be incredibly useful, but it’s also misleading if it’s not collected carefully and used wisely. If our educational paths are to be determined algorithmically by the record of our past, education will have no relationship to the practice of freedom. Your path will be made known, not chosen.

Admittedly, the system we have shares some of the same features as these algorithmically driven solutions. We are too often unable to see our own biases. We make mistakes. We overlook factors that are keys to a student's struggles. But what worries me most about this rush toward algorithmic and data-driven solutions is that it cuts out people - both faculty and the students themselves - whose learning can take unexpected detours and sometimes huge leaps. We need to take care that we don't shut out the unpredictable.

The students who were tracked into that big green building on the edge of the high school campus didn't choose to go there. They were told, based on past performance and on demographic data, that they weren't college material. No one asked me if I would rather study plumbing. But I'm grateful someone goofed and put me on the wrong track for a year. The most important things I learned at that school came from being unpredictably seated in classrooms with students who weren't like me.
