The article's headline, “Big Data Comes to College,” set me on edge, because I do not trust the solutionist edtech-utopians. I do not doubt their sincerity, or their wish to do well by students, but I don’t believe they think very seriously or deeply about education and the role it plays as part of our humanity.
What they call success often looks more like failure to me.
The article itself, by education journalist Anya Kamenetz, describes Purdue’s Course Signals, a kind of semi-automated system that alerts students when they’re in danger of doing poorly in a course. It makes use of data points like grades as well as “time on task,” while adding in academic risk factors derived from a student’s demographic profile. The same grades and time on task that would pass without comment for a low-risk student may trigger a warning for a high-risk one.
When the software sends off a distress flare, the professor is then expected to, in the words of Matt Pistilli, one of the developers of Course Signals, send “tailored feedback – tips, tricks, hints,” to help the students succeed.
According to Purdue, Course Signals improves grades in individual classes, as well as raising overall retention.
So, the goal is to improve grades and increase retention. By these measurements, Course Signals, at least to some (still debatable) degree, “works.”
But what if we have different goals for our students? Or maybe additional goals beyond college persistence and good grades?
For example, what if one of our goals for students is the development of agency, the ability to negotiate and exert control over their own lives? What if we believe this is an important goal because it is significantly correlated not only with success, but happiness and well-being?
What if we believe that a student’s education should extend beyond “tips, tricks, and hints,” for getting better grades?
What if we believe that failure is a productive part of learning?
What if we worry that their adult lives will not come with Course Signal warnings?
And mostly, what if we worry that this institutional focus on capturing and employing data distracts us from what is most meaningful about the college experience at places like Purdue?
Interestingly enough, Purdue, along with Gallup, is indeed attempting to measure what’s meaningful about college, and we shouldn’t be surprised that it doesn’t have much to do with technology.
They’ve zeroed in on two key areas: student support and student experiences.
Under support, they asked if students had, “at least one professor who made me excited about learning,” if their “professors cared about me as a person,” and if they “had a mentor who encouraged me to pursue my goals and ambitions.”
The experiential criteria focus on the type and breadth of work: whether students had the opportunity to work on a semester-long project, whether they did an internship that seemed related to their studies, and whether they were active in extracurricular organizations.
The more criteria students identified as applying to their college experience, the happier they reported being in work and life after college.
Does Big Data fit into this picture? Maybe. Perhaps Course Signals is a way to engender in students the feeling that their professors care about them as people.
Or maybe Course Signals tells students that they are a data point. Or maybe Course Signals becomes a crutch, substituting tips and tricks for in-depth human interaction, the kind we know alters lives.
There are other reasons to be wary of Big Data in higher education. Kamenetz identifies one of them: the under-discussed privacy issues surrounding students’ personal data. If I look into Course Signals and see two students with identical grades, only one of whom receives a warning, I will know that there is a demographic red flag on the flagged student’s record.
As the course instructor, I have no right to that information. Moreover, I do not want it. The “Pygmalion effect” and “stereotype threat” are real. Pistilli himself says that because of Course Signals, some students drop earlier than they would otherwise, creating a kind of self-fulfilling prophecy.
I want to relate to my students as the persons they prove to be over the course of the semester. I do not care if they flunked my course the previous time they took it.
Their slate with me is, and always should be, clean.
I am bothered that so little of the conversation surrounding these technologies examines the costs, and not just the very obvious monetary ones.
We are in the midst of what I can only think of as a mass delusion among so-called K-12 education “reformers” that technology is the key to improving schools. This has resulted in $1 billion being spent on iPads in the L.A. Unified school district. Seventy-seven percent of last year’s Federal “Race to the Top” money for the New York City Department of Education went to administrative costs primarily related to data tracking and “accountability.” Maryland needs at least $100 million just to allow students to take the Common Core compliant tests.
How we choose to relate to our students is a reflection of our values. What does embracing Big Data say to those students? That we care deeply about their futures?
I don’t see it. I think it says that they’re experimental subjects, just like Facebook recently made clear to its users.
As Kamenetz says at the end of her article, “Learning analytics has yet to demonstrate its big beneficial breakthrough,” its “penicillin,” in the words of HarvardX researcher Justin Reich.
I wonder, what if learning analytics are not the cure, but the disease?
Meanwhile, we are not paying a huge swath of college instructors a living wage. In the zero-sum game of higher ed budgets, every dollar spent on technology is one less for actual human labor. Ask the average instructor whether they’d rather have Course Signals for their course, or a raise.