• Just Visiting

    John Warner is the author of Why They Can't Write: Killing the Five-Paragraph Essay and Other Necessities and The Writer's Practice: Building Confidence in Your Nonfiction Writing.

A Hippocratic Oath for Algorithmic Intervention

First, remember that individuals are not averages.

October 17, 2019
So here’s something I’m thinking we need: a version of the Hippocratic oath for education technology.

Can we agree on a framework that respects each student as an individual and invests in a “first do no harm” ethos?

More precisely, I am concerned about aggregated, algorithm-driven interventions and how they are applied at the individual level. This concern is rooted in one of my personal mantras when people point to Big Data-driven “solutions”:

Individuals are not averages.

What may be true in general and in aggregate will not be true for every individual who is part of that aggregate. I believe that educational institutions are morally and ethically bound to protect the rights and freedoms of individual students, just as doctors must treat their patients as individuals.

Ergo, we need to agree about some guidelines for using this stuff.

My concern is not new. I was disturbed by some of the implications of Purdue’s Course Signals[1] all the way back in 2014. But recently, more and more energy and money is flowing toward these algorithmic tools and how they’re going to be used in direct interventions with individual students.

Recent examples that have landed on my personal radar:

1. Tracking software on university websites that develops a score on the level of student interest in an institution, which then correlates with their likelihood of admission.

2. Location tracking of students on campus to see where the “successful” ones are spending their time in order to develop nudges targeted toward other students, e.g., people with GPAs above 3.0 tend to spend X amount of hours in the library. How about you?

3. The “genoeconomics” movement, which is in search of the “money gene” and is fully behind the utility of one’s “polygenic score” for educational attainment as a predictor of future outcomes.[2]

I do not know why the potential for these things to harm individuals is so obvious to me while seemingly not a concern for institutions, but honestly, it’s all I can see every time I read news of some new data-driven application or intervention.

For example, the predictive analytics attached to the university website tracking software can help identify wealthier students who are more likely to be able to pay tuition, which results in additional recruiting attention, thereby making them more likely to take a slot in an incoming class.

To the institution, this is a sensible strategy to protect a necessary revenue stream. 

To the individual student who is less wealthy, and who does not get into the state university closest to her home (which would allow her to save money), this is a system whose economic effects may quite literally put constraints on the rest of her life.

To the institution, the aggregation of student behaviors revealed through location tracking into individual nudges is a move to increase persistence and retention.

To the individual student, we may have created a nice little anxiety app, where each alert is an opportunity to dump a little more cortisol into the old system and spend some quality time worrying, rather than, you know, working (or playing or sleeping).

To an institution, gaining an additional 11 percent of explained variance in likely educational attainment may help them focus their admissions on those students more likely to succeed, saving them from wasting precious resources on the polygenically disadvantaged.

For the individual in this case, I hope the problem is obvious. The precision education movement is not so precise. Someone with a polygenic score at the 98th percentile could demonstrate educational attainment anywhere from the 2nd to the 98th percentile.
As of yet, as far as I’m aware, we are not directly applying genoeconomics to individuals pursuing education, but rest assured there are people working diligently to make this happen.

In each case, we see a disconnect between the needs of the institution and the needs and rights of the individual. A college must raise its graduation rate, and nudging particular students towards particular fields of study where the aggregates say they’re more likely to persist must be a good thing. 

But what if it pushes some individuals who could’ve succeeded on one path into another that is ultimately lower-paying? Those individuals have been demonstrably harmed by being subject to the “wisdom” of aggregation. 

Kyle Jones, a researcher of information policy and ethics in the Department of Library and Information Science at IUPUI, raises these and other questions about student privacy and potential harms in a recent interview at Project Information Literacy.

As Jones says, “Those who advocate for learning analytics have an educational policy agenda in mind. What they choose to quantify and analyze in part signals what is important to them. But what is important or valuable for those who have the power to pursue analytics may not be the same for those who become the subjects and targets of learning analytics.”

In other words, individual rights and freedoms are secondary to the needs of the institution.

Like Jones, I think there are potentially positive applications for this technology in delivering useful information to students that enhances their agency, but I share his concerns expressed here (emphasis mine): “The problem is that many of these analytic systems use a combination of limited predictive models, potentially biased algorithms, and paternalistic nudging strategies to turn student behaviors toward outcomes the institution believes worthwhile; it doesn’t follow that students share their institution’s views and goals.”

Some would argue that these are tradeoffs. I disagree. I think they’re betrayals. The lure of these technologies is one of the reasons why we need some kind of firm ethos around when and how we’re going to allow aggregated data to determine the fate of individuals.

We should do everything possible to help students be successful, but we cannot be in the business of dictating what success looks like for our students.

Of course, it is the underlying conditions of austerity and scarcity that make all of these technological interventions potentially attractive. Mississippi State needs to try to enroll as many out-of-state students as possible because they don’t receive sufficient funding from the state itself. In a world where sufficient funding was guaranteed, they wouldn’t have to pay outside consultants to track student clicks on the university website and provide reports of prospects’ affinity scores. Individual students also wouldn’t bear the unnecessary expense of potentially being forced into an out-of-state college because the slots at their home institutions are being taken by wealthy students from another state.

These applications are already doing harm at the individual level. 

How far are we willing to go to justify the continuance of the institution?

How can we claim to fulfill a student-centered mission when students are treated as aggregates, rather than individuals?


[1] Which I believe is no longer in use.

[2] This one was already on my radar, but recent events put it back.
