There are a lot of things I worry about when it comes to education, but rapidly rising to the top of the list is the use of surveillance technologies as a substitute for what should be human labor.
I can chart the acceleration of this problem through my own work. When I wrote the proposal for Why They Can’t Write: Killing the Five-Paragraph Essay and Other Necessities, surveillance as a barrier to student learning wasn’t on my radar in a big enough way to include it in the outline.
By the time I was drafting the book, it became a whole chapter where I covered existing apps, like ClassDojo, which allows teachers to keep a real-time, public scorecard of student academic and behavioral performance. A recent reader of the book introduced me to something called Kloud-12, which is billed as lesson-capture technology but strikes me as more of a full-service panopticon. Technology like eye-tracking to judge levels of attention, somewhat speculative when I wrote the book, is now being employed for real.
My objection to the use of this kind of technology in education is multifold.
For one, it poisons the learning atmosphere. We do not do our best, most interesting work when we know we are being watched. Particularly for writing, we need some measure of freedom to explore as we develop. Skinner-like behaviorist practices in which every moment of a student’s day is monitored for compliance and correctness are incompatible with learning.
For two, the introduction of technology often results in an exit of humanity. While ClassDojo may be teacher controlled, there is plenty of ed tech pitched as a labor-saving device that ostensibly frees the teacher to do the “important” things, but this is sales and marketing cow flop.
Consider algorithmic grading technology or plagiarism detection software. This technology is not used to free the instructor but is instead employed so the instructor may handle more students. Rather than engaging with another human being, students are monitored by algorithms, which in turn alters the very nature of the work those algorithms judge positively. It is a vicious circle that divides students from their teachers with every trip around.
As I say in Why They Can’t Write, reading and responding to student work is the core of the instructor’s job. Outsourcing grading or plagiarism checks to technology is akin to asking a coach to try to work with their team when they know the score but haven’t watched the game. It makes no sense if the goal is to help students learn to write.
For three, a move to substitute surveillance technology for human engagement is tantamount to giving up on addressing any underlying structural problem. As Jacob Silverman put it on Twitter recently about the ubiquitous “Can AI help save X?” articles, the answer “is always no.” “AI is not a substitute for broken politics, discriminatory institutions, or a failing biosphere.” Believing in an algorithmic solution to a structural problem is pure magical thinking.
For four, most of the time this shit doesn’t work, even according to the product’s own low or meaningless (when judged against educational goals) standards.
And finally, even when the stuff does work as advertised, it does harm to students, because software like Proctorio, the online exam proctoring service, has not been designed with student well-being or learning in mind.
I am concerned about this technology because of how quickly and thoroughly it can insinuate itself into our systems. Plagiarism checkers have become tabs and functions inside learning management systems, suggesting they should be a normal part of our work. (They shouldn’t.) Tracking software that monitors “distraction” is far inferior to approaches that, rather than defending against “distraction,” encourage “engagement,” as James Lang covers so persuasively in his new book, Distracted: Why Students Can’t Focus and What You Can Do About It.
These companies quickly become extremely powerful and influential, and they take actions to protect their reputations and market share that we should find abhorrent when measured against the values we hold for education.
While I actively follow the controversies around educational surveillance technology and stay abreast as best I can of developments in ed tech in the realm of writing instruction, I do not consider myself an expert.
Fortunately, we have an opportunity to learn from experts I’ve been following on Twitter for years, such as Maha Bali, Benjamin Doxtdator, Jesse Stommel, Chris Gilliard, Audrey Watters and others participating in a “Teach-In #AgainstSurveillance” on Dec. 1, from 1 to 4 p.m. EST.
The event is to benefit the legal defense fund of Ian Linkletter, a learning technology specialist at the University of British Columbia, who is being sued by Proctorio over a series of critical tweets in which he posted links to freely available YouTube videos provided by the company that the company claims are proprietary.
Donations are not required, but you should sign up for the event in advance if you’d like access.
Education and learning are human endeavors, and they are far too important to turn over to algorithms and artificial intelligence.