The Ed Tech Garbage Hype Machine: Behind the Scenes
A big player in ed tech software sent me a press release. I'm as surprised as anyone.
You may be hearing some significant ed tech news in the automated feedback/grading software arena today.
Turnitin, a company best known for its automated plagiarism detection software, is acquiring LightSide Labs, a "start-up with roots in Carnegie Mellon University that uses machine learning algorithms to provide automated feedback on student writing."
I knew this was happening last Friday because I was sent an "embargoed" press release announcing this news by Chris Harrick, VP of Marketing for Turnitin.
Had it been my choice, I would not have respected the embargo because I didn’t ask for, and indeed, never even wanted the information. As I make clear in the reply I sent to Chris Harrick that I’ve included below, I thought it was odd for them to send me this information, given that I write extensively about my objections to the way companies like Turnitin replace human interaction with algorithms. But because Inside Higher Ed as an institution previously agreed to the embargo, and I work for them, I could not publish this post until the embargo lifted at 5am PST today.
In the stories published today about this acquisition, you will see a lot of what is called "churnalism," a word-for-word repurposing of PR materials provided by Turnitin without any additional perspective or reporting. This is how Turnitin would like it. You will see language like "Common Core aligned" and "real-time feedback" and "relief for teachers." Very few, if any, of these articles will question whether these are even worthy goals, or what it means that these things are achieved by having students interact with algorithms rather than human beings.
These articles may discuss how LightSide has been piloting its software in Pittsburgh-area schools with "underserved populations." Those articles will probably not mention that one of the reasons those populations are "underserved" - what a euphemism! - is because Pennsylvania is one of the most screwed-up states in the entire country when it comes to education policy and funding, having drained something like $1 billion out of the system.
There will be nothing in the churnalism that discusses how the LightSide software was piloted at Lakeside School in Seattle, or the Gilman School in Baltimore, or Sidwell Friends in D.C., because the very idea that these elite private schools would have need of an algorithmic substitution for human beings is absurd. What you will find is that LightSide has received support from the Bill and Melinda Gates Foundation because Bill Gates is determined to help develop software for other people's children to use.
I am some combination of sad and disgusted by these things. I took out this disgust on poor Chris Harrick. I feel a little bit bad about this.
Though he will get the last laugh when Turnitin captures a dominant share of what is likely a multi-billion dollar market for automated grading software and his stock options allow him to retire in luxury and send his kids to private schools where they will get the care and attention of well-trained and well-compensated teachers.
Chris Harrick’s email to me starts:
"I really enjoyed your piece on Automated Grading and wanted to share with you some news that we think could accelerate the adoption of automated feedback in education."
This is my reply:
Honestly, I am confused by your "enjoyment" of my piece on automated grading and question your decision to send me an embargoed press release that trumpets the potential accelerated adoption of automated feedback software.
If you read the piece, you well know that I am not a supporter of automated feedback and education by algorithm. I included such bon mots as "What I’m saying is that writing to the simulacrum is not the same thing as writing to a flesh and blood human being. Software graded writing is like having intimate relations with a RealDoll."
And concluded with: "The purpose of writing is to communicate with an audience. In good conscience, we cannot ask students to write something that will not be read. If we cross this threshold, we may as well simply give up on education. I know that I won’t be involved. Let the software “talk” to software. Leave me out of it."
Are you sending me this press release in order to rub my nose in the triumphant acquisition of this amazing new software?
Is this some sort of warning that you and your software are coming for my job?
I'm now wondering if this email will even reach Chris Harrick, the actual VP of Marketing of Turnitin, as it seems entirely possible that it was generated automatically through some sort of database where an internet bot has mined emails associated with every mention of automated grading, and for the sake of efficiency, compiled all of those emails for mass press releases.
Is this what happened, ChrisHarrickBot? Are you in there?
Is this the Matrix? Hello? Hello? Is Carrie-Anne Moss available?
If there is indeed a human being on the other side of this communication, let me reiterate my stance on automated grading software and algorithmic feedback.
Automated grading software, even if it is somehow made to be "accurate" (whatever that means), is a blight on the educational landscape.
Of course, so is Turnitin, so it makes sense that you are acquiring LightSide Labs. I have never used your plagiarism detection software because of your business model that fails to compensate students for their labor as their submissions increase the size of your database.
I also have no need for it because I have not had a plagiarism case in the last nine years, and the one I had nine years ago was easily detected by me, without the help of an algorithm. Do you know why?
Because I have a relationship with my students. I know how they write. I know what they want to write about. I know when they turn something in that they didn't write. I can even tell when they turn something in that their mom or their roommate or the writing lab got a little heavy handed with in an effort to help.
The movement to put algorithms between students and teachers in the name of efficiency and productivity is destroying the fabric of education.
I know, I know, I read your press release. I see the part where you are promising "relief" for overworked teachers and their increasing class sizes.
Do you know what would provide relief for those overworked teachers? Smaller class sizes.
Do you know what's actively working against public reinvestment in public education that might one day reduce class sizes? Corporations like Turnitin that siphon public dollars for profit and empower administrators to substitute algorithmic interaction for human contact.
I imagine that you're all nice people, that you believe in the work you do, that you are not acting out of greed or ill-motive.
But make no mistake, you are doing harm to students and the entire educational system. Your sales rationales are built on piles and piles of self-justifying B.S.
It is immoral to assign writing that will not be read by a human being.
If writing assigned by a teacher is not going to be read by a human being, we can't even call it writing, since there will never be even a hypothetical receiver of the communication.
I'm afraid I can't respect an embargo for information I never requested. I'll be blogging some version of these thoughts at Inside Higher Ed. You know the spot. It's in the link in your message to me.
If you see examples of churnalism on this topic, tweet them at me.