Like all of us, I’m struggling to make sense of my job in a world where students have unbridled access to AI tools. And I thought I had ChatGPT’s number: I had two crowdsourced tools that helped me discover and prove plagiarism.
I told colleagues that all we had to do was ask ChatGPT, “Did you write this?” and copy and paste the student work into the prompt box; it would tell us whether it had created something that a student turned in. I shared my plan to create unique modes of assessment and then prove academic (dis)honesty with my peers and with the students themselves. I felt confident about continuing to use learning management systems to host exams.
My first exam of the semester happened last week. ChatGPT told me that it wrote 20 out of 24 of my students’ essays.
At first, I was angry with my students. I emailed them to express my frustration and to announce a pause in my grading, because ChatGPT had revealed that at least parts of several essays violated the academic honor code. My hunch was that somebody had created a shared study guide using ChatGPT and then, one by one, students lifted pieces from it word for word into their short essays. Was this an issue of plagiarism or of crowdsourcing, and is there a difference? I pored over each word of each essay, looking for similarities and differences. And on closer examination, as I carefully read each distinct response, it dawned on me that they were each, indeed, distinct.
While I was wrestling with the haste of my initial response, I received six frantic student emails, each professing their honesty, fear and dismay. Another student stopped by in tears. She rushed out of her house to catch me early in order to explain that she’d worked really hard on her study guide and the subsequent essay question. Yet another student sent me an example of how ChatGPT took credit for writing an article that we read as a class. By now, I knew that the tools at my disposal were flawed. Those 20 students in my class did not cheat on their essays, despite my confidence in my sleuthing skills.
When I walked into the classroom later that day, it was in an uproar. I simply said, “I know. I do know. And I’ll talk about it when everyone gets here.” I had come to the conclusion that I’d have to wipe the slate clean, regrade all the essays and mend the trauma that my students were experiencing. One asked me, “Were you pissed, like I was?” Another said, “That really did something to me … to read that email and experience that level of fear.” ChatGPT: 1, Students: 0.
I lamented to the students that I had made a mistake. Instead of focusing on student growth and innovation, I had invested too much time in surveillance strategies. Instead of focusing on my commitment to developing strong personal relationships with students, I had let a robot tell me what I should believe. Instead of trusting my students, I had trusted an online tool.
I fell for the discourse circulating in academic communities that told me I “had to be prepared” for the AI apocalypse. That reaction to ChatGPT brought out the worst in me, colliding with my critical pedagogical sensibilities.
The scenario in my class, however, also reminded me of some basic principles of critical pedagogy that we must consider as we are ushered into the new AI era. First, we must trust students, foremost because they are real people. They bring to the classroom their values, livelihoods and traumas. We must be mindful of the ways in which academic surveillance dehumanizes the very people we signed up to nurture.
Second, let’s push back against the idea that it is our job to surveil our students. We should have learned our lesson when reports of racism in lockdown browsers prompted discussions about the value of educational surveillance. My job is to educate and develop critical thinkers. I can use my energy to develop new ways to accomplish these goals rather than to sharpen my skills as an academic bounty hunter.
Finally, we must continually challenge positions of power that emerge in the classroom and are propped up by wider societal ideologies that result in oppression. By destabilizing our perceived position of power, we can reprioritize classroom goals and honor the voices of our students. Moreover, dismantling systems of power means reflecting on the power of our collective discourse. We influence each other’s classroom practices and can impact how we respond to this new phenomenon.
As a community of educators, let’s not be so reactive to AI. If we are reactive to ChatGPT, then we are no more nuanced than it is. Instead, let’s have more open conversations with our students about AI, equity and the true purpose of education. We can also ask our students what would make assignments more meaningful and valuable to them. Together, we can host conversations about critical thinking and intellectualism. And we simply need to assume the best of our students.
ChatGPT may have bested me this time. But, if I were to continue to subscribe to the belief that it’s an educator’s job to catch and punish plagiarists, then everyone in the class would lose. Let’s not bring out the worst in ourselves or each other.