Because of its sudden appearance, and because the world does not wait for us to figure out the best possible approach before the new semester starts, I am disinclined to be too judgmental about the different choices instructors are making on how to deal with ChatGPT being in the world.
I think an outright ban, or any other approach that relies on surveillance, detection and punishment, is not a good idea because of what it signals to students about the underlying values you bring to assigning writing, but for the moment, I understand even this.
I do believe that a much better, more sustainable, more scalable long-term solution is to put learning at the center and give students something they believe is worth doing, but this is a tune many have been singing for a long time with only limited success. The fact is that our institutional structures actually make it quite hard to put learning at the center of our courses, as my fellow Inside Higher Ed blogger Steven Mintz pointed out in his recent list of “hard truths” about higher education.
I think if someone is primarily concerned about ChatGPT’s utility as a tool for students who want to commit academic dishonesty, they are giving up the game before the clock has even started. To me, it’s a tacit admission that something is not right at the core of the entire enterprise.
That said, we have to deal with this thing, thoughtfully, productively, meaningfully, and, to that end, most of what I’ve read, watched and listened to regarding how educators and educational institutions are responding has been encouragingly thoughtful and meaningful.
I don’t necessarily agree with everything I’m hearing, but agreement isn’t the goal, and I’ve found my own views shifting as I hear more from others. We’re still in the sense-making period of this discussion, so we should keep discussing and sharing perspectives, making sure to keep our underlying values around what’s important in education in mind as we do this.
To that end, I want to explore a common talking point from those of us who do not believe banning AI like ChatGPT is a good idea—namely that ChatGPT is to writing as the calculator is to doing math.
There is obviously a lot of truth in this statement, and I’ve heard from multiple math educators that the current discussion about ChatGPT reminds them of a similar storm when calculators became broadly available to students in math classes.
If technology can now do in seconds what it takes students many minutes to do, should we still be asking students to do those things?
In the case of both ChatGPT and calculators, the answer is no, but not necessarily for the same reasons, and the difference is important when it comes to thinking deeply about how to move forward.
For calculators, when it comes to the mechanical operations they can perform, the labor of the machine is, at least as I understand it, identical to the labor of the student. This is not to say that students don’t benefit from knowing how to do those operations without the aid of a calculator, but the labor itself is identical, with the machine likely being much more accurate as well as faster.
Given that this is the case, forbidding students from using calculators in a math course closes them off from spending more time practicing the kind of thinking math requires, as opposed to working through those mechanical operations by hand.
With ChatGPT, however, while the output of the algorithm may look similar to what a student produces in a course, the underlying labor is actually quite different.
Writing is thinking. It is both the expression and the exploration of the subject of the writing. The act of writing both requires thinking about how to best present the material to the intended audience and is a chance for the author to process and refine their own knowledge and understanding about a subject.
At least, this is how we should be thinking about writing, but of course, much of the criticism I’ve written about here and in my books is that we have actually been training students to behave like algorithms engaging in pattern matching, rather than making them think like humans working inside a genuine rhetorical situation.
As has been well established, ChatGPT and other big data AI do not think. They do pattern matching. They create pastiches that simulate thinking but are not the product of thought.
If we want students to learn to write, we must give them purposeful and meaningful practice at thinking while engaging with writing experiences.
Now, there are many ways that a tool like ChatGPT may be able to help with this thinking, and I’m encouraged to see all the energy that is going into figuring out how to integrate the technology into class activities oriented around thinking.
However, while I think pursuing a long-term ban on ChatGPT is the worst possible approach to the future, I also think that training students to make use of ChatGPT in all their writing would be a significant mistake.
Students should still be doing lots of writing where ChatGPT is not only not necessary but is largely irrelevant to what they are doing. Even in skilled hands, overrelying on ChatGPT for the language that winds up in a written product risks derailing the important exploration of an idea that happens when we write. This is particularly true in school contexts, where the learning that happens inside the student is far more important than the finished product they produce on a given assignment.
That we seem to so highly value that product, rather than valuing the learning, is why so many people are worried about students using ChatGPT to cheat.
Learning is rooted in experiences. A calculator doing long division or churning through a formula is not robbing students of experiences that they must do over and over again to continue to learn. The same is not true of AI like ChatGPT and writing.
I don’t know much about teaching math, but my understanding from folks who do is that we’ve made great progress in terms of how we think about teaching math in a world where these tools can do sophisticated calculations, and, in fact, students spend much more time practicing learning how to think in math terms now.
For writing, we don’t need to uncover any innovative pedagogy. My own book The Writer’s Practice opens with an experience from my third-grade classroom: writing the instructions for making a peanut butter and jelly sandwich, then being required to follow those instructions to the letter, with generally disastrous (but sort of fun) results. My actual instructions were a disaster, but the learning was indelible.
Our problem is that, over time, as assessment and standardization came to dominate, we moved away from thinking and toward creating writing simulations.
Returning to the roots of what writing means, and giving students access to experiences that engage and challenge their thinking—whatever that looks like—is the only way forward.