As a writing across the curriculum scholar and programmer, I’ve been talking about artificial intelligence writing tools a lot lately.
I’ve heard it all, from woes and worries about students using ChatGPT to cheat their way through college to claims that AI tools are the face of the future, poised to improve everyone’s writing and eliminate jobs. I’ve worked with faculty members who actively embrace AI and find ways to incorporate it into their pedagogy. I’ve also worked with those who fearfully ban it and revert to requiring students to handwrite essays or present orally on topics.
And I’m here to tell you, based on decades of writing studies research paired with what we know from learning theory, that AI is not going to ruin or replace writing. We don’t need to—and shouldn’t—eliminate writing classes or instruction because of AI (despite what you might have read in Melissa Nicolas’s recent opinion column calling for the elimination of the required first-year writing course).
About me, to get my own credentials out of the way: I hold three degrees (B.A., M.A. and Ph.D.) in composition and rhetoric, having spent 10 years explicitly studying writing. I have taught writing at all levels, from first-year writing to advanced writing to graduate-level writing courses, and have taught a wide range of genres. I work as the assistant director of Writing Across the Curriculum at Miami University’s Howe Center for Writing Excellence, where we’ve taken a leading role on providing faculty development and support for writing and AI.
More important than my own credentials are the core theories and principles of writing and learning to write. Our center uses guiding principles that can (and should) apply to any learning context. Some of these principles are:
- Writing is social and rhetorical—writers benefit from talking and sharing drafts with other writers.
- Writing is informed by prior experience, including experience with writing, reading and cultural experiences and norms.
- Writing is not natural; rather, it is something we all work to learn, and competence as a writer develops over time and with practice.
- Reflection and metacognition are important parts of improving as a writer.
With these principles in mind, which are based on extensive work from many writing scholars (see Naming What We Know, edited by Linda Adler-Kassner and Elizabeth Wardle, for more), we can thus think about AI and writing in the following ways:
- Students/writers still need to talk to others about writing.
- Students/writers bring with them invaluable (and situated) prior knowledge that AI generation can’t replicate.
- Students/writers need to practice, and writing is learned over time.
- Students/writers need to reflect on their writing, as the product isn’t the only important aspect.
That is, writing is not (only) about the final product. Yet the final product is, by definition, all that generative AI tools like ChatGPT produce.
In her column, Nicolas writes, “AI can generate genre-specific text, approximate discipline-specific prose and create content that is free of grammatical mistakes.” Noting that these are the skills faculty outside writing programs typically want students to learn in first-year writing, she concludes that students therefore “don’t need a required first-year writing course anymore” and asserts that “AI will take care of students’ biggest writing problems so professors can spend all their time on disciplinary content.”
The issue, however, is that AI tools cannot “do” or “produce” everything that students should be working toward in such a course, or in other disciplinary-based writing instruction.
For example: take some of the course outcomes for my own writing course this semester (technical writing), where students will:
- Compose ethical and rhetorically effective visuals for technical communication;
- Develop teamwork strategies and intercultural communication competencies; and
- Reflect on composing processes and on the rhetorical and ethical decisions made in the researching, drafting, and delivery of technical communication.
How can ChatGPT help students reflect meaningfully on their composing processes and why they made the decisions that they did? Again, these outcomes are not even about the final product that a tool like ChatGPT can create. These outcomes are about students’ learning and critical thinking, about reflecting on their growth as writers, and about being ethical—including in their usage of AI tools. Writing is a way to express and record that learning, but the thinking and progress that happen throughout are the main point.
As mentioned, Nicolas also argues, “AI will take care of students’ biggest writing problems so professors can spend all their time on disciplinary content.” From a Writing Across the Curriculum perspective, this is inherently incorrect, as writing cannot be separated from disciplinary content.
That is: writing is thinking and learning, including in disciplinary courses. Learning how to write is disciplinary content. The ways my colleagues in accounting write are not ways that I, as a writing teacher, can teach my students; writing thus needs to be taught in context and in concert with that disciplinary content.
There are other reasons why a required first-year writing course might be concerning and why it may be worth asking whether it should exist. There are decades of research that support and complicate debate on this question, such as Elizabeth Wardle’s piece on “mutt genres,” which explores how students don’t actually write, in a first-year composition class, the genres they are meant to write in their actual disciplines. And there are very real issues of labor and expertise in terms of who is teaching these courses (and whether their expertise is well matched to them).
But the point I want to make here is that claiming that AI can “do it all” is not an effective argument for eliminating a required first-year writing course (or any writing instruction). Nor is it a sound or true one, based on what we know about how writing and learning work and what it takes to develop effective, ethical writers (see How Learning Works by Susan A. Ambrose et al. for more on learning).
AI tools have their uses, as well as their limitations and ethical concerns (including intellectual property, environmental impact and the human labor required to train these programs). What we do know when we bring the conversation back to learning is that we as educators—in composition and rhetoric studies and in other disciplines—want students to show us what they can do.
Let’s not fearmonger or focus too narrowly on how AI can replace writing and instruction in our fields. Let’s instead double down on what we want students to do and how we can help them get there—with or without these tools, as we see fit, and as we act from our principles around learning.