
It’s been nine months since OpenAI released its beguiling chatbot into the wild. Since then, judgments have swirled, especially in education. First came full-throated panic that ChatGPT would lead to rampant cheating, especially on writing assignments. Then followed more tempered discussion of how creative use of the technology could bolster learning.

Meanwhile, a whole industry has risen up for detecting when AI, not humans, did the writing. Yet however crafty the tools have become, they remain easy to fool. Just ask students about their success in gaming the system.

With the new academic year upon us, let’s move past last year’s battles and rethink the problem. We need guidelines to help students—and the rest of us—decide when programs like ChatGPT (or Grammarly or Sudowrite) deserve a place in written work that has a human’s name on it. And when those programs don’t.

Here are five touch points to help you as a faculty member steer the choices made by student writers in your classes. Consider asking your students to think about each of the following.

  1. Trust: Do you trust what AI writes? We’ve all heard that generative AI can hallucinate. A lawyer learned this lesson the hard way by relying on ChatGPT to prepare a brief, which the bot filled with made-up citations. When it comes to AI’s truthfulness, users are well served by the old dictum “Trust, but verify.”

Plus, beware that AI-based programs like Microsoft Editor can be wrong on points of grammar or word choice. The hitch is recognizing when verification is needed, especially if you don’t know the topic or you’re not confident about your writing skills. Don’t automatically assume the AI program knows better. Double-check information you’re not sure about, and ask someone who’s a good writer to confirm your wording choice.

  2. Effort: How much are you willing to expend when you write? Writing takes mental effort. Of course, so do many activities we undertake—solving calculus problems, learning the trombone, even deciding which book to read. Psychological theories explain that we don’t always need to apply 100 percent of our mental energies to a task. Sometimes “good enough” is good enough. It’s understandable to want to minimize effort in the writing or editing process by conscripting help from AI. Local policies vary widely as to how much assistance is legitimate. Students need to be sure they’re following institutional guidelines.

Yet just because you can doesn’t mean you should. Heavy reliance on a tool like ChatGPT to do your writing or editing, even when not against the rules, can erode writing skills over the long haul.

  3. Writing skills: Does AI improve or weaken them? Ideally, AI’s writing prowess can be harnessed to improve our own skills—editing text we’ve written, helping us generate new ideas, maybe even making us better spellers. But there’s another possibility: by letting AI do large chunks of your writing and editing, you risk losing your own writing abilities or, for younger users, never feeling motivated to develop skills in the first place. Reducing human skills—not just for writing but in, say, art or argumentation—is, in fact, one of the biggest challenges of today’s AI. What’s more, by maintaining your writing skills, you’re not stuck when there’s no internet connection and an essay is due in the morning.
  4. Writing voice: Does AI compromise your own? A cascade of AI writing tools—from predictive texting to Grammarly or Wordtune—offers up text completion or “better” ways of saying what we mean. But when we mutely acquiesce, what happens to our own writing voice? In research that I’ve done involving predictive texting, one university student said, “Don’t feel I wrote it,” while another worried that “Maybe it makes me a little too repetitive.” We all have individual writing voices. Beware of letting AI obliterate yours.
  5. Commitment: How much do you care? The biggest elephant in the room is commitment. What level of it do we have for a particular piece of writing? Maybe the stakes are low for routine emails or blog posts. Next up the ladder is writing for which we have more responsibility. Our name is on that plot synopsis we bade AI to draft. Did we bother reading the book to gauge where we agree with what ChatGPT wrote and where we don’t?

But then come even higher rungs of commitment for putting words together. Over the decades, more than a dozen authors have declared they only know what they think when they read what they’ve written. At its best, human writing is a self-discovery process. AI has no self to discover. And remember, as well, that writing is ultimately a craft in which you can take pride, whatever your level of expertise. When you cede control to AI, you forgo personal artistry.

In sum, as the instructor, acknowledge to students that human writing is often time-consuming, even painful. Like a teacher with infinite patience, AI stands ready to offer an assist. But asking your students to consider these five touch points helps make clear that AI can also seduce us into minimizing effort or trusting the bot when we shouldn’t. Its smooth prose (that companies paid billions to generate) potentially leads us to believe its skills and voice always surpass our own. Writing bots tempt us to forget how empowering it can be to think through a problem with pen or keyboard at hand, and how satisfying it can feel to find our own perfect wording.

Remind your students that writing is a profoundly human activity. AI now has a role in the drama, but people need to remain the playwrights.

Naomi S. Baron is professor emerita of linguistics at American University and author of Who Wrote This? How AI and the Lure of Efficiency Threaten Human Writing (Stanford University Press).
