I joke around regularly with my brother-in-law, a high school history teacher, about the eventual machine revolution and how to deal with robotic overlords. After ChatGPT launched late in 2022, he simply said, “We’re doomed.”

He was doomed, as a teacher, to a sharp rise in AI-generated homework responses and the prospect of not actually teaching anything; I, as the director of thought leadership for a higher education communications and leadership agency, was doomed to lose my job as an editor and writer. And undeniably, there are serious implications to humanity's use of this kind of technology. But the more I have thought about it, the more value I can see in the tool for writing and editing.

I’m certainly not qualified to weigh in much on AI; I’m late to almost every tech trend and still use a paper planner with a fountain pen. But we’ve been inundated with AI-related news in 2023 (Google spits out more than 16 million hits when you search for ChatGPT news), so even the Luddites among us have been hard-pressed to ignore it.

Even without considering AI-generated content, ghostwriting exists in an ethical gray area. I write words for someone else to pass off as their own. I’ve consented to this and am compensated for it. Certainly, there’s more nuance than that—I’m not generating the expertise in these cases, I do my best to capture key phrases and language from conversations with the eventual author to include, and often the person I’m writing for makes at least some edits to reflect their voice—but that’s essentially what the outcome is. My words, their name. The entire concept of ghostwriting sounds pretty similar to using ChatGPT to write a draft.

(A reader has submitted a letter to the editor in response to this blog post.)

And editing can raise questions about authorship as well. One memorable conversation with a faculty member hoping to engage more with the media and produce public scholarship included a concern about whose name would go on the byline if they received assistance and guidance from our agency. It was a fair point. By the time I’m through reviewing a piece, sometimes the entire document is aglow with tracked changes and barely resembles the original draft. Does that make me a co-author?

Let me be clear: cheating and plagiarism (and robot uprisings) are bad. I am in no way suggesting that people use AI to write full pieces to submit as their own work. I don’t think that is an honest move—or even a smart one. Although I’ve not yet tested the newer GPT-4, I found ChatGPT wanting. No matter how I adjusted the settings or tweaked my request, the output always required edits. My robot-generated drafts read as if someone relied on a thesaurus too much; they were clunky and repetitive and not reflective enough of the ideas I came up with and wanted to express. But in both writing and editing, we’ve used some element of AI for many years, such as software that evaluates the readability of a written piece, programs to check writing like Grammarly, and even spell-check and autocorrect.

There are only so many words, and only so many possible arrangements of them on any given topic. And while the number of combinations may be exponentially large, it isn’t infinite. Even our fresh takes may not be as unique as they seem to us. It’s hard to know whether taking something already written and making it your own, or making it better, comes too close to plagiarism. Or whether requesting a draft to react to, or an outline as a jumping-off point, does. Because truthfully, that sounds a lot like my job.

So even with concerns about integrity, academic honesty and other possible negative outcomes from using AI, I can appreciate the potential benefits of using ChatGPT professionally. Staring at a blank page can be terrifying, soul-crushing. Why can’t a robot offer an outline based on your main ideas that you fill in with your own words, or help frame an argument you’re trying to make, or even create a first (very rough) draft to alleviate some of the work and allow someone to review something with the unequivocally more objective lens of an editor? I do these things for clients every day.

Ali Lincoln is the director of thought leadership at TVP Communications, a national communications and leadership agency solely focused on higher education.