“Journalism is paywalled, but disinformation is free.” —Sarah Kendzior
I’ve been following with interest the recent panic around students writing academic papers with ChatGPT and similar expedients. What strikes me is that the dangers are twofold, but only one fold (?) is getting most of the attention.
Yes, it can replace human effort with machine effort, thereby defeating the purpose of student writing. We don’t assign papers to get papers; if pressed, many academics would admit that grading papers is one of their least favorite parts of teaching. We assign papers because the act of writing the papers forces students to engage with the material at a deeper level than a multiple-choice exam question does. To the extent that students outsource the task of writing, they outsource the task of thinking. It’s similar to saying that it’s faster and easier to drive a car 26.2 miles than to run a marathon. Yes, it is, but that defeats the purpose of the marathon.
That critique is true. No argument. But there’s another one that seems to be getting less attention: apparently, ChatGPT fills in gaps in information by making stuff up. It simply invents facts that seem to fit what it’s trying to say.
At one level, that may help allay the fear of outsourcing. A ChatGPT paper may get through Turnitin, but if the paper claims that Harold Stassen won the 1980 presidential election, a savvy professor should raise an eyebrow.
I worry that, as a culture, we’re getting the AI we deserve. As the journalist Sarah Kendzior has noted, on the internet, journalism is paywalled but disinformation is free. A paper-writing machine that spews out disinformation at a record clip almost seems too on-the-nose to be real.
I’m old enough to remember when “negative campaigning” was the great worry. Negative campaigning consisted of taking nuggets of truth out of context and reframing them in the most sinister possible ways. A vote for a budget that would raise entrance fees at a dozen state parks in order to pay for new swing sets would become “Jones voted to raise taxes 12 times!” It was frustrating in multiple ways. Not only was it profoundly misleading, but over time, it taught many legislators to be wary of even the most unobjectionable positions for fear of how they could be portrayed in ads. The ads were often exercises in bad faith, but you could usually trace the wild claims back to some nugget of fact somewhere, even if that nugget was coated in thick layers of balderdash. Negative campaigning endured because it worked.
Disinformation dispenses with the need for even a nugget of truth. Or, as Kendzior points out, it transplants a nugget of truth from one group to another. Does it seem like it’s harder to make a decent living than it used to be? It must be because one political party is composed of lizard people. (John Carpenter’s neglected classic film They Live was not a documentary.) The usefulness of disinformation is that it’s infinitely fungible; if the lizard-people line stops working, try something else. The rioters aren’t really rioters. No, wait, they’re really antifa. No, wait, they’re freedom fighters. Whatever works right now.
To the extent that disinformation is actually automated, the task for academics is to ensure that students become better readers. That means some history and civics, so they know the basics of what happened and why. Someone who knows no American history might not catch the Stassen reference. More fundamentally, it involves teaching students to analyze arguments to see how they hang together. Most disinformation falls apart at the slightest questioning, since it has no underlying reality. Even if writing can be automated, reading can’t be.
Disinformation is a fundamental challenge when a society defines free speech entirely as the absence of legal constraints. That definition worked for a long time, when access to a large public was difficult. Now that the challenge of mass communication is a lack of quality control rather than a lack of venues, we need to come up with new understandings. President Stassen needs us.