Let me paint a picture. At 2 a.m., a frustrated college student sitting in the student union pores over their economics class notes on supply and demand. At 2:03 a.m., the student emails the professor with questions. The professor quickly responds with several supply-and-demand examples from The Wall Street Journal, the Financial Times, and Bloomberg, designed to supplement the lecture notes and text. Soon after, the faculty member proactively pulls together a personalized quiz based on these articles and forwards it to the student. At 3:30 a.m., the student, having read the articles, takes the quiz and earns a passing grade. After receiving the grade, the student sends the professor a “thank you” email, and they both log off at 4:17 a.m. Well, the student logs off: the “professor,” an artificial intelligence (AI) bot, waits for the next student interaction. Amazingly, while assisting this particular student with supply and demand, the AI professor conducted the same routine for 11 other students from “its” economics class of 300.
This is not science fiction; the scenario above is already here. Companies like Tutello are trailblazing AI-enabled tutoring.
While most faculty are now struggling with students’ reliance on ChatGPT, as a scholar who has studied disruptive technologies and their impact on jobs, I’m here to argue that student usage of ChatGPT should be the least of faculty members’ concerns. Make no mistake: By the end of this decade, in 2030, AI will be teaching a notable percentage of courses across the globe entirely on its own. (Don’t believe me? Five years ago, I wrote that by 2024 AI would replace software engineers, a prediction that raised skeptical eyebrows. Well, I hate to say I told you so: AI is beginning to replace coders.)
AI’s evolution into professor will follow a straightforward path. Initially, professors will be convinced that AI is an assistant or partner. The first AI classes will be listed as human-taught and human-led, but AI-enabled. After several semesters, the course will be listed as co-taught. Under the co-taught, or AI hybrid, model, professors will function in a quality-control or supervisory capacity. One professor may oversee numerous AI-led sections, ensuring that the AI agent effectively develops the course, delivers instruction, and evaluates students’ performance. It will be a stark departure from faculty’s traditional instructional role: in effect, faculty will be managing AI bots. Eventually, AI won’t need its “professor training wheels” and will be able to run courses without human supervision. At that point, I expect the following conversation will occur on campuses around the world.
University Administrator: We are going to peel off the AI bot from your section and list an independent course with the AI bot.
Professor: Really? What about student support? Office hours, tutoring …
University Administrator: Well, that is the great thing. Our initial pilot demonstrated that students were willing to attend the AI’s office hours at all hours. Amazingly, most students can use AI to better manage their work-school-life balance. We have also seen students in the pilot perform better in courses led solely by AI.
In recent weeks, I have heard faculty extol AI’s ability to develop syllabi, craft PowerPoints, design exams, and grade papers. Faculty are giving up “real estate” in their classrooms one metaphorical acre at a time. It’s time to pump the brakes on artificial intelligence in higher education.
Students Are Driving the Embrace of AI
For better or worse, AI’s moment in higher education has arrived because our students have changed. Back to the opening example: How many of us have received student help requests at two in the morning? All faculty receive these emails. The problem is not the timing of the email; it is that many students expect an immediate reply. Today’s student lives in the “moment of now.” Want a song you just heard on the radio? Log onto Spotify. Want to watch a new movie? Log onto Amazon Prime or Netflix. Want a cheeseburger? Order from Grubhub. Want a date? Log onto Tinder. Today’s students play video games with people across the globe, watch Reels and TikTok videos from anonymous media creators, and connect on Facebook Marketplace with people they have never met. Yet they still derive full value and fulfillment from those experiences and relationships.
While this shift in student expectations is evident, faculty believe they will be protected by “their relationships” with their students. Too many faculty suffer from a fatal bias about AI. When I speak to faculty across the academy about the disruptive threat that AI poses to higher education, and more specifically to their jobs, I get a familiar retort: “AI could never do my job. It lacks my deep disciplinary training, and more so, AI could never replace the relationship I have with my students.” Such a sentiment reflects a deeply flawed way of thinking and a serious underestimation of AI’s capabilities.
As someone who teaches technology, innovation and disruption, I find the best case study for illustrating these dynamics is Kodak, the once-great film company. While many teach the case from a technological perspective, arguing over whether Kodak could have responded differently to the advent of digital photography, the company’s demise is best approached from a human-bias perspective. Kodak should have dominated digital photography; it invented the technology! Yet Kodak employees at all levels of the company could not believe that digital would replace film; they were so biased in their thinking about film that they underestimated the power of digital cameras. How did that work out for them? It is important that faculty across the country learn the Kodak lesson. Biases and perceptions do not reflect reality.
Whether or not you believe that AI could take your job as a professor is immaterial. The only person who matters in this equation is the student.
One other point: Administrators are not the problem. They recognize this shift in student behavior and expectations. In today’s corporate higher education model, administrators value two things above all else: cost savings and customer service. AI plays to both. Simply put, AI can scale; human professors cannot.
Don’t Let the AI Camel’s Nose in the Tent
How do we respond as a faculty? In the short term, think of the old parable: Do not let the camel’s nose into the tent, or the rest of the camel will follow.
- Stress critical thinking within your discipline. Currently, AI relies on data-driven models that handle ambiguity poorly. Within your respective disciplines, find the gray areas that require debate and reflection. This applies to all disciplines, from the humanities and sciences to business and engineering; no field will be immune from AI’s reach.
- Make experiential activities central to the class. Whether through labs, field trips, or guest speakers, endeavor to get students out of the classroom and expose them to other perspectives. Faculty-driven internships need to be part of the curriculum if they are not already.
- Do not feed the AI monster: Faculty would be wise to abandon ChatGPT entirely. The goal is to engage more with the student, so, for example, shift modes of assessment from written to oral exams. And do not get lazy! Sorry, but you need to continue to develop your own syllabus, craft your own PowerPoints, and grade your own exams.
- Protect your intellectual property. AI only works with available data, so do not make your lectures, videos, notes, and other intellectual property available to it. If you built it, it’s yours. Don’t give it up in the short term, because you will pay for it in the long term when AI is reusing your content.
- Decline participation in AI pilot projects. When your dean or vice provost offers a $500 grant to introduce AI-enabled grading, tell them you are doing just fine. Keep using Scantron. There’s no risk of a Scantron taking your job.
Finally, beware of the false narrative beginning to circulate within the academy that AI will free up the professor for more “high-touch engagement” with students, or that AI can be a partner in the classroom that makes you a better professor. For example, do not pay heed to the notion that bringing AI in to help with teaching will free you to help students land jobs or provide career counseling. Sorry, but AI already does job searching better than you. When it comes to customer service, course advising, and career counseling, let AI have those tasks; when it comes to the classroom, keep AI on the outside looking in.
As Roy Amara, the acclaimed futurist, famously observed, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” In higher education, we are past the short run and entering the long run. Conversations about AI and faculty are starting to heat up. While compelling arguments can be made about AI and efficiency, more pointed commentaries are satirically asking whether we need faculty at all. Now is the time for faculty to ask big questions, not just jump on the AI bandwagon.