“We see [artificial intelligence] as an industrial revolution for knowledge work.” So predicts a managing partner at Bain & Company in a press release announcing a new partnership with OpenAI. The barrage of commentaries (and late-night TV segments) since ChatGPT launched in November suggests the revolution is just beginning. This is especially the case in schools and universities, where AI is already transforming teaching, learning, research and regulations (not to mention public discourse). AI tools like ChatGPT have been embraced, questioned, critiqued and lamented by educators.
As faculty committed to supporting students’ intellectual development and preparing them to build successful, sustainable careers, we’re troubled—and compelled—by the seemingly inevitable yet presently opaque changes AI is bringing to our world. Economists and computer scientists are uncertain of the specific contours of AI’s impact on the workforce, but they are certain the effect will not only be significant but will also have the potential to exacerbate inequality. How should educators respond to these predictions? What kind of workforce are we preparing our students to enter? What if the work they aspire to do—developing apps, managing businesses, practicing medicine, working in public service, communications, marketing or even education—is mostly being done by AI when they reach their professional primes? Should we stop worrying about reading, writing and research and instead focus on algorithms and coding? Should we deepen our emphasis on creativity and critical thinking in hopes that our humanness will prevail? And how can we center justice and stem growing inequalities in light of this technology?
We teach at very different institutions—a public community college, a selective public regional university and a private research university—but our takes on AI have converged. We believe we must engage our students in critical reflections on AI, while also adapting our courses to responsibly address concerns about it. But we also suspect that our pedagogical concerns might be missing the forest for the trees. The longer-term prospects of an “industrial revolution for knowledge work” require a more far-reaching response from educators. For this reason, we are recommitting to several pedagogical practices that are crucial to ensuring students—regardless of level or institution type—thrive in a world shaped by this new technology.
First, we must reckon with the reality that tomorrow’s workforce conditions will differ from the ones we had in mind when we designed our curricula. When colleges anticipate employers’ needs, they build structures to help students achieve economic mobility for themselves and future generations. Workforce development is equity work. The rapid uptake of AI across industries shows that students need to be ready to adapt to changing conditions. For example, they need to be able to communicate in a variety of modes and genres and to evaluate information rather than simply take it in. We can foster skills in these areas by scaffolding assignments to include more in-class writing activities, discussions, debates and presentations. Using formative assessments will make it more likely that students focus on the learning process rather than just on final products that AI can easily simulate.
To take full advantage of our students’ emerging expertise, we must also commit to designing assignments that challenge them to integrate experiential knowledge as a scholarly resource. The experiences students bring to the classroom, whether they are first-year undergraduates or master’s candidates, enrich the knowledge in the room and make it possible for them to consider the social and ethical implications of their work. This is something AI tools cannot replicate. In fact, the technology’s amoral nature may be its most dangerous limitation. By contrast, we have seen firsthand how students’ personal experiences with issues such as gentrification, immigration, social media usage and educational inequality have enabled them to craft sophisticated research questions and develop innovative proposals for action. Students’ lived understandings of these issues allow them to develop visions of equity and justice rooted in reality, and they have the potential to shift the terms of scholarly conversations.
Lastly, and perhaps most importantly, we must recognize that ChatGPT relies on existing knowledge rather than generating new ideas. AI tools need human creators to continuously add to their store of information in order to advance. Rather than trying to outsmart AI, our job is to educate and empower principled creators. We need to design more opportunities for students at all levels to do original research, participate in fieldwork, co-create with peers, conduct interviews, collect data and leverage their insights and experiences to advance society. These are things that AI tools cannot do.
Teaching and research are our forms of knowledge work. We value this work because of what it makes possible for our students. We will defend it—and our students’ potential as emerging experts and professionals—as best we can. In a world where our jobs as knowledge workers are not guaranteed to be safe, we feel called to double down on ensuring each of our students can adapt to what’s next. We welcome the challenge of embracing our students’ real experiences and immersing them in real work in the real world from the first day of their first semester on our campuses. We see their creativity and knowledge daily, and we are excited to live in the future they will create.