
Many faculty members across North America have sat on the sidelines in 2023, hoping for someone or something to keep generative artificial intelligence out of their institutions. The primary lesson we all learned, however, was that no one can save us from AI but ourselves.

Many of our colleges’ students are already using generative AI without guidance, and many recent graduates say they feel unprepared for jobs because of AI. Businesses of all types have embraced generative AI, and significantly more say they plan to do so in the coming years. The trend seems likely to accelerate because early research suggests that AI can save workers time and improve the quality of their work.

So, as we look forward to starting a new year, the academy can no longer live in denial. We must help our students learn to use generative AI in effective, ethical ways in their learning and, eventually, in their careers and lives as citizens. In other words, the academy must step off the sidelines and into the scrum, especially as AI continues to change rapidly.

What Instructors Can Do Now

We’ve all received abundant advice about how to prepare students and ourselves for using generative AI in class, and many instructors have used the past year to learn new skills. But if you are still feeling uncertain about, or overwhelmed by, the pace and scale of change that generative AI is bringing, that’s OK. You can still take some easy, concrete steps to begin moving your classes and students into the age of generative AI.

In the examples that follow, we offer both some basic approaches (“good”), as well as more nuanced ways (“even better”) to integrate generative AI into teaching and learning. The key is to start making changes now.

  • Good: Create syllabus language about AI use. Yes, this is basic, but including guidelines and expectations for how students may and may not use generative AI in your classes is crucial. With clarity and transparency about what to expect, students will be better able to make good decisions. Examples of syllabus language abound. The University of Toronto and the University of South Florida offer examples you could adapt for your own class.
  • Even better: Draft a plan with students. Add a statement to your syllabus that the class will collectively develop a plan for appropriate class use of generative AI. Empowering students in the process will improve their motivation to follow the policy.
  • Good: Plan conversations about AI. Help students understand how generative AI tools were created, how they work and why they must be used with caution. Include topics such as accuracy of information, stereotyping and biases in generative AI; appropriate use of generative AI; data privacy; and equitable access to generative AI. Kathryn Conrad’s Blueprint for an AI Bill of Rights for Education and the AI Pedagogy Project are good starting points. The Center for Teaching Excellence at the University of Kansas also offers materials on helping students understand biases in AI and the ethical use of AI in writing assignments. The emerging field of explainable AI can also provide areas for discussion.
  • Even better: Embed exploration of ethical practice into assessments. Create an assignment that asks students to explore the ethics of generative AI use in your discipline.
  • Good: Create an exploratory assignment. Identify an assignment in which students must use AI tools. This exercise will help students—and you—learn about the workings of AI. It will help you better guide students in its use and generate ideas about how to adapt future assignments. The Center for Teaching Excellence at Kansas has curated a list of tools you might consider, or you can find your own at such sites as Futurepedia and There’s an AI for That.
  • Even better: Take this approach with all your assignments. The more opportunities students have to develop facility with generative AI during structured activities, the better they will understand their role and responsibility in using the technology.

Deepening Your Practice

Once you have the basics down, consider deepening your use of generative AI as a teaching and learning tool. Here are five approaches that we and colleagues have found to be helpful.

  1. Evaluate assignments through the lens of generative AI. Consider which activities are most vulnerable to misunderstanding, misuse or bias if students use generative AI. Then explore how you might better integrate AI into those assignments, focusing students’ attention on ensuring responsible practice. You might have a chatbot create questions for class discussion or ask students to use different chatbots for the same assignment and report back to the class about their experiences.

The Center for Teaching Excellence at Kansas provides an example of how Bing Chat can tutor students in a research and writing assignment. Ethan Mollick, an associate professor at the University of Pennsylvania who studies and teaches innovation and entrepreneurship, also offers examples of using AI in assignments. Anna Mills, an English instructor at the College of Marin, has curated an extensive list of AI-related teaching materials. Sites like AutomatED, the Sentient Syllabus Project and the online community AI in Education provide additional approaches. If you are feeling ambitious, you can even use ChatGPT to create a class.

  2. Add a methods or reflection component to assignments. Academic research usually contains a methods section in which authors explain how they gathered and analyzed data, as well as how they arrived at their conclusions. Faculty members can treat generative AI as another method for completing tasks or assignments and arriving at conclusions. Students could describe their use of generative AI just as they would any other method. APA Style, the Chicago Manual of Style and the MLA Style Center have all published guidelines on citing and acknowledging use of generative AI.

Alternatively, instructors could have students briefly reflect on the approach they used for their work. How did they use generative AI? How did they adapt the output so that the work became their own? These practices encourage transparency and also help students improve their metacognition, the understanding of their own learning and thought processes.

  3. Allow students to work on assignments and activities in class using AI. Work performed in class will allow you to observe students and guide them to use AI tools in valuable and appropriate ways. Assignments or quizzes must be short so students can complete them during limited class time. They should also be low stakes, as some students don’t work well under time pressure. If you need to free up class time for that type of work, consider a flipped class model, which offers flexibility for interacting with and observing students.
  4. Do oral checks of students’ understanding. If you are concerned that students aren’t doing assignments on their own, set up short individual meetings. Ask students to lead you through the process of their work and to explain key concepts, ideas or methods. As they do, it is usually easy to tell who has done the work and who hasn’t. These meetings don’t have to be long, perhaps five minutes, and they can often be done in class. If you suspect problems, set up a longer meeting with the student to discuss academic integrity.
  5. Adopt authentic assessments. Authentic assessment lets students apply their developing knowledge to real-world situations, and that contextualization can improve relevance and motivation. Such assessments can take many forms, depending on the discipline. Generally, authentic assignments apply content in ways that students are likely to encounter in their careers, that allow students to share their learning outside the class, or that encourage students to connect disciplinary thinking to other fields, their own lives, or a general audience. For example:
  • Students in a chemistry class create posters about how chemical interactions affect everyday life (hand washing, water purification).
  • Students in a psychology class write an op-ed article applying the principles of psychology to inform an event in the news.
  • Students in a journalism class work with a nonprofit agency to create messaging about mental health resources for high school students.
  • Students in a physics class create a poster describing what might have destroyed a deep-water submersible vessel.
  • Students in a biology class hold an end-of-semester poster session for which they create displays and activities to help attendees learn about threatened species.
  • Students in a film and media studies class create video and social media messages about the impact of digital literacy and fluency on teen deaths by suicide.

Generative AI offers powerful new opportunities to expand authentic assignments and infuse them with technological skills that students will need in careers and their daily lives. In each of the examples above, generative AI could initiate ideas, provide examples, create images and illustrations, design posters and brochures, and draft materials. You can also use it to create discipline-specific case studies or interactive scenarios in which students grapple with real-world problems. Or you could challenge students to envision using generative AI in various professions, both to learn more about potential careers and to consider how the technology might change a profession. The teaching center at South Florida has created a document on how to approach these areas.

Just Another Tool

Increased familiarity and facility with generative AI will help you and your students see it as just another technological tool, not a fatal foe. AI hasn’t broken education. Neither did the internet, smartphones, Wikipedia, digital search, Wi-Fi, ebooks or other technologies that have emerged over recent decades. Yes, teaching practices must change, and that will require time and resources. But colleges and universities, and we as academic professionals, should consider this work to be an investment in the future that will help shape the standards of a new AI-infused world.

Doug Ward, associate director, Center for Teaching Excellence, and associate professor of journalism and mass communications, the University of Kansas

Alison Gibbs, professor, teaching stream, and director of the Centre for Teaching Support & Innovation, the University of Toronto

Tim Henkel, assistant vice provost for teaching and learning, the University of South Florida

Heidi G. Loshbaugh, senior research associate, the University of Colorado Boulder

Greg Siering, director of the Center for Innovative Teaching and Learning, Indiana University

Jim Williamson, director of educational technology systems and administration, the University of California, Los Angeles

Mark Kayser, instructional designer, the University of California, Los Angeles
