Some faculty members recharge in summer; they exhale while tweaking assignments or course policies for the upcoming year. Other instructors face challenges that are unique to summer—they may struggle to find childcare while teaching multiple courses, sometimes on multiple campuses, during an accelerated term and often to students in need of remediation.

But Microsoft and Google are moving forward with integrating artificial intelligence text generation into the environments where modern humans write. As the pace of progress in AI writing tools accelerates, faculty members across the summer spectrum face a shared challenge: How can they upskill in AI for teaching and learning—fast?

Some academics and institutions are offering AI faculty workshops or, in the words of Anna Mills, English instructor at the College of Marin, “safe spaces where we don’t feel overwhelmed by the fire hose of [AI] information and hot takes.”

In these summer faculty AI workshops, some plan to take their first tentative steps in redesigning assignments for the AI-infused landscape. Others expect to evolve their in-progress teaching-with-AI practices. At some colleges, full-time staff will deliver the workshops or institutions will pay participants for professional development time. But some offerings are grassroots efforts delivered by faculty volunteers and attended by participants on their own time. Even so, many worry that the efforts will fall short of meeting demand.

“We need to get to that point where we feel some agency again,” Mills said of faculty perceptions of AI in teaching and learning. Faculty need to articulate “what we want to advocate for as teachers, where we’re not intimidated, where we understand that [AI] will continue to change, but there are certain basic concepts around what language models are that we can grasp and that are not going to change, at least not foreseeably soon.”

‘Some Are Angry’

During the spring semester, many college librarians stepped up to help faculty members navigate the disruption caused by ChatGPT-like tools. For example, Melissa Adams, librarian at Tacoma Community College, and her colleagues set up a faculty resource page that offers an introduction to generative AI. They also offered two professional development workshops on AI in teaching and learning, monthly pedagogical discussions, and a cloud-based SharePoint file where faculty could ask questions and contribute ideas. The institution is also planning funded communities of practice for faculty to share ideas about leveraging AI for critical thinking.

But those efforts fall short, according to Adams. Most faculty continue to project “discomfort, confusion or fear” about what AI means for their disciplines and student learning.

“We’re a little bit behind the curve,” Adams said. “We have people picking up the slack and trying to center [upskilling in AI] more for our faculty and our students. But there’s a little bit of frustration there that there is not enough support.”

Many Tacoma faculty teach part time or full time in the summer, which means they may continue to struggle with, or to leverage, the disruptive technology in the same ways they did in the spring semester, Adams said.

“Some are angry, too,” Adams said. “Some see this as the demise of the ability to write and think … We need a wider conversation and wider support throughout campus.”

The Salt Lake Community College libraries have also created a community resource page to offer faculty, staff and students a starting place for learning responsible use of generative AI. This week, the institution’s librarians and the Writing Across the Curriculum director will launch the first of three virtual, synchronous summer AI workshops for faculty. In the workshops, each of the 30 registered participants will adapt at least one of their assignments or teaching strategies to embrace AI. The team considers that enrollment robust for a summer faculty workshop. But the number represents only a fraction of the nearly 800 instructors the institution employs, according to Amy Scheelke, instruction and liaison librarian at Salt Lake.

Even as faculty redesign assignments to adapt to the presence of generative AI tools, Scheelke and her colleagues are concerned about digital equity moving forward. Salt Lake students, many of whom come from lower-income brackets or struggle to connect to the internet, are currently using ChatGPT and Bard, which are available as free research previews.

“But what happens when there aren’t free research previews anymore? Or when the proprietary version of ChatGPT gets significantly better than the free version?” Scheelke asked. “If faculty don’t consider that now as they retool their assignments to include some of these tools, that could have negative implications in the long run for our students if it’s suddenly that you have to use this tool, but now it’s only a paid version.”

Scheelke said that a Salt Lake staff member reached out to OpenAI, the developer of ChatGPT, two months ago to ask for a quote for the library to provide access. But OpenAI has not responded.

“Are these companies going to be willing to work with academic libraries if we’re buying subscriptions, or are we too small a fish?” Scheelke asked. “How are we going to provide equitable access for our students without asking them to purchase one more thing?”

At other institutions, instructors who gained facility with the tools during the spring semester are offering summer workshops for their colleagues, often on their own time.

“We still have faculty members who are just not sure about AI,” said Laura Dumin, professor of English and director of the technical writing program at the University of Central Oklahoma.

In the spring semester, Dumin integrated AI literacy into her classes to help boost students’ knowledge about the strengths and weaknesses of generative AI writing tools. She also offered students opportunities to experiment, write reflections and discuss the appropriate use of the tools. Dumin credits those efforts with averting the cheating concerns that some of her colleagues faced. Now, she plans to lead a summer AI workshop guiding other faculty members through the redesign of one of their assignments to incorporate AI. She has capped the workshop at 20 participants to foster discussion.

“If I had the time, and if I were being paid for it, it might be different,” Dumin said, adding that she aspires to help contingent faculty. “But [summer] is my time off to do my own work.”

Dumin would also like to see more institutional funding devoted to training.

“Instead of running from it or assuming the faculty will find their way on their own, institutions need to put some money into ensuring that their faculty are being trained in ways that make them comfortable and in ways that help them to understand this new landscape,” Dumin said. A University of Central Oklahoma spokesperson said the institution is developing workshops this summer that will be delivered in the fall.

Marc Watkins, lecturer in composition and rhetoric; Stephen Monroe, chair and assistant professor of writing and rhetoric; and Lori Nichols, data scientist—all at the University of Mississippi—secured funding for participants in their AI Summer Institute for Teachers of Writing.

“Over half of the participants who signed up for the pilot are non-tenure-track teachers of writing,” Watkins said. Even so, the team received twice as many applications as it could accommodate. To meet the extra demand, they plan to develop a version of the training as a stand-alone online course. They are also developing a version for K-12 teachers and administrators.

“Spring was overwhelming for everyone in terms of dealing with ChatGPT and other generative AI,” Watkins said, adding that he spoke with nearly two dozen faculty members about suspected cases of academic dishonesty using generative text. Most of the affected faculty did not penalize the students, but the labor involved in turning cases into teachable moments was exhausting, Watkins said.

‘We Have to Be Graceful’ With Faculty

AI advancements are moving quickly—so quickly that some academics worry that opting out may mean forgoing opportunities to influence AI’s trajectory in higher ed. The flip side is that those who engage early may help direct its appropriate use in teaching and learning.

“Now that we’re over the initial kind of fear of ‘will we ever write again?’ … [we can] constructively think through what the promise of AI could be in terms of enhancing what’s possible,” said Asim Ali, executive director of the Biggio Center for the Enhancement of Teaching and Learning at Auburn University. “That’s really a compelling, exciting place to be.”

In April, Auburn launched a self-directed, fully online course to help faculty upskill in the wake of ChatGPT’s release. Approximately 120 people pursued the course during the spring semester. Now that summer has arrived, that number has more than doubled to 300, which represents approximately 15 percent of the faculty, Ali said. By the end of the summer, Ali expects that approximately 500 faculty members at the institution will have taken the course.

The online modality, which Ali credits with boosting the programming’s reach, allows faculty to upskill at times, paces and levels of engagement that suit them. Other institutions appear to agree. Vanderbilt University and Stanford University, among other institutions, will offer virtual summer AI workshops.

“We have to be graceful with folks and meet them where they are,” Ali said. “That means that some are ready for transformative work, and some are ready to learn but need to be a little bit more selective … We need to be ready for all different levels and types of engagement.”

Marc Ebenfeld, director of the Center for Excellence in Teaching and Learning at the University of New England, is also involved in planning summer AI workshops for faculty on the institution’s Portland and Biddeford, Me., campuses. Faculty who participate will receive a “very, very modest” stipend and lunch. (Ebenfeld declined to say how much.) Faculty who gained experience in the spring semester teaching with AI in sociology, ethics, art, math and computer coding courses will help lead the workshops.

The New England team plans to highlight how to use AI to reduce the burden of repetitious tasks in academe, such as writing reports, letters for colleagues who served on committees or letters of recommendation for students, Ebenfeld said.

“It does a credible job” on such tasks, Ebenfeld said. “Obviously you want to touch it up. But … it’s pretty good when you’re using it on creative mode, which is GPT-4, on purple prose—using lots of flowery adjectives.” The team may also highlight how AI tools can create glossaries of terms, lesson plans and quizzes from articles or videos, Ebenfeld said.

But opportunities can quickly morph into risks.

“If faculty members are generating their teaching assignments with language models, and students answer them with language models, then we’re just hooking it all up to the [learning management system], and off we all go,” Ebenfeld said. “That would be a potential negative, but there’re also immense positives … AI really forces our hand to do that [critical thinking] that we say is our focus.”
