When it comes to artificial intelligence and higher ed, the excitement and hype are matched by the uncertainties and need for guidance. One solution: creating an AI advisory board that brings together students, faculty and staff for open conversations about the new technology.
That was a key idea presented at the University of Central Florida’s inaugural Teaching and Learning With AI conference, a two-day event that drew more than 500 educators from around the country.
AI has had a “breakout year,” said Ray Schroeder, a senior fellow of the University Professional and Continuing Education Association (and a contributor to Inside Higher Ed). Schroeder, who has recently focused on the intersection of AI and higher education, opened the conference by aiming to help faculty, administrators and staff navigate the choppy waters of AI.
“We cannot afford to ignore it,” he said. “The intent is to make clear, ‘What is the intention of the university? How are they going to move with this technology?’”
Schroeder and other experts interviewed said universities need a formal mechanism for getting advice on how to proceed.
“Artificial intelligence is a technology that impacts nearly every aspect of higher education institutions—recruiting, admissions, financial aid, student support services, teaching and learning, assessment, operation, and more,” said Kristina Ishmael, deputy director of the Education Department’s Office of Educational Technology.
Ishmael said in an email to Inside Higher Ed that the department’s top recommendation about AI is to “‘emphasize humans in the loop.’ Institutions that choose to create an AI advisory board, or a similar group, would be implementing this recommendation.”
AI Boards Arrive
Many universities are already pursuing that advice. The University of Louisville had its first AI advisory board meeting last week. Stanford University and Vanderbilt University formed boards earlier this year after investing millions in AI research on campus.
Northeastern University created an external AI board, co-chaired by two faculty members and joined by industry heavyweights including Honeywell and the Mayo Clinic.
The University of Michigan unveiled its 18-member advisory board in May, a group tasked with creating a report centered on best practices for generative AI.
The Michigan board was the brainchild of Ravi Pendse, the university’s vice president for information technology, who chatted with fellow faculty members about AI at the start of the year.
“I said, ‘We need to get a faculty group together to provide general guidance to the campus,’” said Pendse, who also serves as the university’s chief information officer. “We need to make sure we consider this technology, frankly, with our eyes wide-open and feet on the ground, so we embrace it, but do so thoughtfully.”
Building an AI Board
There is no perfect blueprint for building an AI advisory board, and the approach will vary for each college or university. However, the experts interviewed noted important factors to make it work.
- Diversity of thought is key—ensure both faculty and staff are included in the group. “You have five fingers on your hand and they’re all diverse, but if they want to do something, they have to come together,” Pendse of the University of Michigan said. “We brought a diverse group together, and what you see in the work that’s followed has been from a creative diversity of thought leaders.”
- Make sure to include students. “Students should be involved in this process to ensure there is clarity, because that is what many students are currently seeking,” said Ishmael of the Education Department.
- Look at technology leaders. “The one thing I would recommend is having people who work on the tech in the room,” said Noah Smith, a senior director at the Allen Institute for AI. “So many people have become self-proclaimed AI experts. There’s a certain knowledge when you have the underlying pieces of technology that allows you to give useful insight and avoid the weird conclusions people come up with.”
- Be flexible rather than setting rigid rules. “I do think if universities move too fast to set rules, that could backfire, because the technology is going to keep changing,” Smith said, pointing to when calculators were first introduced and many classes banned them. “Institutions may tie their hands behind their back [with strict bans].”
- In addition to an internal board, consider an external board. “Part of what institutions need to know, from employers, is what applications you see for your field—and those are changing,” said Schroeder of UPCEA. “Right now we’re in the midst of it … where expectations are rising and the software’s changing. It’s hard to set a time limit, but maybe for the next three years, it’s going to be dramatic with the amount of change.”
- Talk to other institutions. “Share policies with other institutions so the wheel is not recreated hundreds of times,” Ishmael said, pointing toward a report released in May by the Education Department with insights and recommendations for AI. “Join affinity groups and/or communities of practice to learn from others and share that learning with your institution.”
- Focus on what’s best for your institution. “I get worried about these ‘follow the leader’ things,” Smith said, when asked about some institutions waiting for large technology-focused universities like Stanford or the Massachusetts Institute of Technology to set their own policies first. “I think you have to try a lot of things, and we will have a better idea if we take responsibility and share it [with policy setting]. We’ll find a way forward or multiple ways forward.”
Options Beyond Boards
An AI advisory board may not work for every institution, and there are other approaches. Schroeder suggested a workshop for faculty could suffice. Pendse pointed toward having a lunch-and-learn series and said that the key, whatever the format, is to encourage discussion.
“These discussions are already happening,” Pendse said. “We want to know what this thing is, to use it, to leverage it, to debate it. And the way you do that is providing safe spaces where this debate can happen and engage with each other.”
For the institutions forming boards, however, the effort can serve a dual purpose, according to experts. There’s the value it delivers to students, which, Schroeder said, can help prepare them for the changing workforce.
“For students, I think it’s a matter of tapping their expectations,” he said. “And I think their expectations are driven in part, perhaps in large part, by the expectations of employers.”
Those students, once prepared, can boost discussion and ultimately help with AI research in the future, creating a flywheel effect.
“We can’t just sit idly in this country; other countries are investing, so we need to be flying, not running or walking,” Pendse said. “And the only way we can is with institutions contributing to the AI talent—that will create policy makers, people who can debate the pros and cons. That’s how we can compete in the world.”