Last year, 2023, will be known as the year that artificial intelligence entered the chat and dominated the conversation. Higher ed has been no exception—from the classroom to HR to admissions, university students, staff and faculty have been figuring out how to wrangle the massive potential (and dilemmas) posed by generative AI technologies.

If there’s one thing universities love, it’s a good policy, and it’s been heartening to see many institutions creating comprehensive policies addressing the use of AI tools and approaches. (In December, Educause published a good guide to creating an institutional policy regarding the use of generative AI.)

While policies that broadly address copyright, data privacy, academic integrity and other topics are critically important to create (and keep revising, as our understanding evolves), what about higher ed marketing and communications teams? Higher ed marcomm teams serve a unique function for the institution—a function directly tied to advancing its business goals and building affinity and trust with key audiences and constituencies.

And as we know, there has been an explosion of chatter around how AI tools can make that function a ton easier—producing high-quality content with ease, creating personalized experiences that engage and convert at remarkable rates, making quick work of search engine optimization efforts, and more.

While there is significant potential, caution is also warranted. And broad generative AI policies may not provide actionable guidance for a higher ed marketing communications team. What uses are OK and where should we steer clear? How does using these tools change our workflow? How do we account for the brand in AI-generated content? And how do we deal with bias?

A survey of the landscape revealed precious few higher ed marketing and communication–focused generative AI policies. But there are a few worth highlighting.

  1. North Carolina State University Extension

Starting with an overview is helpful, since not everyone may be on the same page about what AI is or means. It is also good to clarify that this is a living document of guidelines rather than a formal policy; with some legal waters still muddy, that distinction could prove prudent.

N.C. State also specifically addresses prompt writing, which is an essential skill for effective use of generative AI tools. It would not be surprising to see internal training on prompt writing pop up on campuses in the next year or two.

The page helpfully notes that it was last updated in November 2023—as we create and revise policies and guidelines for our teams on a topic that is very much in flux, dating these documents is key.

N.C. State also explicitly encourages “thoughtful” engagement and experimentation with AI.

  2. University of Utah

Utah’s AI guidelines are clear and helpful, including background on AI, guiding principles for usage (notably phrased with nine “we believe” statements) and examples of acceptable use (as well as examples of explicitly prohibited uses).

But the most notable thing here is that the university has a marketing and communications AI use working group. Encouraging experimentation is great, but convening a body of marcomm professionals from across the university to share learnings and contribute to the ongoing evolution of these guidelines is next level.

  3. University of Wisconsin at Milwaukee

This comprehensive set of guidelines covers similar territory in terms of overview, guiding principles, acceptable and prohibited use, and warnings about bias. But the best thing UW Milwaukee does here is outline a recommended production process for using AI tools to generate marketing content.

The process includes a project brief on how AI will be used, an approach for human review and revision of generated content, the use of approved style guides to ensure brand alignment, and more.

As exploration shifts into practice, defining responsible processes for incorporating AI tools into your marketing project workflow will prove increasingly valuable.

Best Practices for Generative AI Policies in Higher Ed Marketing Communications

Based on a review of these and other generative AI guidelines, here are the elements to consider for your marcomm team’s policy:

  • Brief overview of AI: it does not need to be exhaustive and can link to other authoritative sources, but it should touch on what AI is, what it can and can’t do, definitions of key terms, the current landscape (legal and otherwise), and limitations or parameters related to LLM (large language model) training and data sets
  • Guiding principles/best practices/dos and don’ts on how to use generative AI tools
  • Examples of appropriate and prohibited usage
  • Guidance on how to properly cite AI-generated content in publications
  • Warnings about the proliferation of bias in AI-generated content and guidance on how to control for this
  • Data privacy guidance (regarding the sharing of confidential data and personal information—this could go beyond things like students’ personal information to include survey responses or email replies with personally identifiable information!)
  • Links to recommended AI tools and resources (preferably vetted or regularly referenced by your own team)
  • Links to other relevant policies or resources at your university (bonus points for a link to an AI working group!)
  • Assertion of the need for human review of AI-generated content and of personal responsibility for the content generated
  • Explicit encouragement of experimentation with AI tools (with guidance on how to do so effectively)
  • Introductory guidance on prompt writing
  • Recommended process for using AI tools in your project workflow
  • Disclaimers about AI being an evolving field and that the guidelines are a living document and not a formal policy (noting the date they were last updated)

Creating and maintaining an evolving set of guidelines for a practice that is very much in flux may run counter to higher ed’s predilection for fixed and vetted policies. But with the impact of generative AI on our work only poised to grow, it is imperative that higher ed adopt a nimble approach to guiding marketing and communications teams to leverage these technologies appropriately and effectively.

Georgy Cohen is director of digital strategy at OHO Interactive.