Can AI Be Used to Cheat on Multiple-Choice Exams?

A Florida State professor found a way to catch AI cheating on multiple-choice tests. He also found that ChatGPT got a lot of “easy” questions wrong.

Opinion

Upskilling, Reskilling or Retiring: Responding to the Advent of AI

The anticipated replacement of human workers with generative AI apps has begun. Earlier this year, IBM announced about 8,000 layoffs amid an AI-powered initiative.

In Teaching With Gen AI, Consider Sustainability

Faculty lack information about generative AI’s environmental impacts, and universities should prioritize sustainable computing, Susanne Hall writes.

UC Berkeley Launches AI-Focused Law Degree

The University of California, Berkeley, Law School is now offering an artificial intelligence–focused law degree in response to a growing...

Struggling to Create AI Policies? Ask Your Students

A professor at Florida International University tasked her students with devising an ethical guide to using AI in their classes—and found them to be stricter than she would have been.

Special Report: Reducing Points of Friction With AI

Much of the discourse on artificial intelligence in higher education since the introduction of ChatGPT has focused on classroom applications...

AAC&U, Elon University Launch AI Guide for Students 

The American Association of Colleges and Universities and Elon University have launched an artificial intelligence how-to guide for students navigating...

When Professors Partner With Police

Universities are leveraging AI to help police overcome bias in crime fighting—while contending with the technology’s own biases.