
Faculty are concerned that a recent change to Zoom's terms of use could put their data at risk.

Photo illustration by Justin Morrison/Inside Higher Ed | Zoom | Getty Images | Rawpixel

Teleconference apps like Zoom have become synonymous with the modern classroom, with faculty relying on the services for everything from teaching and meetings to research projects.

But Zoom's recently updated terms and conditions, and the company's subsequent backtracking after an outcry, have left higher education faculty and experts with questions and concerns.

“We’ve become so dependent on things like Zoom and that dependence allows them to change things without consulting its users,” said Sukrit Venkatagiri, assistant professor in the Department of Computer Science at Swarthmore College.

On Monday, chatter began on X, the social network formerly known as Twitter, about Zoom's updated terms and conditions. Most notably, the new terms stated that Zoom would have access to all customer data and could use it to train its artificial intelligence and machine learning systems.

Concern began brewing, with many faculty members joining other Zoom users in publicly sharing their outrage. Some vowed to stop using Zoom altogether.

Zoom quickly reversed course late Monday night, updating its terms and conditions to clarify it will not use audio, video or chats to train AI without user consent.

“Zoom customers decide whether to enable generative AI features, and separately whether to share customer content with Zoom for product improvement purposes,” Zoom said in a statement. “We’ve updated our terms of service to further confirm that we will not use audio, video, or chat customer content to train our artificial intelligence models without your consent.”

But what “consent” means remains vague. Terms and conditions are typically supposed to be broad, according to Sean Hogle, an attorney specializing in commercial data and intellectual property law. With the rise of AI comes a scramble to incorporate new language.

“All these companies want to monetize this data for AI,” Hogle said, noting that tech firms realized in the last eight months that “they’re sitting on a potential gold mine.”

“Now with AI and machine learning, they’re realizing that [the customer services] clause needs to be beefed up to do a lot more and they’re rushing to do that now,” he said.

The Zoom changes and resulting controversy come at a particularly bad time for higher ed, said Jim McGrath, an instructional designer at the Center for Teaching Innovation at Salem State University.

“The fact this is happening in early August where some universities have been teaching or making plans shows the disregard of core educational values we’re trying to foster and encourage here,” he said. “And the clarifications they make often raise more questions of what these companies are doing with this black box of data.”

The ambiguity has left some in the lurch. Swarthmore College’s Venkatagiri is concerned about using Zoom to meet with research participants.

“If there’s even a slight possibility the data or photo or transcript becomes part of a database that I don’t have control over, how can I in good conscience continue to conduct research when I can’t guarantee their privacy?” he said.

Swarthmore is one of several higher education institutions that have their own contracts with Zoom, which could mean either more or less stringent terms and conditions.

Venkatagiri said that while he verifies Swarthmore’s exact Zoom terms, he will keep his research off the platform for the time being.

Universities at large could have a chance to leverage their size to change or update their own contracts, according to Hannah Quay-de la Vallee, senior technologist at the Center for Democracy and Technology.

“The bigger you are, that’s legitimate power that individuals don’t necessarily have,” she said, adding that universities could ask for an additional contract specifying the rules around their use of Zoom services. “It’s a question worth asking.”

In the interim, faculty are pushing for administrators and IT officials to keep communication lines open so that the onus of vetting every piece of technology does not fall on faculty themselves.

“We do run into these things where faculty feel they have to become the expert and all the weight is put on them,” McGrath said. “We’re trying to create spaces where instructors can talk about this stuff and be compensated for it but it’s really challenging. Like what’s happening with Zoom: We’ve seen excitement around students being able to beam into courses and now there’s a big question mark.”


As for alternatives to use in the meantime, platforms like Microsoft Teams may have similar conditions now or in the future, according to Darren Laur, chief training officer at White Hatter, an internet safety and digital literacy education company.

Laur has turned to building his own platform, but homegrown platforms come with their own headaches.

“What we do is not cheap and most schools don’t have the money to put the same tech into their schools, which is why they turn to third-party vendors,” he said, underlining the importance of knowing the privacy issues associated with third-party platforms.

As AI continues to permeate classrooms, admissions essays and dining halls, Quay-de la Vallee said it is “inevitable” that tech companies will continue to leverage it.

“The ‘If we don’t do it, someone else will,’ reason has paved the road to hell for a lot of tech stuff,” she said. “It’s true and also a terrible reason to do things.”

How companies go about it, though, may look slightly different, in part due to the recent backlash against Zoom.

“Maybe other companies will hold off, but one thing I’d expect to see is much more tailored language, of, ‘This is what we’re doing and this is what it will look like.’ Which is good—be more explicit in how you’re going to do it.”
