

The higher education sector is awash with hype about the potential of rapidly evolving generative AI tools that promise to revolutionize learning, research and work. Even as some AI tools’ developers are predicting the technology may soon replace a significant share of entry-level white collar jobs, many institutions—including the University of Florida, the California State University system and Ohio State University—are launching efforts to integrate AI tools into their curricula.

The rise of AI is also prompting questions from students and faculty about how and when to use it in a meaningful, ethical manner. But the guidance is still far from clear at most institutions; a recent Inside Higher Ed survey found that the vast majority of campuses have yet to adopt comprehensive AI strategies.

That’s where university libraries can come in, says Karim Boughida, dean of libraries at Stony Brook University in New York, which is in the process of developing its own campuswide AI guidance.

“The library is taking the lead to be the interdisciplinary hub,” he said, adding that centralizing such services at the library can help avoid the silos that department-specific AI guidelines often create. “Libraries are for everyone; anybody should feel like they can ask us for help and consultation on AI.”

To make that happen, Stony Brook’s library—which helped launch the university’s AI Innovation Institute earlier this year—is building a team of information professionals focused on helping students and faculty navigate the practical and ethical considerations of using AI.

One of those people is Nicholas Johnson, who started earlier this month as the library’s inaugural director of AI. He is one of the first people in the country to hold such a position at an academic library; he landed the job after losing his data-analysis role at the United States Agency for International Development amid the Trump administration’s widespread cuts to government agencies. Inside Higher Ed interviewed Johnson about what drew him to the role and his vision for educating the campus about AI.

(This interview has been edited for length and clarity.)

Q: Although you’re new to working in libraries, you have a lot of experience working with data. Tell us more about your background and how you approach data analysis.

A: I come from a background of focusing on human-centered design in terms of using technology.

I did a lot of programming and computer interface work during my undergraduate and graduate training. My Ph.D. is in urban science. I focused on the intersection of computer science, data and cities, developing sensors to monitor air quality and collecting data using new technologies. A lot of what I was doing was generating data, then analyzing that data from a human and social perspective to understand different things going on in our world. For example, I spent a lot of time understanding waste streams in my graduate research, which involved looking at data on how waste moves through cities and using technology to track it.

Nicholas Johnson | Stony Brook University

Once I left academia, I started working in city government, specifically fire departments. I was director of operations research there for a couple of years, where we were using machine learning and AI to improve our fire and EMS operations in the city. I left the fire department and then went to USAID, where I worked in the bureau that handled global food security until I was part of the reduction in force there earlier this year.

Q: What drew you to apply for this newly created position, where you’re working at the intersection of artificial intelligence and academic libraries?

A: Like many of my colleagues who were also laid off from USAID, I was trying to think about how I could transfer my skill sets into different markets, which was a huge challenge. I had the data background, and certainly, there were a lot of opportunities in the private sector that I applied for. But after going through the interviewing process for many of those jobs, I realized it’s not what I’m passionate about. I don’t really care to go to some company in Silicon Valley to do analytics on the number of likes on a Facebook post—that’s not interesting.

Around the same time, I found the director of AI at the Stony Brook library job advertisement, and it was immediately very exciting to think about addressing this new challenge to libraries. It’s a little bit different than what I’ve done before, but it was also very clear to me that there is a need for people who have a critical view of AI who also understand the ethical, responsible uses of AI. To me, this position isn’t about riding the AI hype wave, it’s about figuring out how we can use AI in a thoughtful way.

Q: How are libraries well positioned to approach AI thoughtfully and deepen student and faculty understanding of its uses?

A: The library is the hub of the university, so we’re trying to think creatively and innovatively about how we can leverage AI tools—which are evolving very quickly—to serve our students, faculty and staff across the university. The library has always been a knowledge center, and adopting AI is a natural progression for libraries in terms of providing digital services and helping students access and understand the literature, research and knowledge resources the library has.

Students are looking for guidance on how to use some of these tools in real-world experiences, and they have an array of questions and concerns. Our approach is to think about how we use those tools in an ethical and responsible way, and that’s the value that we’re trying to instill through the library services. Generative AI is becoming a bigger part of our lives and increasingly part of requirements in the workforce. We’re trying to provide students an advantage as they start to focus on their own careers. The more students are aware of what the tools are, of how to use them and really understand, again, the ethical and responsible uses of these tools, the more empowered they’re going to be to take advantage of them.

It’s also our role and responsibility to engage faculty in these discussions. Some faculty are very apprehensive about allowing their students to use AI and some are very much encouraging it.

So, the library can engage separately with students and then separately with faculty, learn from both and then be mediators and really drive the conversation on AI.

Q: What’s your vision for advancing those goals as director of AI at the library?

A: My role specifically is to try and coordinate the library’s work as an AI hub.

For one, there’s this sense that AI is huge. Most people know it through a chatbot, but it’s much more than that. Certainly, some students in computer science or other departments are going to be very engaged in this, but we want to provide that engagement to all students across different disciplines. One of the things that I want to do is create a community of practice, where students can get hands-on experience with AI, have more in-depth discussions and exploration of AI, and get support in using it to develop their own research projects.

The library has already been brainstorming ideas and has implemented some creative uses of AI. For example, some of the staff have already developed a research assistant, where students can come and ask questions to access our catalogs. We’re also working on a book finder, which uses AI to identify where in the library you can find a specific book. And we’re in the process of creating a dedicated space on the first floor of the library where students across campus can come to develop their ideas, access high-performance machines to actually build and run models, and learn more about AI through the services available there.

Q: Generative AI has also spurred concerns about misinformation and data privacy. How do you plan to enhance AI literacy in your new role?

A: Part of the library’s role in AI literacy is to ensure that we’re being clear and transparent about what is happening, how it’s happening and the ways that we can evaluate it. There’s a strain of research focused on how we actually evaluate bias in AI algorithms and data; those are all things that we are trying to ensure students are aware of so they don’t blindly accept AI-generated output.

I’ve got the same idea when it comes to privacy. I don’t know to what extent students understand the privacy concerns about just uploading some documents to ChatGPT to get an analysis from it. That’s also a part of our goal—thinking through the implications beyond just uploading and getting the results.
