It came as no surprise to many of those attending the annual meeting of the Association of College and Research Libraries this weekend that the typical liberal arts freshman believes Time and Newsweek to be legitimate scholarly sources. Groans and laughter greeted this and other unsurprising findings -- 100 percent of incoming liberal arts freshmen surveyed use online sources, and most think it’s easy to know when to document a source, yet nearly half couldn’t determine when one was required -- that are familiar to anyone who works at a college library.
But while the problems of “information literacy” and the limited ability of otherwise tech-savvy students to differentiate between legitimate and unacceptable sources are well known, there is as yet no unified, coherent approach to combating them.
Spurred by increasing pressure from accreditation groups and growing awareness of the problem, a number of efforts are emerging to assess students’ literacy in research practices and track improvements over their college careers, among them the Educational Testing Service’s ICT Literacy Assessment and Kent State University’s Project SAILS. A well-attended panel session at the ACRL meeting on Friday focused on another such approach.
The First Year Information Literacy in the Liberal Arts Assessment project (FYILLAA) focuses specifically on the liberal arts context, developed initially by a collaboration of eight Midwestern institutions: Carleton College, DePauw University, Grinnell College, Lake Forest College, Macalester College, Ohio Wesleyan, St. Olaf College and the University of Chicago.
With “surveys falling like mushrooms after a heavy rain,” as Carolyn Sanford, head of reference and instruction at Carleton College, put it, this particular assessment -- created under the auspices of the National Institute for Technology and Liberal Education and available to its 119 participating institutions -- is concerned with the specific culture of liberal arts colleges and “the type of students who decide to go to liberal arts colleges.” That focus contrasts with, for example, the ETS exam’s broader reach.
The liberal arts group’s approach is unique in a number of ways, said Jo Beld, a professor of political science at St. Olaf College and its director of academic research and planning:
- It doesn’t just rely on self-reports; FYILLAA actually tests student competence through a number of situations with multiple-choice answers.
- It is “multidimensional,” testing for both knowledge and students’ experience.
- It is “directly useful for instruction.” Beld, who was on the national technology institute’s steering committee for the project, contrasted the group’s assessment with Project SAILS, which she did not find directly applicable to instruction efforts aimed at increasing information literacy.
- The test was developed with input not only from librarians but also from faculty and staff.
The Midwestern institutions’ project also offers both post-testing and the ability to track individual students, Sanford said.
Nancy Millichap, NITLE’s director of professional development programs, said it might be possible to open the assessment beyond the group’s membership, but noted the costs involved. Beld in particular stressed that “liberal arts experiences” are not uncommon at other types of institutions -- say, a large state university or the Ivies.
Two themes emerged from this and another panel at the conference dealing with information literacy and how to improve it: the various groups of people developing assessments and working on ways to improve learning are not always paying attention to each other’s efforts; and among the university library community, there seems to be a consensus that tying research instruction into the curriculum would go a long way toward bridging the literacy gap.
“It is great that multiple groups are working on [information and communication technology] literacy,” said Barbara O’Connor in an e-mail. She is director of the Institute for the Study of Politics and Media at California State University, Sacramento, and although she was not on the panel, remains a prominent advocate for information literacy. “The only caveat [to the multiple approaches] is that we agree on standards and what thresholds should be.” The ACRL approved widely acknowledged standards in 2000.
Sanford, for instance, said that when the NITLE assessment was first being developed in 2005, the ETS test wasn’t “out there,” and certainly both seem to have been conceived and created at roughly the same time. “We haven’t looked at it seriously,” Sanford admitted. “They’re a company, and they have all sorts of resources.” A member of a later panel echoed the sentiment. “We haven’t been paying much attention” to NITLE’s effort, said Terrence Bennett, a business and economics librarian at the College of New Jersey Library.
The reason for the disconnect might be the obvious one: the problem is so well known -- and it’s not limited to the liberal arts -- that many librarians are focusing on solutions rather than on measuring the skills gaps they’re up against. “It validates what we’ve been struggling with for so long,” said Bennett, echoing the sentiments of many at the conference.
Or, perhaps, various groups are waiting for a clear frontrunner to emerge from among the competing information literacy assessments. “At some point, we will get to standardized testing and some of these will fall out in the same way we did with the SAT and GRE,” O’Connor said.
Meanwhile, suggestions to bridge the gap between the library and the classroom include holding class sessions in the library and building research requirements into grant applications, Beld said. Since librarians can’t themselves force students to use specific research methods for assignments, some hope that working more closely with professors and teaching on-site will allow them to approach the problem on their own turf. “We’re not the people who are assigning grades that are the results of using these resources,” Bennett noted in a second panel session.
But where significant integration of classroom and library instruction in research methods is lacking, librarians have found another tool to combat information illiteracy: online tutorials, which like FYILLAA can be used to track individual improvement.
Getting through to students using tutorials will always be an uphill battle, though. Referring to the conference’s keynote address, by conspicuously out-of-place trash-film auteur John Waters, Bennett jokingly suggested a sensory enhancement or two. Taking a cue from the filmmaker’s innovative use of scent to enrich the moviegoing experience, he offered a new take on “Odorama” -- marshaling its potent forces to enhance learning and memory.
“John Waters was all over information literacy … when he was putting together Polyester,” Bennett said.