WASHINGTON — In a report released in February, 17 librarians, scientists, and technologists spent 116 pages detailing the challenges of preserving culturally valuable digital artifacts. But at a symposium held Thursday to discuss the findings, it was perhaps Derek Law, chair of JISC Advance, a British organization that advocates for technology in higher education, who articulated the problem most succinctly:
“If I put a book in [University of Oxford's] Bodleian Library, I know exactly what they’re going to do with it: they’re going to keep it forever, or at least as long as they can,” Law said. “If I give a file to the Oxford University computing center, I know what they’re going to do: They’re going to lose it within three months.”
For Law — who was one of the many preservationists to speak at the symposium, but probably the most colorful — this lack of faith in even higher education’s most august institutions to effectively preserve digital materials bespeaks an endemic crisis. And there is plenty of blame to go around.
“Scholars, at the moment, are a single point of failure in the system,” Law said.
“I’d like to see us be much more aggressive dealing with the scholarly community,” he said, suggesting that universities have been too timid in creating incentives for scholars to contribute to the preservation effort. Institutions, he said, should require academics to help preserve the scholarship they produce as a condition of their employment.
Most libraries, meanwhile, have a “digital overlap strategy” when it comes to preserving digital artifacts, said Law, holding up two pairs of crossed fingers. The gesture elicited chuckles from the audience. “They’ve done two things,” he continued. “Set up committees and licensing agreements with publishers — because we’re good at committees — and then started digitizing the bits of paper we have already. What they have failed to do is to engage wholeheartedly with born-digital material, because it’s really difficult.”
Guiding Principles, But No Solutions
While Law was more blunt in his framing of the problems facing digital scholarship than the formal document produced by the Blue Ribbon Task Force on Sustainable Digital Preservation and Access, there was consensus on the main point: Too little has been done to systematically preserve the massive quantities of cultural artifacts being produced daily in digital formats.
The task force report analyzes the problem according to four types of information that ought to be preserved — scholarly work, research data, commercially owned cultural content (e.g., television shows, films, and online publications), and collectively produced Web content (e.g., blogs and social networking sites) — and makes recommendations for each. Detailed articulations of the challenges and possible solutions for each category of content are available in the full report, which can be viewed on the Web.
In general, the report urged institutions interested in preserving digital materials into the future to do a better job of articulating why it is worthwhile to commit to better preservation strategies; lobby for policies that encourage preservation; and make a greater effort to coordinate the preservation process such that various parties are investing as much into content preservation as they are getting out of it. The task force in particular criticized “free riders,” who use content but do not contribute to its upkeep.
The report did not, however, provide answers to the three questions that would form the basis for an actual system: Who should be in charge of preservation; what criteria those institutions should use to determine which pieces of the impossibly vast universe of data deserve saving; and who should pay for it. “[These questions] are unanswered because we do not fully grasp the opportunities, constraints, and realities of sustainability from an economic point of view,” the report’s authors write.
Thursday’s symposium concluded with a panel of economists discussing some options that might help eliminate the “free rider” effect and distribute preservation responsibilities among more people and institutions.
William G. Bowen, an economist and president emeritus of the Andrew W. Mellon Foundation and Princeton University, suggested that the cost of preservation be incorporated into the price of access to archives. Bowen pointed to the example of JSTOR — the scholarly publication archive run by Ithaka, a nonprofit organization he helped found — which charges members a fee to feed a fund for preservation expenses.
Libraries could also try to pressure publishers to contribute to preservation funds, Bowen said — though this might be difficult, he added, since “academics, as we all know so well, will kill a librarian if the librarian refuses to subscribe to a favorite journal just because there’s not a preservation component in place.”
Hal R. Varian, chief economist at Google and erstwhile professor at the University of California at Berkeley, said that commercially owned content — such as television shows, radio broadcasts, and Web sites — could possibly be preserved by industry coalitions. It would not take much money, Varian said; the present issue with backing up culturally important broadcast content is not so much cost as a lack of interest.
Alternatively, the government could pass legislation defining the rights and responsibilities of an archive keeper vis-à-vis copyright, “then just let free entry take over,” Varian said. Under that scenario, for-profit archives could multiply like video rental stores of old, he said.
As for collectively produced content such as user-generated blogs, Varian said, specialized libraries might step forward as stewards of relevant blog content, following the example of the Law Library of Congress — which, he said, has begun backing up various legal blogs. “If you look at science libraries, medical libraries, business, art — there are many cases where libraries could take on the responsibility of backing up Web sites in their field of interest,” he said.
One tack that institutions should not take is that of trying to delegate responsibility for preserving digital artifacts to government, Bowen said. “Since in theory there is a public-good aspect to all of this, in principle one can also look to public sources for help,” he said. “But we should not, not, not assume that that is at all easy,” he added, citing the instability of public funding.
But if any of those in attendance had come into the symposium with illusions of the government filling the role of deus ex machina, they had likely been disabused earlier in the day — when Thomas Kalil, deputy director of the White House’s Office of Science and Technology Policy, told them that the federal government has too much on its plate to pay attention to anything but the briefest, narrowest requests from digital preservation advocates.
“In the paper, there is this notion of some benign, 30,000-foot view that’s looking down at these different communities and figuring out how to architect the right incentives,” Kalil said. “That’s sort of the implicit view about how to make progress on these issues.
“But it’s unlikely that there’s going to be a ‘preservation czar’ in the White House, running around,” he said. “So I think the majority of the progress that’s going to get done is going to get done by people in this room.”