Some months back, one of the cable networks debuted a movie -- evidently the pilot for a potential show -- that inspired brief excitement in some quarters, though it seems not to have caught on. Its central character was someone whose grasp of esoteric knowledge allowed him or her (I'm not sure which, never having seen it) to command the awesome mysterious forces of the universe. Its title was The Librarian.
The program was, it seems, a reworking of a similar figure in Buffy the Vampire Slayer. That's in keeping with the fundamental law of the entertainment industry once defined by Ernie Kovacs, the great American surrealist TV pioneer: "Find something that works, then beat it to death."
At another level, though, the whole concept derived from a tradition that is pre-television, indeed, almost pre-literate. The idea that a command of books provides access to secret forces, the equation of the scholar with the magus, was already well established before Faust and Prospero worked their spells. The linkage has also left its trace at the level of the signifier. Both glamor, originally meaning a kind of witchy sex appeal, and grimoire, the sorcerer's reference book, derive from the word grammar -- one of the foundational disciplines of medieval learning, hence a source of power.
Today, it's much rarer to find the whole knowledge/power nexus treated in such explicitly occult terms, at least outside pop culture. As for librarians, they are usually regarded as professionals working in the service sector of the information economy, rather than as full-fledged participants in contemporary intellectual life. That is, arguably, an injustice. But the division of labor and the logic of hierarchical distinctions have changed a lot since the days when Gottfried Leibniz (philosopher, statesman, co-inventor of calculus, designer of an early mechanical calculator, and overall polymathic genius) held down his day job running a library.
The most persistent aspect of the old configuration is probably the link between glamor and grammar -- the lingering aura of bookish eroticism. At least that's what the phenomenon of librarian porn would suggest. The topic deserves more scholarly attention, though an important start has been made by Daniel W. Lester, the network information coordinator for Boise State University in Idaho. His bibliography of pertinent livres lus avec une seule main ("books read with one hand") is not exhaustive, but the annotations are judicious. About one such tale of lust in the stacks, he writes: "Most of the library and librarian descriptions are reasonable, except for the number of books on a book cart."
But the role librarians play at the present time brings them closer to the most pressing issues in American cultural life than any cheesy TV show (or letter to Penthouse, for that matter) could possibly convey.
Their work constitutes the real intersection of knowledge and power -- not as concepts to be analyzed, but at the level of almost nonstop practical negotiation. It is the cultural profession most involved, from day to day, with questions concerning public budgets, information technology, the cost of new publications, and intellectual freedom. (On the latter, check out the American Library Association's page on the Patriot Act.)
Given all that, I've been curious to find out about discussions by academic librarians regarding current developments in their profession, in the university, and in the world outside. A collection of essays called The Successful Academic Librarian is due out this fall from Information Today, Inc. Its emphasis seems to fall on guidance in facing career demands. But how can an outsider keep up with what academic librarians are thinking about other issues?
Well, the first place to start is The Kept-Up Academic Librarian, the blog of Steven Bell, who is director of the Gutman Library at Philadelphia University. Bell provides a running digest of academic news, but for the most part avoids the kind of reflective and/or splenetic mini-essays one associates with blogdom.
My own effort to track down something more ruminative turned up a few interesting blogs lus avec une seule main run by librarians, such as this one -- which, while stimulating, was not quite on topic. So in due course I contacted Steven Bell, on the assumption that he was as kept-up as an academic librarian could be. Could he please name a few interesting blogs by academic librarians?
His answer came as a surprise: "When you ask specifically about blogs maintained by academic librarians," Bell wrote earlier this week, "the list would be short or non-existent."
He qualified the comment by noting the numerous gray areas. "There may be some academic librarians out there with an interesting blog, but in some cases I think the blogger is doing it anonymously and you don't really even know if the person is an academic librarian. For example, take a look at Blog Without a Library. I can't tell who this blogger is though I think he or she might be an academic librarian. On the other hand Jill Stover's Library Marketing blog is fairly new and pretty good, and she is an academic librarian -- but the blog really isn't specific to academic libraries.... Bill Drew of one of the SUNY libraries has something he calls BabyBoomer Librarian but it isn't necessarily about academic librarianship -- sometimes yes, but more often not."
Bell listed a few other blogs, including Humanities Librarian from the College of New Jersey. But very few of his suggestions were quite what I had in mind -- that is, public spaces devoted to thinking out loud about topics such as the much-vaunted "crisis in academic publishing." It was a puzzling silence.
"I can't say any individual has developed a blog that has emerged as the 'voice of academic librarianship,' " noted Bell in response to my query. "Why? If I had to advance a theory I'd say that as academic librarians we are still geared towards traditional, journal publishing as the way to express ourselves. I know that if I have something on my mind that I'd like to write about to share my thoughts and opinions, I'm more likely to write something for formal publication (e.g., see this piece.) Perhaps that is why we don't have a 'juicy' academic librarian out there who is taking on the issues of the day with vocal opinions."
And he added something that makes a lot of sense: "To have a really great blog you have to be able to consistently speak to the issues of the day and have great (or even good) insights into them -- and it just doesn't seem like any academic librarian out there is capable of doing that. I think there are some folks in our profession who might be capable of doing it. But if so they haven't figured out yet that they ought to be blogging, or maybe they just don't have the time or interest."
Now, that diagnosis may contain the elements of a solution. The answer might be the creation of a group blog for academic librarians -- some prominent in their field, others less well-known, and perhaps even a couple of them anonymous. No one participant would be under pressure to generate fresh insights every day or two. By pooling resources, such a group could strike terror in the hearts of budget-cutting administrators, price-gouging journal publishers, and even the occasional professor prone to associating academic stardom with aristocratic privilege.
Full disclosure: I am married to a librarian, albeit a non-academic one, who knew about the World Wide Web (and the proper grammar for using various search engines) long before most people did. She has proven to me, time and again, that librarians do indeed possess amazing powers. They also tend to have a lot to say about the bureaucracies that employ them -- and the patrons who patronize them.
An outspoken, incisive, and timely stream of commentary on the problems and possibilities facing academic libraries would enliven and enrich the public discourse. If anything, it's long overdue.
Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. One of his previous columns was on the pleasures of reading encyclopedias.
One sign of the great flexibility of American English -- if also of its high tolerance for ugliness -- is the casual way users will turn a noun into a verb. It happens all the time. And lately, it tends to be a matter of branding. You "xerox" an article and "tivo" a movie. Just for the record, neither Xerox nor TiVo is very happy about such unauthorized usage of its name. Such idioms are, in effect, a dilution of the trademark.
Which creates an odd little double bind for anyone with the culture-jamming instinct to Stick It To The Man. Should you absolutely refuse to give free advertising to either Xerox or TiVo by using their names as verbs, you have actually thereby fallen into line with corporate policy. Then again, if you defy their efforts to police ordinary language, that means repeating a company name as if it were something natural and inevitable. See, that's how they get ya.
On a less antiglobalizational note, I've been trying to come up with an alternative to using "meme" as a verb. For one thing, it is too close to "mime," with all the queasiness that word evokes.
As discussed here on Tuesday, meme started out as a noun implying a theory. It called to mind a more or less biological model of how cultural phenomena (ideas, fads, ideologies, etc.) spread and reproduce themselves over time. Recently the term has settled into common usage -- in a different, if related, sense. It now applies to certain kinds of questionnaires or discussion topics that circulate within (and sometimes between) blogospheric communities.
There does not seem to be an accepted word to name the creation and initial dissemination of a meme. So it could be that "meme" must also serve, for better or worse, as a transitive verb.
In any case, my options are limited.... Verbal elegance be damned: Let's meme.
The ground rules won't be complicated. The list of questions is short, but ought to yield some interesting responses. With luck, the brevity will speed up circulation.
In keeping with meme protocol, I'll "tap" a few bloggers to respond. Presumably they will do likewise. However, the invitation is not restricted to that handful of people: This meme is open to anyone who wants to participate.
So here are the questions:
(1) Imagine it's 2015. You are visiting the library at a major research university. You go over to a computer terminal (or whatever it is they use in 2015) that gives you immediate access to any book or journal article on any topic you want. What do you look up? In other words, what do you hope somebody will have written in the meantime?
(2) What is the strangest thing you've ever heard or seen at a conference? No names, please. Refer to "Professor X" or "Ms. Y" if you must. Double credit if you were directly affected. Triple if you then said or did something equally weird.
(3) Name a writer, scholar, or otherwise worthy person you admire so much that meeting him or her would probably reduce you to awestruck silence.
(4) What are two or three blogs or other Web sites you often read that don't seem to be on many people's radar? Feel free to discard anything you don't care to answer.
To get things started, I'm going to tap a few individuals -- people I've had only fairly brief contact with in the past. As indicated, however, anyone else who wants to respond is welcome to do so. The initial list:
An afterthought on the first question -- the one about getting a chance to look things up in a library of the future: Keep in mind the cautionary example of Enoch Soames, the minor late-Victorian poet whose story Max Beerbohm tells. He sold his soul to the devil for a chance to spend an afternoon in the reading room of the British Museum, 100 years in the future, reading what historians and critics would eventually say about his work.
Soames ends up in hell a little early: The card catalog shows that posterity has ignored him even more thoroughly than his contemporaries did.
Proof, anyway, that ego surfing is really bad for you, even in the future. A word to the wise.
Submitted by Eric Jager on August 11, 2005 - 4:00am
Unlike my first two books, which were produced by university presses, The Last Duel -- an account of a notorious criminal case that riveted France in 1386, drawing on new documents I found in the French archives -- was published commercially and aimed at a popular audience. My last article focused on how to deal with commercial presses; what follows is focused on writing for them.
To appeal to a popular audience, my book had to avoid the forbidding jargon and arcane theory that now plague the humanities. It also had to offer a lively narrative that would capture and keep readers’ interest. Losing the jargon and the theory was easy -- and very liberating. But learning how to craft an appealing narrative for a general audience was a much bigger challenge. For 15 years, since beginning a full-time university teaching career, I had written mainly for other academics. How could I write a book that appealed to thousands, not just dozens, of readers?
Just as studying the trade-book business helped me to sell my book to literary agents and editors (as described in part one of this piece), studying other crossover books by successful scholarly authors gave me a kind of “on-the-job training” for the task. I also had the advice of some excellent readers, and a superb editor. Here is what I learned, including good advice and useful examples I tried to follow in writing my book, as well as some pointers and illustrations that have come my way since finishing it.
1. Getting the Reader’s Attention: A few months ago, a bookseller was telling me how people browse for books in his store, something he watches carefully. “First they pick it up and look at the cover,” he said. “If they don’t put it down, they turn it over and read the quotes on the back. Then, if they’re still interested, they open it and begin reading. If they read it, there’s a chance they’ll buy it.” Of course, people don’t always begin reading a book at the first sentence. But if they do, you can grab their attention with a good opening.
"I always wondered how he did it." That’s Howard Bloch’s brief, curiosity-arousing start to God’s Plagiarist, his entertaining and revealing biography of the remarkable Abbé Migne, the 19th-century French priest who edited, printed, and sold more books by the yard than anyone else before him in human history.
Sometimes a longer, more descriptive opening works better, as when Natalie Zemon Davis maps out a whole journey at the start of The Return of Martin Guerre: “In 1527 the peasant Sanxi Daguerre, his wife, his young son Martin, and his brother Pierre left the family property in the French Basque country and moved to a village in the county of Foix, a three-week walk away.” By the end of that sentence, we’re already traveling with the family along the country roads of 16th-century France.
A dramatic conflict can open a book nicely, too. James Shapiro begins his new book, 1599: A Year in the Life of William Shakespeare, with the colorful story of how, on a freezing winter day, armed members of Shakespeare’s playing company nearly came to blows with supporters of the landlord of a disputed theatre, before they prevailed and hauled away the theatre’s dismantled frame. Before you know it, you’re in London 400 years ago, feeling the icy winter air and the heat of an off-stage dispute.
2. The Golden Triangle: Many readers want books about real, interesting people who lived dramatic lives in colorful times or places. Successful nonfiction books -- like most novels -- tend to work within a “golden triangle” of plot, character, and setting. Even jokes and anecdotes -- our shortest forms of narrative -- rely on these three basic elements. News articles do as well, as codified in the famous Who-what-when-where-how-and-why, which leads with character and plot. The authors of longer narratives, if they want readers, must do the same.
Plot, character, and setting -- essential to all good narratives, both fiction and nonfiction -- have been analyzed in critical classics from Aristotle’s Poetics to E. M. Forster’s Aspects of the Novel. But as the humanities have abandoned the warm campfires of story for the frigid heights of theory, historians and other social scientists once given to dry fact-gathering have rediscovered the joys of narrative, as successful popular books by Simon Schama and others attest. Even in the sciences, narrative writing has enjoyed a renaissance, as popularizing experts such as the late Stephen Jay Gould have satisfied a hunger for good stories and readable essays once filled largely by the humanities.
Gould’s best-selling book Wonderful Life recounts the early-20th-century discovery of the Burgess Shale fossils in British Columbia and their profound implications for evolutionary science. Although the book is about biology, and some passages may slow down non-specialists, Gould, like many skilled writers, appeals to popular readers by telling a story. He weaves the discovery -- and belated rediscovery -- of the fossils (plot) into the personal histories of the scientists (character), situating both in the relevant geological strata or cultural milieu (setting). As signaled by the film allusion in his title, Gould even comments, by way of another story (George Bailey’s, as played by Jimmy Stewart), on the nature of narrative and its role in our scientific understanding of ourselves. Touché!
Paul Fussell’s widely acclaimed book, The Great War and Modern Memory, begins with a sentence that almost embodies the golden triangle: “By mid-December, 1914, British troops had been fighting on the Continent for over five months.” The British troops, soon to be fleshed out as individual poets and combatants, will be the book’s main characters. The “fighting” points to plot, which includes the myriad other activities -- eating, sleeping, smoking, talking, reading, and writing letters -- pursued by the troops between brutal shellings and risky raids. And “the Continent” is obviously the main setting, soon to be mapped out in Fussell’s meticulous and highly readable narrative as particular fronts, trenches, and wire-infested no-man's-lands. From its first sentence, the book tells a story, drawing readers into its world.
3. Story Structure: As Aristotle famously said, every story has a beginning, a middle, and an end. Or as the novelist Peter De Vries once quipped, "a beginning, a muddle, and an end." The main story in turn consists of smaller narrative units -- each with its own beginning, middle, and end -- that offer smaller payoffs to readers along the way as they move toward the big payoff at the book’s end.
The most obvious narrative unit in a book is the chapter, but chapters, too, generally consist of smaller units, often signaled by headings, or enlarged capitals, or simply white space. These smaller units tend to have their own narrative integrity. Yet all the pieces must add up to a single, compelling story.
In Trying Neaira, her unbuttoned biography of a courtesan in fourth-century B.C. Greece, Debra Hamel invites readers right into the story with her frank, immediate narrative style. Her first sentence even daringly risks a foreign term in italics -- a Greek verb with a sexual meaning -- in a way that’s sure to excite interest in the story rather than putting readers off. Hamel also divides her book (classically) into three parts that embody beginning, middle, and end. And she marks the reader’s path by dividing each chapter into bite-sized pieces of narrative topped by short descriptive headings such as “Buying Neaira” and “Playing the Sycophant.”
In my own book, The Last Duel, each chapter comprises a narrative of its own even as it advances the book’s main story. In chapter three, for example, a Norman noble leaves home to join a foreign campaign, risks his life in battle, and returns home in bad health and seriously in debt but having earned a knighthood. The chapter, although fact-based, assumes a familiar narrative shape through its pattern of journey–battle–journey (symmetry), and the character’s changed situation as a result of the campaign (development and contrast). The reader is meant to finish the chapter with a sense of completing one small story -- with its own beginning, middle, and end -- even as its results (illness, debt, a knighthood) set up the next chapter of the main story.
4. Surprise and Suspense: Crossover books can also exploit other traditional storytelling techniques, such as surprise, suspense, and foreshadowing. In The Return of Martin Guerre, Natalie Zemon Davis orchestrates a dramatic conclusion to one chapter as follows: "The Criminal Chamber was about to make its final judgment of the case, opinions being ‘more disposed to the advantage of the prisoner and against the said Pierre Guerre and de Rols,’ when a man with a wooden leg appeared at the buildings of the Parlement of Toulouse. He said his name was Martin Guerre."
Here are suspense (the wait for the verdict) and surprise (the new plot twist). The book’s title, of course, foreshadows the revelation all along, and the question is less one of what than of when. But even if we suspect what will happen, or we’ve seen the film before reading the book, the passage makes a powerful impression. The first long sentence packs its punch into its final clause, as the phrase "a man with a wooden leg" confronts us with a physical fact before the precise meaning of that fact is revealed. Thus we experience the man’s arrival, with its aura of mystery, as the people in the story did. The terse final sentence, an indirect quote, divulges the deferred meaning, as Guerre’s own voice, in effect, announces the identity of the one-legged man. The passage is founded on fact, but its carefully arranged details and rhetorical devices form the facts into a compelling piece of story.
A graduate school mentor once said to me about teaching: "You have to know everything about your subject, and then remember what it’s like not to know anything." There’s a useful writing tip here, too. For readers to understand and enjoy your book, you have to imagine how they experience the journey, the discoveries along the way, and the sense of an ending that is promised but, for a time, deferred.
5. Being There: The noted biographer Robert Caro, discussing how he tried to capture the atmosphere and events of the American Civil Rights era, once said: "Make the reader see, make the reader feel, what was happening. If it was thrilling, make it seem so." Not just another version of the old adage "show, don’t tell," this useful advice stresses the importance of putting the reader imaginatively at the scene. (I kept Caro’s quote taped up over my desk while writing my book.)
Caro abundantly illustrates his own advice in his widely read books. But to flesh out this fifth and final point, I’ll illustrate what Caro says with an example from an author who better embodies the crossover phenomenon, Simon Schama.
Schama’s many popular historical books have found a large readership, driven in part by his role as host of the TV miniseries “A History of Britain.” In the companion volume to part one of that series, Schama puts his readers right at the battlefield in 1066, as the Normans and the Saxons prepare to fight:
“If you were a Norman foot soldier you would be praying that the gentlemen on horses know what they’re doing. All around you is the scraping of metal: the sharpening of swords; the mounting of horses. You peer up to the brow of the hill and see a thin, glittering line. You cross yourself and toy with the linked rings of your coat of mail. Can they dull the blow of an axe? You’ve never faced axes in battle before....”
It’s not just the collar-grabbing repetition of “you” that makes this passage so compelling. It’s also the vivid physical details -- sights, sounds, even tactile sensations -- that tell us what it was like to be there, how it looked and felt. While the style may annoy some academic historians with its “personal intrusions” or “lack of objectivity,” its vivid imagery and human emotion are precisely the things that thrill general readers and bring history alive for millions. As storytelling, it’s masterful, and it epitomizes the art of crossing over from the library or the archive into the reader’s imagination.
Eric Jager is a professor of English at the University of California at Los Angeles, where he teaches medieval literature. He is the author of three books, most recently The Last Duel: A True Story of Crime, Scandal, and Trial by Combat in Medieval France (Broadway, 2004). The paperback will appear in September, and a BBC documentary based on the book will air during the next year.
Next week, Intellectual Affairs will undergo a big change. (For those who have been wondering why the column disappeared for the past couple of weeks, that's the reason: remodeling.)
From now on, the slot for Thursday will be dedicated to coverage of new books -- mostly, that is, of what is often called "serious nonfiction," with a definite bias in favor of scholarly titles. But the column won't entirely neglect fiction. Nor am I absolutely committed to seriousness, for its own sake, on a 24-7 basis. (Some academic books are preposterous, and they deserve attention as well.)
Books from university presses will have a definite advantage for coverage in this slot that they might not enjoy elsewhere. The same is true of books from small presses. Having a muscular publicity apparatus behind a given title won't do it any good; the only real criterion, to be honest, is the appetite and mood of my intellectual tapeworm.
You might very well find an interview with somebody whose books never get reviewed anyplace besides The Outer Mongolian Review of Phenomenological Ontology. At the same time, I plan to cover a couple of forthcoming books that will probably be best-sellers.
The distinction between "popular" and "specialized" titles is very important to booksellers; also, to snobs. Otherwise, however, it does not seem all that meaningful, let alone worthy of respect.
In its Tuesday slot, Intellectual Affairs will continue along the lines it has followed for several months now, offering the usual smorgasbord: thumbnail accounts of scholarship, glosses on current events, interviews with academics and writers, personal essays, reading notes, and the occasional targeted spitball.
The decision to take on regular book coverage is, in part, a matter of putting my backbone where my mouth is. It's easy enough to complain about the erosion of book coverage by mainstream media. Doing something about it is another matter.
Ever fewer newspapers give any space at all to books of any kind. And most that do, it seems, have cut back in recent years. Even then, they tend to run material "off the wire" -- that is, from news services. Which means (in turn) that titles and topics reflect some vague but rigid notion of what "the public" will find of interest.
As for the general-circulation newsmagazines, they are, if anything, even worse about it. Last year, I complained about this bitterly at some length in a speech at the awards ceremony for the National Book Critics Circle.
There was a murmur of assent from the crowd. And for one brief, adrenaline-charged moment, it seemed possible to imagine shaming certain very powerful media gatekeepers into a sense of responsibility.
Perhaps the days might return when Time and Newsweek felt some obligation to report on the same books covered in The New York Review of Books. They did, you know, once upon a time.
Well, no such luck. If the editors of Time and Newsweek do have a model for their cultural coverage, it seems to be People magazine.
Even when the mainstream media do attend to ideas or substantive books, there can be serious misfires, as discussed last time. After all, the ethos of a newsroom bears no resemblance to that of a seminar room.
After some years in journalism, I finally understood that the profession has its own metaphysics. Reporters exist in a universe that no scholar can quite imagine. In it, the world came into existence only during the previous week, and nobody understood a thing about it until earlier this morning.
There are good reasons for operating according to this "as if." It is a perfectly suitable framework for handling a zoning dispute or a payola scandal, for example. But it creates certain problems in covering ideas, books, or arguments with a complex backstory.
As a corrective, of course, one may zip past complexity and shoot for hipness by declaring that some concept or trend is "hot."
Lending an aura of sexiness to the otherwise abjectly nerdish strikes me, for various reasons, as a good thing. But there can be too much of a good thing. Whole sectors of academic life are already so dominated by the star system that, in attending conferences, you halfway expect to see a red carpet.
So this column will never, ever, under any circumstances, call any book, idea, or person "hot." (Paris Hilton can have that word.)
Is pessimism in order, then? Maybe it is. But there have been encouraging signs, from time to time.
The late and sorely missed Lingua Franca had a section called "Inside Publishing." It will serve as one model for the book coverage in the Thursday slot. (Between 1995 and 2000, I turned out many an Inside Publishing piece, so the inspiration here is not purely contemplative.)
And the Ideas section of The Boston Globe is an oasis in what often seems like a newspaper desert.
And while Intellectual Affairs is not a blog, it takes some inspiration from the emergence of a new and growing public sphere. Or rather, of a perpetually self-renewing multitude of public spheres -- organized in so many layers that broad, categorical statements tend to be reminders of Blake's aphorism: "To generalize is to be an idiot."
Sobering words for a generalist writer to consider, to be sure. Still, at whatever risk of nonspecialist idiocy, the new books coverage will start next Thursday.
And on Tuesday, I'll be back with a report on a contemporary thinker so disgusted by his own celebrity that he's faked his own death....well, sort of.
Two years ago, The Virginia Quarterly Review published an essay called "Quarterlies and the Future of Reading." The author, George Core, has been the editor of The Sewanee Review since 1973 -- at which time, it was already a venerable institution, one of the oldest publications of its kind in the United States.
By "publications of its kind," I mean the general-interest cultural quarterlies, usually published by universities or liberal-arts colleges. Unlike scholarly journals, they aren't focused on a particular field. Usually they offer a mixture of contemporary poetry and fiction with essays that are learned but nonspecialist.
It pays to be explicit about that, because so few people keep track of the university quarterlies now. Many of them are still around, with a modest if reliable subscriber base in the libraries. But it often seems as if they continue just by inertia.
It is always a sentimental gesture to speak of a golden age. But what the hell: The years between, say, 1925 and 1965 were a glorious time for such journals. Then, as now, their circulations were usually modest. But the worlds of publishing and of the university were smaller, and the quarterlies had a disproportionately large role to play. Truman Capote once mentioned in an interview how he knew he had "arrived" as a young writer in the 1940s: On the same day, he received two or three letters from the editors of quarterlies accepting his short stories for publication in their pages.
In his essay, two years ago, Core insisted that "the literary quarterly ... has been the linchpin of civilization since the 18th century." And if things did not look encouraging at the dawn of the 21st century ... well, so much the worse for what that says about civilization. "The average librarian these days, like many members of various departments in the humanities," wrote Core, "has become hostile to books and hostile to reading." Needless to say, technology is to blame.
It is hard to know what to make of the fact that Core's essay is now available online. I mean, sure, you can read it that way, but would he really want that?
But cantankerousness in defense of the quarterly is no vice. Nor, for that matter, is it a virtue to overstate how much the university-based general-interest periodical has declined. The situation may not be good, but it is not quite catastrophic.
Core's Sewanee Review is hopelessly out of touch with many trends in contemporary literary studies -- which is one reason it is still worth reading. But The Minnesota Review is very much in touch with developments in cultural theory, while also publishing a good deal of poetry, fiction, and personal essays. I've been reading it with interest for 25 years now (during which time it has never actually been published or edited in Minnesota). The Common Review, published by the Great Books Foundation, occupies a niche somewhere between the old-fashioned university quarterly and magazines such as Harper's and The Atlantic Monthly.
The list could go on -- and if it did, would have to include such non-academic, sui generis publications as N+1, which I've been urging upon the attention of startled bystanders ever since seeing the prototype pamphlet that appeared in advance of its debut, not quite two years ago. A third issue is now at newsstands. (See also the magazine's Web site.)
And if you want to see some interesting and successful experiments in updating the whole format, keep an eye out for Boston Review and The Virginia Quarterly Review.
The September/October issue of Boston Review marks its 30th anniversary. Coming out six times a year, and as a tabloid, BR might at first seem to bear little resemblance to the university quarterlies of any era. But those differences are superficial. The mixture of political, philosophical, and literary discussion calls to mind the early Partisan Review, the most agenda-setting of the quarterlies published at mid-century.
Boston Review's anniversary issue contains a 12-page anthology of poems and essays organized by year -- including work by (to give a partial list) John Kenneth Galbraith, Adrienne Rich, Rita Dove, Ralph Nader, George Scialabba, and Martha Nussbaum. (Somebody on that list is bound to interest or agitate you.) One of the selections for 1993 comes from an essay by Christopher Hitchens called "Never Trust Imperialists." It would have been interesting to hear the editorial discussion that resulted in that one being included.
As for The Virginia Quarterly Review, it has come under a new editor, Ted Genoways, who seems to have ignored entirely the worries expressed in George Core's ruminations on the state of the quarterly. In its new incarnation, VQR is colorful, topical, even a bit flashy -- with the latest issue offering a gallery of photographs from Vietnam as well as a pull-out comic book by Art Spiegelman, along with poetry, fiction, a play, and several essays.
Some of the once-vital quarterlies ended up becoming so polite and reserved that their audiences did not so much read them as search each issue for signs of a pulse. You certainly don't have that problem with Boston Review or VQR. Recent discussion of standards for "public scholarship" has emphasized the possibility of creating venues "at the interface of campus and community."
Arguably, that is what the quarterlies and reviews always were -- and their revitalization now can only be a good sign.
It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple of weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.
"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)
Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."
False alarm! Sure, the appearance of the Führer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Führer's originality as a thinker or competence as a writer.
About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)
Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: The Rise of the Germanic Ideology.
But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in a prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: The immigration restrictions imposed in the United States in the 1920s.)
Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.
So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."
The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.
"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."
How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.
At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.
But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere points out the original effort by someone writing in English to analyze the language and inner logic of Mein Kampf -- the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)
Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.
But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.
This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."
Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much they overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."
"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."
For most scholarly journals, the transition away from the print format and to an exclusive reliance on the electronic version seems all but inevitable, driven by user preferences for electronic journals and concerns about collecting the same information in two formats. But this shift away from print, in the absence of strategic planning by a higher proportion of libraries and publishers, may endanger the viability of certain journals and even the journal literature more broadly -- while not even reducing costs in the ways that have long been assumed.
Although the opportunities before us are significant, a smooth transition away from print and to electronic versions of journals requires concerted action, most of it undertaken individually by libraries and publishers.
In reaching this conclusion, we rely largely on a series of studies, of both publishers and libraries, in which we examined some of the incentives for a transition and some of the opportunities and challenges that present themselves. Complete findings of our library study, on which we partnered with Don King and Ann Okerson, were published as The Nonsubscription Side of Periodicals. We also recently completed a study of the operations of 10 journal publishers, in conjunction with Mary Waltham, an independent publishing consultant.
Taken together, these studies suggest that an electronic-only environment would be more cost-effective than print-only for most journals, with cost savings for both libraries and publishers. But this systemwide perspective must also be balanced against a more textured examination of libraries and publishers.
On the publisher side, the transition to online journals has been facilitated by some of the largest publishers, commercial and nonprofit. These publishers have already invested in and embraced a dual-format mode of publishing; they have diversified their revenue streams with separately identifiable income from both print and now increasingly electronic formats. Although the decreasing number of print subscriptions may have a negative impact on revenues, these publishers’ pricing has evolved alongside the economics of online-only delivery to mitigate the effects of print cancellations on the bottom line.
The trend has been to adopt value-based pricing that recognizes the convenience of a single license serving an entire campus (rather than multiple subscriptions), with price varying by institutional size, intensity of research activity, and/or number of online users. By “flipping” their pricing to be driven primarily by the electronic version, with print effectively an add-on, these publishers have been able to manage the inevitable decline of their print business without sacrificing net earnings. They are today largely agnostic to format and, when faced with price complaints, are now positioned to recommend that libraries consider canceling their print subscriptions in favor of electronic-only access.
Other journal publishers, especially smaller nonprofit scholarly societies in the humanities and social sciences and some university presses, are only beginning to make this transition. Even when they publish electronic versions in addition to print, these publishers have generally been slower to reconceive their business models to accommodate a dual-format environment that might rapidly become electronic-only. Their business models depend on revenues received from print, in some cases with significant contributions from advertising, and are often unable to accommodate significant print cancellations in favor of electronic access.
Until recently, this has perhaps not been unreasonable, as demand for electronic journals has been slower to build in the humanities and some social science disciplines. But the business models of these publishers are now not sufficiently durable to sustain the journals business in the event that libraries move aggressively away from the print format.
Many American academic libraries have sought to provide journals in both print and electronic formats for the past 5 to 10 years. The advantages of the electronic format have been clear, so these were licensed as rapidly as possible, but it has taken time for some faculty members to grow comfortable with an exclusive dependence on the electronic format. In addition, librarians were concerned about the absence of an acceptable electronic-archiving solution, given that their cancellation of print editions would prevent higher education from depending on print as the archival format.
In the past year or two, the movement away from print by users in higher education has expanded and accelerated. No longer is widespread migration away from print restricted to early adopters like Drexel and Suffolk Universities; it has become the norm at a broad range of academic institutions, from liberal arts colleges to the largest research universities. Ongoing budget shortfalls in academe have probably been the underlying motivation. The strategic pricing models offered by some of the largest publishers, which offer a price reduction for the cancellation of print, have provided a financial incentive for libraries to contemplate completing the transition.
Faced with resource constraints, librarians have been required to make hard choices, electing not to purchase the print version but only to license electronic access to many journals -- a step more easily made in light of growing faculty acceptance of the electronic format. Consequently, especially in the sciences, but increasingly even in the humanities, library demand for print has begun to fall. As demand for print journals continues to decline and economies of scale of print collections are lost, there is likely to be a tipping point at which continued collecting of print no longer makes sense and libraries begin to rely only upon journals that are available electronically. As this tipping point approaches, at unknown speed, libraries and publishers need to evaluate how they can best manage it. We offer several specific recommendations.
First, for those publishers that have not yet developed a strategy for an electronic-only journals environment and the transition to it, the future is now. Today’s dual-format system can only be managed effectively with a rigorous accounting of the costs and revenues of print and electronic and how these break down by format. Because some costs incurred irrespective of format are difficult to allocate, this accounting is complicated. It is also, however, critical, allowing publishers to understand the performance of each format as currently priced and, as a result, to project how the transition to an electronic-only environment would affect them. Publishers that do not immediately undertake these analyses and, if necessary, adjust their business models accordingly, may suffer dramatically as the transition accelerates and libraries reach a tipping point.
Second, in this transition, libraries and higher education more broadly should consider how they can support the publishers that are faced with a difficult transition. A disconcerting number of nonprofit publishers, especially scholarly societies and university presses that have the greatest presence in the humanities and social sciences fields, have a particularly complicated transition to make. The university presses and scholarly societies have been traditionally strong allies of academic libraries. They may have priced their electronic journals generously (and unrealistically). Consequently, a business model revamped to accommodate the transition may often result in a significant price increase for the electronic format. In cases where price increases are not predatory but rather adjustments for earlier unrealistic prices, libraries should act with empathy. If libraries cancel journals based on large percentage price increases (even when, measured in dollars, the increases are trivial), they may unintentionally punish lower-price publishers struggling to make the transition as efficiently as possible.
Third, this same set of publishers is particularly vulnerable, because their strategic planning must take place in the absence of the working capital and the economies of scale on which larger publishers have relied. As a result, some humanities journals published by small societies are not yet even available electronically. The community has a need for collaborative solutions like Project Muse or HighWire (initiatives that provide the infrastructure to create and distribute electronic journals) for the scholarly societies that publish the smaller journals in the humanities and social sciences. But if such solutions are not developed or cannot succeed in relatively short order on a broader scale, the alternative may be the replacement of many of these journals with blogs, repositories, or other less formal distribution models.
Fourth, although libraries today face difficult questions about whether and when to proceed with electronic-only access to traditionally print journals, they should try to manage this transition strategically and, in doing so, deserve support from all members of the higher education community. It has been unusual thus far for libraries to undertake a strategic, all-encompassing format review process, since it is often far more politically palatable to cancel print versions as a tactical retreat in the face of budgetary pressures. But a chaotic retreat from print will almost certainly not allow libraries to realize the maximum potential cost savings, whereas a managed strategic format review can permit far more effective planning and cost savings.
Beyond a focus on local costs and benefits, there are a number of broader issues that many libraries will want to consider in such a strategic format review. The widespread migration from print to electronic seems likely to eliminate library ownership of new accessions, with licensing taking the place of purchase. In cases where ownership led to certain expectations or practices, these will have to be rethought in a licensing-only environment. From our perspective, the safeguarding of materials for future generations is among the most pressing practices deserving reconsideration. Questions about the necessity of developing or deploying electronic archiving solutions, and the adequacy of the existing solutions, deserve serious consideration by all libraries contemplating a migration away from print resources. In addition, the transition to electronic journals begins to raise questions about how to ensure the preservation of existing print collections. Many observers have concluded that a paper repository framework is the optimal solution, but although individual repositories have been created at the University of California, the Five Colleges, and elsewhere, the organizational work to develop a comprehensive framework for them has yet to begin.
The implications both of licensing on archiving and of the future of existing print collections are addressable as part of any library’s strategic planning for the transition to an electronic-only environment -- but all too often are being forgotten under the pressure of the budgetary axe.
These challenges appear to us to be some of the most urgent facing libraries and publishers in the nearly inevitable transition to an electronic-only journals environment. Both libraries and publishers should proceed under the assumption that the transition may take place fairly rapidly, as either side may reach a tipping point when it is no longer cost-effective to publish or purchase any print versions. It is not impossible for this transition to occur gracefully, but to do so will require the concerted efforts of individual libraries and individual publishers.
Eileen Gifford Fenton and Roger C. Schonfeld
Eileen Gifford Fenton is executive director of Portico, whose mission is to preserve scholarly literature published in electronic form and to ensure that these materials remain accessible. Portico was launched by JSTOR and is being incubated by Ithaka, with support from the Andrew W. Mellon Foundation. Roger C. Schonfeld is coordinator of research for Ithaka, a nonprofit organization formed to accelerate the productive uses of information technologies for the benefit of academia. He is the author of JSTOR: A History (Princeton University Press, 2003).
Graphs, Maps, Trees: Abstract Models for a Literary History is a weird and stimulating little book by Franco Moretti, a professor of English and comparative literature at Stanford University. It was published a few months ago by Verso. But observation suggests that its argument, or rather its notoriety, now has much wider circulation than the book itself. That isn’t, I think, a good thing, though it is certainly the way of the world.
In a few months, Princeton University Press will bring out the first volume of The Novel: History, Geography, and Culture -- a set of papers edited by Moretti, based on the research program that he sketches in Graphs, Maps, Trees. (The Princeton edition of The Novel is a much-abridged translation of a work running to five volumes in Italian.) Perhaps that will redefine how Moretti’s work is understood. But for now, its reputation is a hostage to somewhat lazy journalistic caricature -- one mouthed, sometimes, even by people in literature departments.
What happened, it seems, is this: About two years ago, a prominent American newspaper devoted an article to Moretti’s work, announcing that he had launched a new wave of academic fashion by ignoring the content of novels and, instead, just counting them. Once, critics had practiced “close reading.” Moretti proposed what he called “distant reading.” Instead of looking at masterpieces, he and his students were preparing gigantic tables of data about how many books were published in the 19th century.
Harold Bloom, when reached for comment, gave one of those deep sighs for which he is so famous. (Imagine Zero Mostel playing a very weary Goethe.) And all over the country, people began smacking their foreheads in exaggerated gestures of astonishment. “Those wacky academics!” you could almost hear them say. “Counting novels! Whoever heard of such a thing? What’ll those professors think of next -- weighing them?”
In the meantime, it seems, Moretti and his students have been working their way across 19th century British literature with an adding machine -- tabulating shelf after shelf of Victorian novels, most of them utterly forgotten even while the Queen herself was alive. There is something almost urban legend-like about the whole enterprise. It has the quality of a cautionary tale about the dangers of pursuing graduate study in literature: You start out with a love of Dickens, but end up turning into Mr. Gradgrind.
That, anyway, is how Moretti’s “distant reading” looks ... well, from a distance. But things take on a somewhat different character if you actually spend some time with Moretti’s work itself.
As it happens, he has been publishing in English for quite some while: His collection of essays called Signs Taken for Wonders: On the Sociology of Literary Forms (Verso, 1983) was, for a long time, the only book I’d ever read by a contemporary Italian cultural theorist not named Umberto Eco. (It has recently been reissued as volume seven in Verso’s new Radical Thinkers series.) The papers in that volume include analyses of Restoration tragedy, of Balzac’s fiction, and of Joyce’s Ulysses.
In short, then, don’t believe the hype -- the man is more than a bean-counter. There is even an anecdote circulating about how, during a lecture on “distant reading,” Moretti let slip a reference that he could only have known via close familiarity with an obscure 19th century novel. When questioned later -- so the story goes -- Moretti made some excuse for having accidentally read it. (Chances are this is an apocryphal story. It sounds like a reversal of David Lodge’s famous game of “intellectual strip-poker” called Humiliation.)
And yet it is quite literally true that Moretti and his followers are turning literary history into graphs and tables. So what’s really going on with Moretti’s work? Why are his students counting novels? Is there anything about “distant reading” that would be of interest to people who don’t, say, need to finish a dissertation on 19th century literature sometime soon? And the part, earlier, about how the next step would be to weigh the books -- that was a joke, right?
To address these and many other puzzling matters, I have prepared the following Brief Guide to Avoid Saying Anything Too Dumb About Franco Moretti.
He is doing literary history, not literary analysis. In other words, Moretti is not asking “What does [insert name of famous author or novel here] mean?” but rather, “How has literature changed over time? And are there patterns to how it has changed?” These are very different lines of inquiry, obviously. Moretti’s hunch is that it might be possible to think in a new way about what counts as “evidence” in cultural history.
Yes, in crunching numbers, he is messing with your head. The idea of using statistical methods to understand the long-term development of literary trends runs against some deeply entrenched patterns of thought. It violates the old idea that the natural sciences are engaged in the explanation of mathematically describable phenomena, while the humanities are devoted to the interpretation of meanings embedded in documents and cultural artifacts.
Many people in the humanities are now used to seeing diagrams and charts analyzing the structure of a given text. But there is something disconcerting about a work of literary history filled with quantitative tables and statistical graphs. In doing so, Moretti is not just being provocative. He’s trying to get you to “think outside the text,” so to speak.
Moretti is taking the long view.... A basic point of reference for his “distant reading” is the work of Fernand Braudel and the Annales school of historians who traced the very long-term development of social and economic trends. Instead of chronicling events and the doings of individuals (the ebb and flow of history), Braudel and company looked at tendencies taking shape over decades or centuries. With his tables and graphs showing the number (and variety) of novels offered to the reading public over the years, Moretti is trying to chart the longue durée of literary history, much as Braudel did the centuries-long development of the Mediterranean.
Some of the results are fascinating, even to the layperson’s eye. One of Moretti’s graphs shows the emergence of the market for novels in Britain, Japan, Italy, Spain, and Nigeria between about 1700 and 2000. In each case, the number of new novels produced per year grows -- not at the smooth, gradual pace one might expect, but with the wild upward surge of a lab rat’s increasing interest in a liquid cocaine drip.
“Five countries, three continents, over two centuries apart,” writes Moretti, “and it’s the same pattern ... in twenty years or so, the graph leaps from five [to] ten new titles per year, which means one new novel every month or so, to one new novel per week. And at that point, the horizon of novel-reading changes. As long as only a handful of new titles are published each year, I mean, novels remain unreliable products, that disappear for long stretches of time, and cannot really command the loyalty of the reading public; they are commodities, yes, but commodities still waiting for a fully developed market.”
But as that market emerges and consolidates itself -- with at least one new title per week becoming available -- the novel becomes “the great capitalist oxymoron of the regular novelty: the unexpected that is produced with such efficiency and punctuality that readers become unable to do without it.”
And then the niches emerge: the subgenres of fiction that appeal to a specific readership. On another table, Moretti shows the life-span of about four dozen varieties of fiction that scholars have identified as emerging in British fiction between 1740 and 1900. The first few genres appearing in the late 18th century (for example, the courtship novel, the picaresque, the “Oriental tale,” and the epistolary novel) tend to thrive for long periods. Then something happens: After about 1810, new genres tend to emerge, rise, and decline in waves that last about 25 years each.
“Instead of changing all the time and a little at a time,” as Moretti puts it, “the system stands still for decades, and is then ‘punctuated’ by brief bursts of invention: forms change once, rapidly, across the board, and then repeat themselves for two [to] three decades....”
Genres as distinct as the “romantic farrago,” the “silver-fork novel,” and the “conversion novel” all appear and fade at about the same time -- to be replaced by a different constellation of new forms. It can’t, argues Moretti, just be a matter of novelists all being inspired at the same time. (Or running out of steam all at once.) The changes reflect “a sudden, total change of their ecosystem.”
Moretti is a cultural Darwinist, or something like one. Anyway, he is offering an alternative to what we might call the “intelligent design” model of literary history, in which various masterpieces are the almost sacramental representatives of some Higher Power. (Call that Power what you will -- individual genius, “the literary imagination,” society, Western Civilization, etc.) Instead, the works and the genres that survive are, in effect, literary mutations that possess qualities that somehow permit them to adapt to changes in the social ecosystem.
Sherlock Holmes, for example, was not the only detective in Victorian popular literature, nor even the first. So why is it that we still read his adventures, and not those of his competitors? Moretti and his team looked at the work of Conan Doyle’s rivals. While clues and deductions were scattered around in their texts, the authors were often a bit off about how they were connected. (A detective might notice the clues, then end up solving the mystery through a psychic experience, for example.)
Clearly the idea of solving a crime by gathering clues and decoding their relationship was in the air. It was Conan Doyle’s breakthrough to create a character whose “amazing powers” were, effectively, just an extremely acute version of the rational powers shared by the reader. But the distinctiveness of that adaptation only comes into view by looking at hundreds of other texts in the literary ecosystem.
This is the tip of the tip of the iceberg. Moretti’s project is not limited by the frontiers of any given national literature. He takes seriously Goethe’s idea that all literature is now world literature. In theory, anyway, it would be possible to create a gigantic database tracking global literary history.
This would require enormous computational power, of course, along with an army of graduate students. (Most of them getting very, very annoyed as they keypunched data about Icelandic magazine fiction of the 1920s into their laptops.)
My own feeling is that life is much too short for that. But perhaps a case can be made for the heuristic value of imagining that kind of vast overview of how cultural forms spread and mutate over time. Only in part is Moretti’s work a matter of counting and classifying particular works. Ultimately, it’s about how literature is as much a part of the infrastructure of ordinary life as the grocery store or Netscape. And like them, it is caught up in economic and ecological processes that do not respect local boundaries.
That, anyway, is an introduction to some aspects of Moretti’s work. I’ve just learned that Jonathan Goodwin, a Brittain Postdoctoral Fellow at Georgia Tech, is organizing an online symposium on Moretti that will start next week at The Valve.
Goodwin reports that there is a chance Moretti himself may join the fray. In the interim, I will be trying to untangle some thoughts on whether his “distant reading” might owe something to the (resolutely uncybernetic) literary theory of Georg Lukacs. And one of the participants will be Cosma Shalizi, a visiting assistant professor of statistics at Carnegie Mellon University.
It probably wouldn’t do much good to invite Harold Bloom into the conversation. He is doubtless busy reciting Paradise Lost from memory, and thinking about Moretti would not be good for his health. Besides, all the sighing would be a distraction.
I can’t remember when I snapped. Was it the faculty seminar in which the instructor used the phrase “the objectivity, for it is not yet a subjectivity” to refer to a baby? Maybe it was the conference in which the presenter spoke of the need to “historicize” racism, rambled through 40 minutes of impenetrable jargon to set up “new taxonomies” to “code” newspapers and reached the less-than-startling conclusion that five papers from the 1820s “situated African-Americans within pejorative tropes.” Could it have been the time I evaluated a Fulbright applicant who filled an entire page with familiar words, yet I couldn’t comprehend a single thing she was trying to tell me? Perhaps it was when I edited a piece from a Marxist scholar who wouldn’t know a proletarian if one bit him in the keister. Or maybe it just evolved from day-to-day dealings with undergraduates hungry for basic knowledge, hold the purple prose.
At some point, I lost it. I began ranting in the faculty lounge. I hurled the Journal of American History/Mystery across the library, muttered in the shower, and sent befuddled e-mails to colleagues. I’m fine now. Once I unburdened myself, I found I was not alone; lots of fellow academics agree that their colleagues couldn’t write intelligible explanations of how to draw water from the tap. From this was born the Society for Intellectual Clarity (SIC). We intend to launch a new journal, SIC PUPPY (Professors United in Plain Prose Yearnings) as soon as we find someone whose writing is convoluted enough to draft our grant application. (We’re told we should seek recruits among National Science Foundation recipients.)
Until the seed money comes in our journal is purely conceptual, but upon start-up SIC PUPPY will enact the following guidelines for submissions.
Titles: Brevity is a virtue. Titles with colons are discouraged. Any title with a colon, semi-colon, and a comma will be rejected on principle. We accept no responsibility for doodles and exclamatory obscenities scrawled on the returned text, even if you do enclose a self-addressed stamped envelope.
Style: If any manuscript causes one of our editors to respond to a late-night TV ad promising to train applicants for “an exciting career in long-distance trucking,” the author of said manuscript will be deemed a boring twit and his or her work will be returned. See above for doodle disclaimers.
Audience: Hey, would it kill you to write something an undergrad might actually read? If so, please apply for permanent residency in Bora Bora.
Terminology: If any author desires to invent a new term to describe any part of the research, refer to Greta Garbo’s advice on desire in the film Ninotchka: “Suppress it.” There are 171,476 active words in the English language and the authors of SIC PUPPY are confident that at least one of them would be adequate.
Nouns and Verbs: Among those 171,476 words are some that are designated as nouns and others clearly meant as verbs. Do not confuse the two. SIC PUPPY refuses to conference with anyone about this. We have prioritized our objectives.
Thesis: We insist that you have one. If you don’t have anything to say, kindly refrain from demonstrating so. We do not care what Bakhtin, Derrida, Jameson, Marx, Freud, or Foucault have to say about your subject or any other. We’ve read them; we know what they think.
Academic Catfights: The only person who gives a squanker’s farley about literature reviews and historiography is your thesis adviser. We request that you get on with the article and reduce arcane debates to footnotes. The latter should be typed in three-point Wingdings font.
Editing for Smugness: If your article was originally a conference paper and, if at any time, you looked up from your text and smiled at your own cleverness, please delete this section and enroll in a remedial humility course.
No Silly Theories: SIC PUPPY does not care if a particular theory is in vogue; we will not consider silly ones. For example, bodies are bodies, not “texts,” and dogs are dogs; they do not “signify” their “dogginess” through “signifier” barks. While we’re on the subject, we at SIC PUPPY have combed scientific journals to confirm that time machines do not exist. We thus insist that human beings can be postpartum or postmortem, but not postmodern.
Privileging Meaning: We believe that sometimes you’ve got to call it like it is, even if that entails using a label or category. We know that some of you think we shouldn’t privilege any meaning over another. To this we say, “We’re the editors, not you, and we intend to use our privileged positions of power to label those who reject categories ‘ninnies.’ So there!”
Citations: We insist that you use the Chicago Manual of Style for all citations. Not because we love it, but because it annoys us no end to see parentheses in the middle of text we’re trying to read. Why, we read a theory on ellipses (Bakhtin, 1934) just last night describing how English authors (Wilde, 1905; Shaw, 1924) sought to embed Chartist messages (S. Webb, 1891) into....
Complaints: In the course of preparing a journal it is inevitable that typos will appear, that medieval French words will go to print with an accent aigu where an accent grave should have been, and that edits will be made to what you were sure was perfect prose (but wasn’t). Do not call the editors to complain that we’ve humiliated you before your peers and have ruined your academic career. SIC PUPPY will not waste time telling you to get a life; we will direct your call to the following pre-recorded message: “Thhhhhwwwwwwwpt!”
Satire and Irony: To paraphrase the folksinger Charlie King, serious people are ruining our world. If you do not understand satire, or confuse irony with cynicism, go away. Try therapy ... gin ... a warm bath ... anything! Except teaching or writing.
Robert E. Weir is a former senior Fulbright scholar who teaches at Smith College and the University of Massachusetts.
Over the past few days, as perhaps you have heard, it has become more or less impossible to get hold of a copy of "Ready to Die" (1994) -- the classic (and prophetically named) debut album by the Notorious B.I.G., a gangster rapper killed in a shooting in 1997.
Well, perhaps "impossible" is overstating things. But expensive, anyway. Secondhand copies of the CD, recently selling for $6 each on Amazon, now fetch $40; and the price is bound only to go up from there. "Ready to Die" was withdrawn last week after a jury found that one of the tracks incorporated an unlicensed sample from a song originally recorded in 1992 by the Ohio Players -- the band best remembered for "Love Roller Coaster," a disco hit of the late 1970s. (Also, for an album cover featuring a naked woman covered in honey.)
Learning about the court case, I was, admittedly, shocked: Who knew the Ohio Players were still around? The Washington Post called them "funk dignitaries." Somehow that honorific phrase conjures an image of them playing gigs for the American Association of Retired Persons. They will be splitting a settlement of $4.2 million with their lawyers, which probably means a few more years on the road for the band.
Apart from that, the whole matter came very close to being what, in the journalistic world, is called a "dog bites man" story -- a piece of news that is not really news at all. Digital technology now makes it very easy for one musician to copy and modify some appealing element from another musician's recording. Now lawyers hover over new records, listening for any legally actionable borrowing. Such cases are usually settled out of court -- for undisclosed, but often enormous, sums. The most remarkable thing about the "Ready to Die" case is that it ever got to trial.
More interesting than the legal-sideshow aspect, I think, is the question of how artists deal with the situation. Imitation, allusion, parody, borrowing stray bits of melody or texture -- all of this is fundamental to creativity. The line between mimicry and transformation is not absolute. And the range of electronic tools now available to musicians makes it blurrier all the time.
Using a laptop computer, it would be possible to recreate the timbre of Jimi Hendrix's guitar from the opening bars of "Voodoo Chile (Slight Return)" in order to color my own, rather less inspired riffs. This might not be a good idea. But neither would it be plagiarism, exactly. It's just an expedited version of the normal process by which the wealth of musical vocabulary gets passed around.
That, at least, would be my best argument if the Hendrix estate were to send a cease-and-desist letter. As it probably would. An absorbing new book by Joanna Demers, Steal This Music: How Intellectual Property Law Affects Musical Creativity, published by the University of Georgia Press, is full of cases of overzealous efforts to protect musical property. Some would count as implausible satire if they hadn't actually happened: There was, for example, the legal action taken to keep children from singing "This Land is Your Land" at summer camp.
Demers, an assistant professor of music history and literature at the University of Southern California, shows how the framework of legal control over music as intellectual property has developed in the United States. It began with copyright for scores, expanded to cover mechanical reproduction (originally, via player-piano rolls), and now includes protection for a famous performer's distinctive qualities as an icon. Today, the art (or whatever it is) of the Elvis impersonator is a regulated activity -- subject to the demands of Elvis Presley Enterprises, Inc., which exercises control over "not only his physical appearance and stage mannerisms but also the very quality of his voice," as Demers notes. "Impersonators who want to exhibit their vocal resemblance to Elvis can legally do so only after paying EPE."
What the King would say about this is anybody's guess. But as Demers reminds us, it probably wouldn't make any difference in any case: It is normally the corporate "content provider," not the artist, who now has discretion in taking legal action. The process of borrowing and modifying (whether of folk music by classical composers or Bootsy Collins bass-lines by hip-hop producers) is intrinsic to making music. But it is now increasingly under the influence of people who never touch an instrument.
It is impressive that so trim a book can give the reader so broad a sense of how musical creativity is being affected by the present intellectual property regime. The author's note indicates that Demers, apart from her academic duties, serves as "a freelance forensic musicologist" -- one of those professional sub-niches that didn't exist until fairly recently. Intrigued by this, I asked her about it.
The term "is definitely over the top," she admits, "but I can't take credit for it. It just refers to music specialists who assess borrowings and appropriations, sometimes in the courtroom but most often before any lawsuits are filed." The American Musicological Society provides a referral list of forensic consultants, which is where potential clients find her.
She's been at it for three years -- a period coinciding with her first full-time academic post. "As far as I know," she says, "I don't get any credit at USC for this type of work. I'm judged pretty much solely on research and teaching plus a bit of committee work. I do have a few colleagues at USC who've also done this sort of work. It's a nice source of extra revenue from time to time, but as far as I know, there are only two or three folks around the world who could survive doing this alone full-time."
Demers is selective about taking on freelance cases. "Some are legit," she says, "while others are sketchy, so I try to be choosy about which cases I'll take on." At one point, she was contacted "by a person who was putting together a lawsuit against a well-known singer/songwriter for plagiarizing one of his songs. His approach was to begin by telling me how serious the theft was, but he wanted me to commit to working for him before showing me the two songs. Needless to say, we ended up not working together. Most cases, though, are preemptive in the sense that [a] producer or label wants to ensure that materials are 'infringement free' before releasing them."
There is an interesting tension -- if not, necessarily, a conflict -- between her scholarship and her forensic work. "The challenge for me in consulting," as Demers puts it, "is that I have to give advice based on what copyright law currently states. I don't agree with many aspects of that law, but my opinion can't get in the way of warning a client that s/he may be committing an actionable infringement."
In reading her book, I was struck by the sense that Demers was also describing something interesting and salutary. All the super-vigilant policing of musical property by corporations seems to have had an unintended consequence -- namely, the consolidation of a digital underground of musicians who ignore the restrictions and just do what they feel they must. The tools for sampling, altering, and otherwise playing with recorded sound get cheaper and easier to use all the time. Likewise, the means for circulating sound files proliferate faster than anyone can monitor.
As a geezer old enough to remember listening to Talking Heads on eight-track tape, I am by no means up to speed on how to plug into such networks. But the very idea of it is appealing. It seems as if the very danger of a cease-and-desist order might become part of the creative process. I asked Demers if she thought that sounded plausible.
"Yes, exactly," she answered. "I don't want to come out and condone breaking the law, because even in circumstances where one could argue that something truly creative is happening, the borrower risks some pretty serious consequences if caught. But yes, this has definitely cemented the distinctions between 'mainstream' and 'underground or independent' in a way that actually bodes better for the underground than the mainstream. Major labels just aren't going to be attractive destinations for new electronica and hip-hop talent if this continues. And if there is a relatively low risk of getting caught, there are always going to be young musicians willing to break the law."
The alternative to guerilla recording and distribution is for musicians to control their own intellectual property -- for one thing, by holding onto their copyrights, though that is usually the first thing you lose by signing with a major label. "What I like to tell undergrads passing through USC," says Demers, "is that the era of mega-millions-earning stars is really coming to a close, and they can't expect to make large sums of money through music. What they should aim to do is not lose money, and there are several clever ways to avoid this, like choosing a label that allows the artist to retain control over the copyrights."
One problem is that artists often lack a sense of their options. "The situation is better than it used to be," Demers says, "but still, most artists are naive about how licensing works. They come with ideas to the studio and then realize that they must take out a loan in order to license their materials. Labels don't license samples; artists do. And if a lawsuit develops, most of the time, the label cuts the artist loose and says, 'It's your problem.' "
There is an alternative, at least for musicians whose work incorporates recontextualized sound fragments from other sources. "The simple way around this," she continues, is for an artist who uses sampling to connect with "the millions (there are that many) who are willing to let their work be sampled cheaply or for free."
But as Steal This Music suggests, the problem runs deeper than the restrictions on "sampladelia." Had the Copyright Term Extension Act (CTEA) of 1998 been enacted 50 years earlier, you have to doubt that anyone would have dared to invent rock and roll. The real burden for correcting the situation, as Demers told me, falls on the public.
"I am pretty confident that content providers will continue to lobby for extending the copyright term," she says. "The CTEA passed because of the pressure that Disney and Time Warner put on Congress, and was abetted by the fact that the public was largely silent. But we're at a different point than we were in the late 1990s, and organizations like Public Knowledge and Creative Commons and scholars like Lawrence Lessig have done a good job of spreading the word about what extending copyrights does to creativity. Next time Congress has a copyright extension bill in front of it, I hope that voters will get busy writing letters."