If not for the recent online buzz about whether or not President Bush has resumed drinking, most of us never would have heard the allegations. The story was, after all, broken by The National Enquirer – a paper not taken in this household, you may be sure. We are loyal to the Weekly World News instead.
The cover of the Enquirer is always full of the faces and first names of celebrities, very few of which I recognize -- while the reporters at the News do the kind of hard journalistic digging needed to reveal, for example, Saddam Hussein’s efforts to clone dinosaurs for use as weapons of mass destruction. Some years ago, there was an off-Broadway musical inspired by WWN coverage of the amazing saga of the half-human Bat Boy. I’m always keen to read updates about that brave little guy.
But a scoop is a scoop. More interesting than the Enquirer story itself has been the response to it -- not just its prime spot in Slate’s roundup of trash news, but the loud blog feedback, followed by the metacommentary by Jonathan Dresner at Cliopatria, which was remarkably sober. (Didn't see that one coming, did you?)
A decade has passed since the earliest syllabus was prepared for a course called “Tabloid Culture.” Now it’s a regular area of scholarly specialization (with conferences), so it’s hard to know how anybody keeps up with all the secondary literature, let alone the two-headed alien babies.
As it happens, the first paper on cultural studies by an American academic I ever came across -- more than 20 years ago, in fact -- was a pioneering study in the field of tabloid hermeneutics. Stephanie Greenhill presented “The National Enquirer: A Secret Method for the Mastery of Life” at the Southwest Graduate Student Conference in Comparative Literature, held in March 1982 at the University of Texas at Austin. The proceedings were published the following year by UT’s Comp Lit program -- using what appears to have been a very, very early desktop publishing program. The volume provides no information about the contributors. Nor is there any trace of Stephanie Greenhill’s subsequent career as a scholar available online. [See update below.]
But a vague memory of her argument has been at the back of my mind ever since the current Enquirer story began pinging around the blogosphere. It took some digging, but I’ve located my copy of the proceedings and reread Greenhill’s paper. And so, in the spirit of honoring a forgotten pioneer, here is a precis of her work, and an application of it to interpreting “Bush’s Booze Crisis.”
My recollection had it that Greenhill must have been one of the first American academics to draw on the first generation of cultural-studies scholars – the early theoretical work of Stuart Hall and others at the Birmingham Center for Contemporary Cultural Studies in England. (The center closed three years ago.) But in fact, rereading her paper now, I see that Greenhill was actually looking at the tabloid from within a completely different framework: that of folklore.
An interesting irony, given the subsequent fate of each discipline over the following two decades. By the 1990s, the American version of cultural studies was on the rise, while folklore programs were shutting down. If the stereotype had it that someone with a background in cultural studies wore hipster eyeglasses and a complicated haircut, the other field had a much less flattering icon: namely, the Comic Book Store Guy on "The Simpsons," responding to one of Bart’s pranks by saying, “I do not deserve this! I have a Ph.D. in folklore and mythology!”
With understandable frustration, some folklorists have insisted for years that they were doing cultural studies long before anybody thought to call it that. And rereading Greenhill’s paper after all these years, I’m inclined to think they have a case. Her analysis stresses how the Enquirer -- which is, arguably, as debased a piece of mass-produced junk as ever issued by a printing press -- actually replicates some features we associate with oral or traditional forms of culture.
Not that you’d notice it right away, of course. “It can be a very disturbing experience to read the Enquirer,” she writes. “The physical layout encourages the feeling of alienation. One’s eyes are forced to search up and down in order to find everything on the page. One cannot even look only at headline-sized materials to get an overview; there are a number of single-line quotations which force the eye constantly to refocus. Perhaps it is this format, rather than the content, which is the source of the subjective sense that the Enquirer is a fragmenting rather than a communal force.”
But the ads, and perhaps especially the articles, recycle many of the basic themes found in folklore. “Collections such as Flanders and Brown’s Folk-Songs from Vermont,” notes Greenhill, “deal with many of the subjects equally beloved of the Enquirer: illicit love, the bizarre, violence, death, satire, and religion.”
Furthermore, many of the stories in the tabloid’s pages lend themselves to exactly the kind of structuralist analysis that Claude Levi-Strauss performed on myths gathered by anthropologists. In short, they are efforts to resolve binary oppositions such as that between nature and culture, male and female, life and death.
Consider, if you will, “Miracle Baby: Love Overcomes Incredible Odds for Paralyzed Wife and Her Gentle Giant” -- recounting how a very tall bodybuilder and his very small, paralyzed wife created their happy family. Their small child is, as Greenhill puts it, “obviously the symbolic synthesis of the two.... By creating a balance between these opposites, a state of normalcy will result.”
The classical balance typical of the Enquirer may explain my own preference for the rather more surreal landscape of the Weekly World News, in which normality is constantly menaced by (for example) self-replacing androids who “breed like flies.” And it was WWN that revealed that undergraduates aren’t the only ones spending all semester on drinking binges. So do 8 out of 10 of their professors!
And now our Commander in Chief is staggering down the same path. Or so we are told by the Levi-Straussian structuralists at the Enquirer. I think Greenhill’s paper helps clarify some things about the response to this news (if that’s what it is) -- and, in fact, elucidates some things happening between the lines of the story itself.
Her analysis emphasizes that a folkloric work (song, legend, tabloid) serves to help hold a community together. And it can play that role whether or not everyone in the community quite believes it to be literally true. That point has been made more recently, with great force, by the Pennsylvania State University-Hazleton folklorist Bill Ellis, whose papers in the book Aliens, Ghosts, and Cults: Legends We Live (University Press of Mississippi, 2001) deserve to be better known.
To sum up the point as Ellis makes it: The important thing to understand about any form of contemporary folklore (for example, the urban legends constantly making the rounds of e-mail and conversation) is that the debate over its truth or falsity is part of how it circulates. Such folklore helps define the limits of what a community believes. Or rather, what it hopes or fears might be believable.
That certainly applies to the online conversation over “Bush’s Booze Crisis” – in which some very fine hairs have been split over the epistemological question of whether something might be true even if it’s in the Enquirer.
That story can be read as a criticism of the President, in keeping with a populist distrust of power that Greenhill finds operating throughout the tabloid. But it is also, at the same time, a negation of the image of him that has emerged in recent months as someone utterly out of touch with the news about Iraq and Katrina. It shows him, rather, as wounded to the core. What might look like ignorance and indifference are actually the signs that awesome responsibility has left him in unimaginable pain. He is human, all too human.
Does this have any political consequence at all, in the real world? I don’t know. But it does tend to confirm the basic insightfulness of Greenhill’s paper – tucked away in a scholarly publication now forgotten, probably, by everyone except the contributors. And maybe even by them. The final line of her essay sums it up nicely: “The attractiveness of the Enquirer could be that its readers can pick and choose both their tales and their morals from a certain range of possibilities, and know that others are doing the same thing.”
UPDATE: The mystery regarding the fate of the author of the 1982 paper has now been cleared up -- for it turns out that her name was actually Pauline (not Stephanie) Greenhill. I regret the error, without being quite responsible for it. As luck would have it, a glitch from 23 years ago has come back to haunt us.
In an e-mail note, Ms. Greenhill explains that she was indeed the folklore grad student at the University of Texas at Austin "who wrote the article on the National Enquirer, published in the student conference proceedings so many years ago." But she wasn't actually there to present her paper. "My friend STEPHANIE Kane gave it on my behalf, and somehow the editors mixed up our two names.... The conference organisers did put in an errata sheet correcting the mistake, but we all know what happens to those sorts of things."
Today "the not very mysterious" Pauline Greenhill, as she signs herself, is a professor of women's studies at the University of Winnipeg. A list of her scholarly publications since 1993 is available at this Web page.
On Thursday, by appointment, this column makes its rendezvous with the most recent output of the book trade, with university press offerings a speciality -- even though many of the catalogues crossing my desk seem all but divested of scholarly titles. More and more of them seem filled instead with listings for regional cookbooks, detective novels, and photographic albums devoted to the flowers visible from the bicycle trails of state parks, suitable for purchase in the gift shop.
Is this an exaggeration? No, friends, it is not. Serious books do, of course, appear; right now, I am reading several at one go, each of them at least 500 pages long. (No doubt, as Kant said, it was too much work to write a short one.) But even these ponderous tomes are sometimes manifestly worse for wear -- victims of the (rolling and interminable) crisis in academic publishing.
One book from last year is sinking ever lower in my must-read pile simply because there has not been time to set up an appointment with my oculist: It was set in type just slightly larger than that used to list the side-effects of a new drug. The editor of the volume in question says he was startled when he first saw it. And the readers, if any, feel his pain -- a sharp throbbing sensation, after about 20 minutes.
Easier on the eyes, but no less appalling, are the latest reports from the Library and Information Statistics Unit at Loughborough University, in the UK. Every six months, LISU crunches the numbers regarding British and American academic book prices. Since 1987, the statisticians have been compiling and analyzing the prices of books in 64 subject categories that are "closely relevant to acquisitions librarians’ needs." Categories include law, engineering, medicine (both human and veterinary), the various arts, and several branches each of the humanities and the social sciences.
Thanks to LISU's analysis of how prices are varying, librarians in charge of research collections have some sense of how to plan their budgets. Claire Creaser, the deputy director and senior statistician for LISU, has kindly provided me with the latest reports, covering the first six months of this year. (A more extensive presentation of the material is available for sale in CD-ROM format, including masses of data in Excel spreadsheets, if that sort of thing does not terrify you.)
First, the bad news: From 2004 to 2005, the overall average price for an academic book from an American publisher went up 2.2 percent compared to the previous year. "This compares to a slight fall in prices for UK books over the same period," as LISU notes, "and continues the recent trend for prices to rise rather more rapidly in the US than the UK." Indeed, the report focusing on British prices notes that they have gone down 4.8 percent over the past five years. A comparison with the corresponding table for American academic titles shows prices increasing by 35 percent since 1999-2000.
Then, the rest of the bad news: "There is no consistency or pattern in the half-yearly price changes [for US titles] over recent years, which can only make budgets more difficult for librarians." Prices in a few areas have gone down. (The 761 philosophy titles from American presses between January and June were 2 percent cheaper than the previous lot.) But the vast majority of subject categories have shown an increase in price. A graph covering the past two decades shows that, sometime around mid-1996, the average cost of academic books began to shoot ahead of the retail price index. The gap between them is now wider than at any point in LISU’s record.
The figures are, in short, what you’d figure: Not only are scholarly books getting more expensive, but most of them are growing more expensive faster than other commodities. I asked Claire Creaser if her associates had tried to extrapolate from their (by now, enormous) data set. She declined, saying, "We do not analyze trends in any detail, and do not make forecasts." But she did point to a summary of budget trends for UK libraries. Up to a third of the budget for a British academic library may come from US publishers -- a reminder that the increasing price of our scholarly exports does have a global effect.
Meanwhile, closer to home, Intellectual Affairs faces the continuing balancing act of determining how to budget the scarcest of resources -- namely, reading time. It is a common enough problem, of course. But for a journalist, there is the added complication known as the university press publicist -- though, come to think of it, scholars encounter them as well, usually while roaming the exhibit hall at a convention.
Now, in my line of work, eavesdropping on such conversations sometimes actually counts as research. (The discussion may possess anthropological significance.) But actually having a face-to-face with a university press publicist is very often a maddening thing.
A handful of them are genuine book people. They are in love with a few of the new titles, reasonably well informed about the rest, and smart enough to have established sound lines of communication with the acquisitions editors – so that, while talking to them, you just might find out what the press’s big titles for 2008 will be. Such publicists are rare.
As for the rest.... If you ask what the most important, interesting, or otherwise eyestrain-worthy things their press has to offer might be, they will pull out the catalog and -- please understand that I am not making this up -- begin to read it aloud to you, sometimes in a manner suggesting that this is the first time they have had occasion to pay so much attention to it.
The tedium of it for everyone is heartbreaking. The molars grind. The mind wanders. Mine, actually, finds itself trapped in an audiovisual center where someone is screening a video of Leonid Brezhnev in 1974, giving a four-hour speech on the need to increase oatmeal production. Was the man happy? Was his audience? And just how many more pages are there in this catalog, anyway?
Isn't that a bit harsh? Aren't publicists rather low on the chain of command, the service workers of the publishing world? Yes, and yes, respectively. But the scene just described is (allowing for some satirical embellishment) a familiar one, and it does no good for anybody -- not for the press, not for its authors, and certainly not for the larger public.
Nor are there grounds for thinking the arrangement will correct itself. I don't intend to be mean-spirited about it -- but perhaps a frank expression of irritation and dismay would be better than pretending that the lackadaisical status quo has been anything but ridiculous.
It might be as simple as making sure that editors spend more time with publicists, and encouraging them to read some of the annual output. For that matter, people new to the publicist's trade could be encouraged to talk with the authors. My sense is that really good publicists do all of this anyway, just for the pleasure of it. But that doesn't mean the skills and habits can't be taught. It would be to everyone's advantage if they were.
There must be a better way. And one possibility that has emerged comes to my attention via Colleen Lanick, the publicity manager for MIT Press, who is one of the good ones. Earlier this week, she pointed out the new MIT PressLog and a similar blog run by Oxford University Press.
To find out if any other academic publishers had climbed aboard this particular bandwagon, I contacted Brenna McLaughlin, the communications manager for the Association of American University Presses. She mentioned the one maintained by Cork University Press, in Ireland. But otherwise, Oxford and MIT seem to be in the vanguard -- though McLaughlin says that AAUP itself now has a restricted-access blog for its members under development.
An interesting development, then -- if only for the timing, given the recent wave of anguish over the danger that a reputation for blogging might pose to an academic on the job market. (The rather hysterical tone prevailing in some quarters calls to mind what sociologists call "moral panic.") But the iron cage of bureaucracy is, after all, a strange thing: Today, there are timid souls who worry that a prospective colleague's blog might be a record of torrid threesomes indulged while plotting to assassinate the dean. Tomorrow they will be retired, or laughed off campus -- whereupon blogging might well become mandatory, rather than forbidden. Stranger things do happen.
With the shrinking space for coverage of books in the mainstream media, it’s understandable that academic presses might seize on the blog form as a venue to push their product. "We've been tracking the blog activities of our authors and various postings about our books for some time," Colleen Lanick of MIT told me. "So we thought it might be an interesting experiment to try this out in some form, ourselves....We are encouraging our authors to write original pieces for the log about how their work relates to current events."
Sanford Thatcher, the director of Penn State University Press, sounds a lot more skeptical of the idea of blogs run by academic publishers. Joining the conference call was Tony Sanfillipo, the marketing and sales director for the PSU Press -- who, like Thatcher, is lately in the midst of dealing with the Google Print Libraries Project. They aren’t technophobes, but that doesn’t make them blog boosters.
"When people think of blogging," said Thatcher, "they think for the most part of political blogs, of argument. The Oxford and MIT blogs look very, very commercial. Wrigley Spearmint Gum has its own blog, if I remember correctly, but I’m just not sure it makes commercial sense." He suggests that the Books for Understanding project, sponsored by the AAUP, might be a better example of how academic publishers can make the public aware of their titles. The project offers bibliographies of university-press titles that are relevant to current events.
"The blog entries [for MIT and Oxford] don’t look that different from catalog copy to me," said Sanfillipo. "They don’t really engage the reader the way a blog does. You don’t see any trackbacks or comments for the entries, or not at least not many."
A fair point, though the sites are new and very much under development. Brenna McLaughlin from the Association of American University Presses says that the phenomenon is in its infancy. "It’s having its time," she said. "We’re just starting to realize the need to reimagine traditional publicity and communications work. That means you have to face the staffing issue: How much time do people have for this at a small press? But I do believe it will become more common."
With any luck, the chance to "reimagine traditional publicity and communications work" will include a careful evaluation of the tendency to read university press catalog copy out loud to functionally literate adults. (If so, I’m all for it.) Not that audio doesn’t have its place. Colleen Lanick mentions that MIT Press is now "working on developing a podcasting feature for the log where we can conduct interviews with authors and they can read segments of their books."
For a moment, you can envision a world in which more people become interested in academic press titles -- so they begin to sell better, and maybe the prices go down a bit, since the publishers don’t have so many copies returned from Borders.... And then comes news of a survey in England of "taxi drivers, pub landlords and hairdressers -- often seen as barometers of popular trends" -- which found that nearly 90 percent had no idea what a podcast is and more than 70 percent had never heard of blogging. When asked about the latter term, many people thought the questioner was referring to “dogging” instead.
“Dogging,” as the Reuters news agency explains, “is the phenomenon of watching couples have sex in semi-secluded places such as out-of-town car parks. News of such events are often spread on Web sites or by using mobile phone text messages.”
Moral panic or no, you probably won’t get hired if the faculty search committee thinks you are dogging.
Two years ago, The Virginia Quarterly Review published an essay called "Quarterlies and the Future of Reading." The author, George Core, has been the editor of The Sewanee Review since 1973 -- at which time, it was already a venerable institution, one of the oldest publications of its kind in the United States.
By "publications of its kind," I mean the general-interest cultural quarterlies, usually published by universities or liberal-arts colleges. Unlike scholarly journals, they aren't focused on a particular field. Usually they offer a mixture of contemporary poetry and fiction with essays that are learned but nonspecialist.
It pays to be explicit about that, because so few people keep track of the university quarterlies now. Many of them are still around, with a modest if reliable subscriber base in the libraries. But it often seems as if they continue just by inertia.
It is always a sentimental gesture to speak of a golden age. But what the hell: The years between, say, 1925 and 1965 were a glorious time for such journals. Then, as now, their circulations were usually modest. But the worlds of publishing and of the university were smaller, and the quarterlies had a disproportionately large role to play. Truman Capote once mentioned in an interview how he knew he had "arrived" as a young writer in the 1940s: On the same day, he received two or three letters from the editors of quarterlies accepting his short stories for publication in their pages.
In his essay, two years ago, Core insisted that "the literary quarterly ... has been the linchpin of civilization since the 18th century." And if things did not look encouraging at the dawn of the 21st century ... well, so much the worse for what that says about civilization. "The average librarian these days, like many members of various departments in the humanities," wrote Core, "has become hostile to books and hostile to reading." Needless to say, technology is to blame.
It is hard to know what to make of the fact that Core's essay is now available online. I mean, sure, you can read it that way, but would he really want that?
But cantankerousness in defense of the quarterly is no vice. Nor, for that matter, is it a virtue to overstate how much the university-based general-interest periodical has declined. The situation may not be good, but it is not quite catastrophic.
Core's Sewanee Review is hopelessly out of touch with many trends in contemporary literary studies -- which is one reason it is still worth reading. But The Minnesota Review is very much in touch with developments in cultural theory, while also publishing a good deal of poetry, fiction, and personal essays. I've been reading it with interest for 25 years now (during which time it has never actually been published or edited in Minnesota). The Common Review, published by the Great Books Foundation, occupies a niche somewhere between the old-fashioned university quarterly and magazines such as Harper's and The Atlantic Monthly.
The list could go on -- and if it did, would have to include such non-academic, sui generis publications as N+1, which I've been urging upon the attention of startled bystanders ever since seeing the prototype pamphlet that appeared in advance of its debut, not quite two years ago. A third issue is now at newsstands. (See also the magazine's Web site.)
And if you want to see some interesting and successful experiments in updating the whole format, keep an eye out for Boston Review and The Virginia Quarterly Review.
The September/October issue of Boston Review marks its 30th anniversary. Coming out six times a year, and as a tabloid, BR might at first seem to bear little resemblance to the university quarterlies of any era. But those differences are superficial. The mixture of political, philosophical, and literary discussion calls to mind the early Partisan Review, the most agenda-setting of the quarterlies published at mid-century.
Boston Review's anniversary issue contains a 12-page anthology of poems and essays organized by year -- including work by (to give a partial list) John Kenneth Galbraith, Adrienne Rich, Rita Dove, Ralph Nader, George Scialabba, and Martha Nussbaum. (Somebody on that list is bound to interest or agitate you.) One of the selections for 1993 comes from an essay by Christopher Hitchens called "Never Trust Imperialists." It would have been interesting to hear the editorial discussion that resulted in that one being included.
As for The Virginia Quarterly Review, it has come under a new editor, Ted Genoways, who seems to have ignored entirely the worries expressed in George Core's ruminations on the state of the quarterly. In its new incarnation, VQR is colorful, topical, even a bit flashy -- with the latest issue offering a gallery of photographs from Vietnam as well as a pull-out comic book by Art Spiegelman, along with poetry, fiction, a play, and several essays.
Some of the once-vital quarterlies ended up becoming so polite and reserved that their audiences did not so much read them as search each issue for signs of a pulse. You certainly don't have that problem with Boston Review or VQR. Recent discussion of standards for "public scholarship" has emphasized the possibility of creating venues "at the interface of campus and community."
Arguably, that is what the quarterlies and reviews always were -- and their revitalization now can only be a good sign.
It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But at the time, standing in line at a neighborhood bookstore a couple weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.
"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)
Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."
False alarm! Sure, the appearance of the Fuhrer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Fuhrer's originality as a thinker or competence as a writer.
About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)
Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: The Rise of the Germanic Ideology.
But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in a prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: The immigration restrictions imposed in the United States in the 1920s.)
Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.
So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."
The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.
"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."
How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.
At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.
But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a real disservice that How to Read Hitler nowhere mentions the earliest effort by someone writing in English to analyze the language and inner logic of Mein Kampf -- the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)
Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary criticism that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.
But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.
This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."
Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much the two books overlap in method and implication. In 1941, Burke reprinted the essay in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."
"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."
As a technology specialist working in the humanities, I sometimes find it hard not to feel like a walking contradiction.
As an academic blogger of more than two years' standing, I'm accustomed to near-instantaneous feedback on my writing and to networked conversations that can spring up overnight and disappear just as quickly. As an academic bound to certain research expectations, though, I find that conversational cycles still occur at the speed of print, unfolding in some cases over a period of years. Blogging is less about occupying a different space than about working to a different rhythm, a difference that can be difficult to explain to those who don't do it.
As a blogger, I subscribe to more than 100 RSS feeds, syndicated content from the blogs of friends and acquaintances, news sites, and filter sites for particular topics. As an academic, though, I struggle to keep current with a much smaller number of journals in my field (rhetoric and composition), and this even though most of them only publish two to four times a year. Undoubtedly, this is in part because of the difference between "work" reading and "play" reading; it's far easier (and usually more pleasant) to skim a handful of blog entries than it is to give sustained attention to a journal article, after all.
But it's more than that. Although the quantity and quality of writing that I read online almost certainly differs from the scholarly reading I do, I would argue that the biggest change is that I practice reading differently. And this is a truth that, traditionally, disciplines in the humanities have been slow to accept. We are still prone to thinking of technology as something added to what are already substantial professional duties, instead of conceiving of it as a way of approaching those duties differently.
I've had opportunity in recent months to reflect on my various reading practices. In the spring of 2005, I was named the associate editor of my field's flagship journal (College Composition and Communication, or CCC) and given the task of rethinking and redesigning its Web site, a task that would potentially bring together these separate domains. Most of the journals in my field have Web sites, and their quality tends to vary even more widely than the journals themselves.
What almost all of these sites have in common, though, is their central mission, which is to mirror their respective print journals. In other words, most of these sites provide little content beyond what is already available in the journals themselves (and in some cases much less, or only after a significant lag).
The fact of the matter is that I don't think of these mirror sites as part of my online reading. There are times when it is convenient to look up a piece of information online, but only rarely do I end up at a journal Web site when I do. Although I may not be typical in this regard, I use these sites so infrequently that they seem little more than a cursory obligation added to the workload of the journals' editors. As I began the task of rethinking CCC Online, then, one of my chief motivations was to practice what I've preached above, to conceive of the site as a way for my colleagues to experience the journal in different and (ideally) productive ways, rather than as a mere repository mirroring the print version.
All three members of the development team (myself and two research assistants) are active academic bloggers, and ultimately, it has been our experience with blogging, and with social software more generally, that has helped us in this process. In discussions of blogs, the tendency is to focus on content, on the identities of bloggers, and on the networking that occurs within and among these sites. In a more formal sense, though, blog platforms are simply content management systems, databases designed with an eye toward the kinds of publications that we now think of as blogs. Most important, perhaps, they are scaled for personal use.
In the case of CCC Online, we use Movable Type for the site's infrastructure, which allows us to duplicate a number of the features typical to large-scale databases. We provide a dedicated search engine, for instance, and can notify subscribers about updates to the site, over e-mail and RSS.
While we have taken advantage of some of the generic features of blogs, we haven't turned the journal's Web site into a blog. The problem we're addressing isn't a dearth of information, but an overload. With the publication of a new issue of CCC, we add an entry for each article, and that entry contains metadata about the article: abstract, keywords, bibliography, and a permalink for the article itself (which remains password-protected for subscribers only). At the same time, we are slowly adding similar data for the journal's back issues.
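The per-article entry described above amounts to a small structured record. As a minimal sketch in Python (the field names here are my own illustration, not the journal's actual schema), each entry might look like this:

```python
from dataclasses import dataclass

@dataclass
class ArticleEntry:
    """Metadata record for one journal article (illustrative fields only)."""
    title: str
    authors: list[str]
    abstract: str
    keywords: list[str]       # author-supplied and textually generated
    bibliography: list[str]   # works the article cites
    permalink: str            # full text remains password-protected

entry = ArticleEntry(
    title="An Example Article",
    authors=["A. Scholar"],
    abstract="A short summary of the argument.",
    keywords=["rhetoric", "composition"],
    bibliography=["An Earlier Article (CCC 55.2)"],
    permalink="https://example.org/ccc/56/1/example",
)
```

Storing each article as a record like this is what makes the features below (search, citation links, tagging) possible, since every one of them is just a different query over the same metadata.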
Building a centralized archive for the journal's metadata allows us to do a number of things:
It makes the contents of the journal available not only to search engines, but to bookmarking services like del.icio.us, CiteULike and others.
It allows us to include internal links to the articles, so that users can follow up on citations of other CCC articles.
Through the use of bi-directional trackback links, we also include "Works Citing," links to essays that have taken up (and cited) a particular article.
Finally, we use del.icio.us, a social bookmarking application, to offer users an associative interface for browsing the journal.
This last feature deserves a little more explanation. Providing a static page for each journal article allows our colleagues to bookmark those pages (and to tag them with their own keywords) in an application like del.icio.us. What we've done is to open our own del.icio.us account, and to bookmark each article, tagging them with both author-supplied and textually-generated keywords. The result is an associative network that goes beyond citation to connect articles on similar topics.
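That associative network can be pictured as an inverted index from tags to articles, with two articles counting as related when they share at least one tag. A hypothetical sketch follows (the site itself relies on del.icio.us for this; the data here is invented):

```python
from collections import defaultdict

# Each article's tag set (hypothetical data standing in for del.icio.us bookmarks).
tags = {
    "Article A": {"rhetoric", "pedagogy"},
    "Article B": {"rhetoric", "assessment"},
    "Article C": {"technology", "pedagogy"},
}

# Invert the mapping: tag -> set of articles carrying that tag.
index = defaultdict(set)
for article, article_tags in tags.items():
    for tag in article_tags:
        index[tag].add(article)

def related(article):
    """Return articles sharing at least one tag with the given one."""
    others = set()
    for tag in tags[article]:
        others |= index[tag]
    others.discard(article)
    return sorted(others)

print(related("Article A"))  # → ['Article B', 'Article C']
```

Unlike citation links, which only connect articles that explicitly reference one another, this kind of tag overlap surfaces connections between articles that may never cite each other at all.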
In the case of each of these features, our focus is not on adding new content, but rather on devising ways for users to approach and/or manage the content we are already generating as a discipline. The principle behind each of these features is the same: we've tried to focus site development around the question of how the site might be used instead of what information the site contains.
CCC Online is in many ways still a mirror site, but it's a mirror that can be manipulated in a variety of ways, offering our colleagues different perspectives on the journal's content, perspectives that are impossible to duplicate in print. We've worked to make the site as productive as possible, integrating more efficient management of the journal's content with opportunities for exploration and invention.
Depending on the discipline or profession where you find yourself, our site may seem either painfully obvious or distressingly opaque. For our colleagues in the humanities, though, we hope it serves as a model of what we might accomplish by turning to social software to inform (and improve) our practices as readers and writers.
In addition to managing CCC Online, Collin Brooke is an assistant professor of rhetoric and writing at Syracuse University, where he also directs the composition and cultural rhetoric doctoral program.
Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, who is a professor of sociology at the University of California at Berkeley, has fished documents out of the archive with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.
For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.
It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker punch. It alludes, of course, to Jehovah's selection of the Jews as the Chosen People; it is also a term sometimes used, with a sarcastic tone, as an ethnic slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due to my lack of Professor Karabel's good manners.)
The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."
In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.
Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")
Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of the then-beloved college novel Stover at Yale, about a clean-cut and charismatic Yalie named Dink Stover.
But by World War I, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled from its prewar level.
At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).
There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."
Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."
That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."
Now, there is a case to be made for a certain degree of flexibility in admissions criteria. If anything, given our reflex-like tendency to treat diversity as an intrinsic good, it seems counterintuitive to suggest otherwise. There might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.
But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. First of all, it never prevailed to begin with.
A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.
What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.
"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."
That arrangement allowed for adaptation to social change -- not just by restricting applicants of one minority status in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.
"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."
Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."
Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.
"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford....Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.
All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it's hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (This would have to include vigorous and fairly expensive campaigns of recruitment and retention.)
As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)
"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, "neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."
How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....
Well, life is unfair. But the system isn't.
"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."
The rule of law is to the routines of an ordinary, civilized existence roughly what oxygen is to long division. It's not part of the equation, but you can't make the calculations without it. Not that it guarantees justice. There may be lapses, omissions, misfires; and if the laws themselves are bad, then the rule of law can bring misery. And there's more to life than regularity. Profound truths may be revealed by transgression, charismatic authority, and ecstatic excesses embodying the creative and destructive dimensions of poetry, mystery, and the sacred. (That said, I'd still rather live in a city with zoning ordinances.)
In The Law in Shambles -- just published in the Prickly Paradigm series, distributed by the University of Chicago Press -- Thomas Geoghegan offers an incisive criticism, from the left, of the idea that the expression "rule of law" is at all appropriate to the way we live now. His booklet is conversational, wide-ranging, and absolutely terrifying. It deserves a wide readership.
In saying that Geoghegan's perspective comes "from the left," I've made room for misunderstandings that should be cleared up right away. First of all, he's not denouncing the whole concept of rule of law as a more or less streamlined way of carrying out the "golden rule" of capitalism, that he with the gold makes the rules. (That's the paleo-Marxist position. Some of the International Socialist Organization activists on your campus might make this argument.) Nor is Geoghegan criticizing actually existing constitutional democracy (as we might call it) from the vantage point of some "original position" of fairness, A Theory of Justice-style.
The author is a labor lawyer (though he has also been a fellow at the American Academy in Berlin). He's arguing from his own court cases, and from perceived trends -- not from first principles. He once loved the work of John Rawls, and the dream is not quite dead; but really, that was a long time ago. "Ever since he wrote that book," Geoghegan says, "it's as if someone with a voodoo doll put a hex on his whole approach."
No, Geoghegan's criticism is less abstract, more crunchy. It's not just that respect for the awesome majesty of the law is now largely pro forma. Rather, in important regards the whole edifice has been gutted; before long, there won't even be any nails holding the facade together.
The increasingly robust and strident contempt for the judiciary expressed by the American right is only part of it, if the most bewildering for anybody who remembers the old conservative motto of "law and order." Now the emphasis is just on order, plain and simple. And not in the sense conveyed by Jack Webb's no-nonsense demeanor on Dragnet. More like Joseph de Maistre's rhapsody over the hangman's role as cornerstone of civilization.
Which is worrisome, no doubt about it. But Geoghegan is more concerned about the low-key, day-to-day degradations of the rule of law. Consider, for example, the case of the rat turds. Geoghegan worked on a brief for workers who had lost their jobs when a chicken-processing plant shut down -- suing on their behalf under the Worker Adjustment and Retraining Notification (WARN) Act, which requires that a factory owner give employees 60 days' notice that a plant will be shut down. To this, the chicken-processing guy had a ready answer: He had been shut down by the Department of Agriculture for health-code violations -- an unforeseen contingency, he said.
He had an argument, says Geoghegan: "Yes, he may have done bad things, and let rats run wild, and let rats shit on the chicken meat. And yes, it is even true that the inspectors of the Department of Agriculture gave him 'write-ups.' But here is the issue: Was it reasonable for the owner to foresee that the DOA would enforce its own regulation?" After all, everybody in the business knows that you get the write-up and pay the fine.
"That," as Geoghegan says, "was his claim: DOA is more or less a joke. Under Bush I, then Clinton, and then Bush II, it's gotten worse.... Now comes the ruling of the district judge, who is a liberal, a Clinton appointee: Yes, he says, it was unforeseeable. It was as if he took judicial notice that, as a matter of common knowledge, the government does not enforce the laws.... In other words, the application of the rule of law is the equivalent of an 'act of God.' Like a hurricane."
Behind such incidents, the author sees at work an intricate play of mutually reinforcing tendencies. One is the weak and shrinking labor movement. Another is the long-term decline in voter turnout (thereby eroding any checks on wingnuttery that may exist within the ideological sphere). And then there is the systematic underfunding of enforcement for whatever regulations of the economic sphere are still on the books. The legal center of gravity of civil society shifts from contract (with its schedule of benefits and obligations) to tort (so that the only right that matters is the one to collect on damages).
Some of this is familiar, but ne'er so well expressed. Were there a serious movement in this country to challenge the present course of things, The Law in Shambles would be available in a grubby newsprint version selling for 10 cents, and distributed by the hundreds of thousands. Instead, you have to pay 10 bucks for it. This is not good.
Sometimes our tools are our politics, and that’s not always a good thing. Last week, the Copyright Clearance Center announced that it would integrate a “Copyright Permissions Building Block” function directly into Blackboard’s course management tools. The service automates the process of clearing copyright for course materials by incorporating it directly into the Blackboard tool kit; instructors post materials into their course space, and then tell the application to send information about those materials to CCC for clearance.
For many, this move offers welcome relief from the confusion currently surrounding the issue of copyright. Getting clearance for the materials you provide to your students, despite the help of organizations like CCC, is still a complicated and opaque chore. Instructors either struggle through the clumsy legal and financial details or furtively dodge the process altogether and hope they don't get caught. With the centralization offered by CCC and now the automation offered by this new Blackboard add-on, the process will be more user-friendly, comprehensive, and close at hand. As Tracey Armstrong, executive vice president for CCC, put it, "This integration is yet another success in making the 'right thing' become the 'easy thing.'"
Certainly, anything that helps get intellectual resources into the hands of students in the format they find most useful is a good thing. I have no doubt that both the CCC and Blackboard genuinely want the practical details of getting course materials together, cleared, and to the student to be less and less an obstacle to actually teaching with those materials. But I’m skeptical of whether this “easy thing” actually leads to the “right thing.” Making copyright clearance work smoothly overlooks the question of whether we should be seeking clearance at all -- and what should instead be protected by the copyright exception we’ve come to know as “fair use.”
Fair use has been the most important exception to the rules of copyright since long before it was codified into law in 1976, especially for educators. For those uses of copyrighted materials that would otherwise be considered an infringement, the fair use doctrine offers us some leeway when making limited use for socially beneficial ends.
What ends are protected can vary, but the law explicitly includes education and criticism -- including a specific reference to “multiple copies for classroom use.” It’s what lets us quote other research in our own without seeking permission, or put an image we found online in our PowerPoint presentations, or play a film clip in class. All of these actions are copyright violations, but would enjoy fair use protection were they ever to go to court.
But there is a dispute, among those who dispute these kinds of things, about exactly why it is we need fair use in such circumstances. Some have argued that fair use is a practical solution for the complex process of clearing permission. If I had to clear permission every single time I quoted someone else’s research or Xeroxed a newspaper article for my students -- figuring out who owns the copyright and how to contact them, then gaining permission and (undoubtedly) negotiating a fee -- I might be discouraged from doing so simply because it’s difficult and time-consuming. In the absence of an easy way to clear copyright, we have fair use as a way to “let it slide” when the economic impact is minimal and the social value is great.
Others argue that fair use is an affirmative protection designed to ensure that copyright owners don’t exploit their legal power to squelch the reuse of their work, especially when it might be critical of their ideas. If I want to include a quote in my classroom slides in order to demonstrate how derivative, how racist, or maybe just how incompetent the writer is, and copyright law compelled me to ask the writer’s permission to do it, he could simply say no, limiting my ability to powerfully critique the work. Since copyright veers dangerously close to a regulation of speech, fair use is a kind of First Amendment safety valve, such that speakers aren’t restricted by those they criticize by way of copyright.
This distinction was largely theoretical until organizations like CCC came along. With the help of new database technologies and the Internet, the CCC has made it much easier for people to clear copyright, solving some of the difficulty of locating owners and negotiating a fair price by doing it for us. The automatic mechanism being built into Blackboard goes one step further, making the process smooth, user-friendly, and automatic. So, if fair use is merely a way to account for how difficult clearing copyright can be, then the protection is growing less and less necessary. Fair use can finally be replaced by what Tom Bell called “fared use” -- clear everything easily for a reasonable price.
If, on the other hand, fair use is a protection of free speech and academic freedom that deliberately allows certain uses without permission, then the CCC/Blackboard plan raises a significant problem.
The fact that the fair use doctrine explicitly refers to criticism and parody suggests that it is not just for when permission is difficult to obtain, but for when we shouldn’t have to ask permission at all. The Supreme Court said as much in Campbell v. Acuff-Rose (1994), when Justice Kennedy in a concurring decision noted that fair use “protects works we have reason to fear will not be licensed by copyright holders who wish to shield their works from criticism.” Even in a case in which permission was requested and denied, the court did not take this as a sign that the use was presumptively unfair. Fair use is much more than a salve for the difficulty of gaining permission.
Faculty and their universities should be at the forefront of the push for a more robust fair use, one that affirmatively protects “multiple copies for classroom use” when their distribution is noncommercial, especially as getting electronic readings to students is becoming ever cheaper and more practical.
Automating the clearance process undercuts the possibility of invoking fair use -- and, more importantly, of challenging its slow disintegration. Even if the Blackboard mechanism allows instructors simply not to send their information to CCC for clearance (and it is unclear whether it is, or eventually could become, a compulsory mechanism), the simple fact that clearance is becoming a technical default means that more and more instructors will default to it rather than invoking fair use.
The power of defaults is that they demarcate the “norm”; the protection of pedagogy and criticism envisioned in fair use will increasingly deteriorate as automatic clearance is made easier, more obvious, and automatic. This concern is only intensified as Blackboard, recently merged with WebCT, continues to become the single, dominant provider of course management software for universities in the United States.
Technologies have politics, in that they make certain arrangements easier and more commonplace. But technologies also have the tendency to erase politics, rendering invisible the very interests and efforts currently working to establish “more copyright protection is better” as the accepted truth, when it is far from it.
As educators, scholars, librarians, and universities, we are in a rarefied position to fight for a more robust protection of fair use in the digital realm, demanding that making “multiple copies for classroom use” means posting materials into Blackboard without needing to seek the permission of the copyright owners to do so.
The automation of copyright clearance now being deployed will work against this, continuing to shoehorn scholarship into the commercial model of information distribution and to erase the very question of what fair use was for -- not by squelching it, but simply by making it easier not to fight for it and harder to even ask if there’s an alternative.
Tarleton Gillespie is an assistant professor in the Department of Communication at Cornell University, and a Fellow with the Stanford Law School Center for Internet and Society.
It's only October, but already you can feel the nip of holiday commercialism in the air. That's especially true at the big chain stores for cultural goods, where the public-domain Dickens books and the discount CDs of Bing Crosby are now on display, priming the pump for seasonal cheer.
And making your way to the checkout counter, you might notice a new title positioned for maximum impulse-buying convenience: A small book called Festivus: The Holiday for the Rest of Us, by Allen Salkin, published by Warner Books.
Late last year, Salkin wrote an article for The New York Times about how some people now celebrate the "Seinfeld"-spawned faux tradition. More precisely, they (or rather, we) invite friends over to Festivus gatherings in early December -- in lieu of the regular Christmas, Hanukkah, or Kwanzaa parties. In the course of his reporting, Salkin learned about the Festivus party my wife and I have held in early December for some years now. He gave me a ring to discuss it.
Evidently this interview took place not long after I had downed a large cup of strong coffee -- for I distinctly recall doing a prolonged riff on how Festivus was a postmodern variant of the British social historian Eric Hobsbawm's concept of "invented tradition." This is an exercise known as "bullshitting." You could read a book about it.
None of my improvisation, alas, ended up in Salkin's article. (Nor was there any reference to my effort to add to the Festivus traditions by making the song "Now I Wanna Be Your Dog" by Iggy and the Stooges into a carol.)
Anyway, a couple of months after the piece ran, Salkin was back in touch. He had just gotten a contract to do a book on Festivus, and wondered if I might write up certain aspects of my rant for inclusion as a short essay.
Well, the book is now out. And the essay is in there ... but now in a form much abbreviated. The reference to Eric Hobsbawm, for example, has been removed. (A grievous omission, though it's possible that the great man would prefer it that way.) Some degree of cutting is to be expected. But what did come as a surprise was, rather, the addendum: A sarcastic little item running alongside the piece, scoring easy points off its "overintellectualization" of the holiday. (As though that were not a tendency the essay itself is mocking.)
It seems, in short, like a very curious way to repay someone who contributed his work for free. Then again, free-floating rancor was always the dominant tone on "Seinfeld."
In any case, I've retained rights to the essay, and am running the full text of it here, in the hope that this version be considered definitive by scholars in the field of Festivus studies. If any...
Each year, my wife and I invite friends to gather around the aluminum pole -- or at least the place it would be, if we ever got around to buying one -- and discuss the True Meaning of Festivus. Of course it's gotten so commercialized now. But Festivus is here to stay. After long cogitation (too long, probably) I've concluded that there is more to it than an excuse for non-religious seasonal holiday. Festivus is the postmodern "invented tradition" par excellence.
Admittedly, the phrase "postmodern 'invented tradition' " is something of a mouthful, but there is a more or less serious historical argument behind it. Let's see if I can make it with a straight face.
Once upon a time -- let's call this "the premodern era" and not get too picky about dates -- people lived in what we now think of as "traditional societies." Imagine being in a village where few people are literate, everybody knows your name, and not many people leave. A place with tradition, and plenty of it, right?
Well, yes and no. There are holidays and rituals and whatnot. As spring draws near, everybody thinks, "Time for the big party where we all eat and drink a lot and pretend for a few days not to notice each other humping like bunnies." (That one was a big hit even before New Orleans was on the map.) And yet people don't say, "We do X because it is our tradition." You do X because everybody else around here does it -- and as far as you know, they always have. Not doing it would be weird, almost unimaginable.
But then, starting maybe 300 years ago, things got modern. We tend to imagine that profound cultural dislocations (from war, industrialization, the global marketplace, yadda yadda yadda) only kicked in within recent decades. That's just because our attention spans are so short. Well before Queen Victoria planted her starchy skirt upon the throne, people were nostalgic for the old days.
And so, according to the British historian Eric Hobsbawm, they started inventing traditions from bits and pieces of the past. In the 19th century, for example, folks started singing "traditional Christmas carols" -- even though, for a couple of hundred years, they had celebrated the holiday with pretty much the same hymns they sang in church the rest of the year.
In short, if you say, "We do X because it's traditional," that is actually a pretty good sign that you are modern. It means you have enjoyed (and/or endured) a certain amount of progress. What you are really saying, in effect, is, "We ought to do X, even though we sort of don't actually have to." There is a world you have lost. Tradition is a way of imagining what it must have been like.
Postmodernism is what happens after you've been modern so long that "being modern" doesn't seem all that special -- but at the same time, you don't feel like "being traditional" is all it's cracked up to be, either. And you start putting things in quotation marks all the time.
Does that sound familiar? I could cite a bunch of stuff here about "the decline of metanarratives" and "the simulacrum." But if you're a "Seinfeld" fan, you've had a pretty good taste of pomo without the theory.
What makes Festivus a postmodern invented tradition is that it comes straight out of the mass media, without any moorings in a vague sense of reviving something lost or forgotten. Nobody ever felt a yearning to celebrate it. Frank Costanza just makes the holiday up, and all the "traditions" that go with it. It's hyper-individualistic -- the perfect holiday for the culture of narcissism. The beauty of the Festivus celebration is that it lays bare all the stuff that you have to squelch just to get through the holiday season.
We gather with family at Christmas or Hanukkah in order to recapture the toasty warmth of community and family. And because, well, we have to. So you'd best bite your tongue.
During Festivus, by contrast, all the vague hostility of enforced togetherness gets an outlet. You have a chance to air your grievances -- and to pin the head of the household to the floor, if you can. It's hard to get sentimental about an aluminum pole. But as long as there are midwinter holidays, the spirit of Festivus will fill the air.
Rick Perlstein, a friend from the days of Lingua Franca, is now working on a book about Richard Nixon. Last year, he published a series of in-depth articles about the Republican Party and the American conservative movement. (Those are not quite the same thing, though that distinction only becomes salient from time to time.) In short, Perlstein has had occasion to think about honesty and dissimulation -- and about the broad, swampy territory in between, where politicians finesse the difference. As do artists and used-car salesmen....
It’s the job of historians to map that territory. But philosophers wander there, too. “What is truth?” as Nietzsche once asked. “A mobile army of metaphors, metonymies, anthropomorphisms. Truths are illusions of which one has forgotten that they are illusions.” Kind of a Cheneyo-Rumsfeldian ring to that thought. It comes from an essay called “On Truth and Lie in an Extra-Moral Sense,” which does, too, come to think of it.
So anyway, about a week ago, Rick pointed out a recent discussion of how the Bush Administration is dealing with critics who accuse it of fudging the intelligence that suggested Saddam Hussein had weapons of mass destruction. The link went to a comment by Joshua Micah Marshall, who is a liberal Democrat of the more temperate sort, not prone to hyperventilation.
“Garden variety lying is knowing it’s Y and saying it’s X,” he wrote, giving Lyndon Johnson on the Gulf of Tonkin as an example. The present executive branch, he continued, shows “a much deeper indifference to factual information in itself.”
Rick posed an interesting question: “Isn't Josh Marshall here describing as the Administration's methodology exactly what that Princeton philosophy prof defines as ‘bullshit’?” That prof being, of course, Harry Frankfurt, whose short and best-selling treatise On Bullshit will probably cover everyone’s Christmas bonus at Princeton University Press this year.
In February, The New York Times beat us by a day or so with its article on the book, which daintily avoided giving its title. But "Intellectual Affairs" first took a close look, not just at Frankfurt’s text -- noting that it remained essentially unchanged since its original publication as a scholarly paper in the 1980s -- but at the philosophical critique of it presented in G.A. Cohen’s essay “Deeper into Bullshit.”
Since then, the call for papers for another volume of meditations on the theme of bull has appeared. Truly, we are living in a golden age.
The gist of Frankfurt’s argument, as you may recall, is that pitching BS is a very different form of activity from merely telling a lie. And Marshall’s comments do somewhat echo the philosopher’s point. Frankfurt would agree that “garden variety lying” is saying one thing when you know another to be true. The liar operates within a domain that acknowledges the difference between accuracy and untruth. The bullshitter, in Frankfurt’s analysis, does not. In a sense, then, the other feature of Marshall’s statement would seem to fit. Bullshit involves something like “indifference to factual information in itself.”
So does it follow, then, that in characterizing the Bush team’s state of mind three years ago, during the run-up to the war, we must choose between the options of incompetence, dishonesty, and bullshit? Please understand that I frame it in such terms, not from any political motive, but purely in the interest of conceptual rigor.
That said.... It seems to me that this range of terms is inadequate. One may agree that Bush et al. are profoundly indifferent to verifiable truth without concluding that the Frankfurt category necessarily applies.
Per G. A. Cohen’s analysis in “Deeper into Bullshit,” we must stress that Frankfurt’s model rests on a particular understanding of the consciousness of the liar. The mind of the bullshitter is defined by contrast to this state. For the liar, (1) the contrast between truth and untruth is clearly discerned, and (2) that difference would be grasped by the person to whom the liar speaks. But the liar’s intentionality also includes (3) some specific and lucidly grasped advantage over the listener made possible by the act of lying.
By contrast, the bullshitter is vague on (1) and radically unconcerned with (2). There is more work to be done on the elements of relationship and efficacy indicated by (3). We lack a carefully argued account of bullshit’s effect on the bullshitee.
There is, however, another possible state of consciousness not adequately described by Frankfurt’s paper. What might be called “the true believer” is someone possessing an intense concern with truth.
But it is a Higher Truth, which the listener may not (indeed, probably cannot) grasp. The true believer is speaking a truth that somehow exceeds the understanding of the person hearing it.
During the Moscow Trials of the late 1930s, Stalin’s prosecutor lodged numerous charges against the accused that were, by normal standards, absurd. In many cases, the “evidence” could be shown to be false. But so much the worse for the facts, at least from the vantage point of the true believer. If you’ve ever known someone who got involved in EST or a multi-level marketing business, the same general principle applies. In each case, it is not quite accurate to say that the true believers are lying. Nor are they bullshitting, in the strictest sense, for they maintain a certain fidelity to the Higher Truth.
Similarly, it did not matter three years ago whether or not any evidence existed to link Saddam and Osama. To anyone possessing the Higher Truth, it was obvious that Iraq must be a training ground for Al Qaeda. And guess what? It is now. So why argue about it?
On a less world-historical scale, I see something interesting and apropos in Academe, the magazine of the American Association of University Professors. In the latest issue, David Horowitz makes clear that he is not a liar just because he told a national television audience something that he knew was not true.
(This item was brought to my attention by a friend who teaches in a state undergoing one of Horowitz’s ideological rectification campaigns. My guess is that he’d rather not be thanked by name.)
Here’s the story so far: In February, while the Ward Churchill debate was heating up, Horowitz appeared on Bill O’Reilly’s program. It came up that Horowitz, like Churchill, had been invited to lecture at Hamilton College at some point. But he was not, he said, “a speaker paid by and invited by the faculty.”
As we all know, university faculties are hotbeds of left-wing extremism. (Especially the business schools and engineering departments. And reports of how hotel-management students are forced to read speeches by Pol Pot are positively blood-curdling.) Anyway, whenever Horowitz appears on campus, it’s because some plucky youngsters invite him. He was at Hamilton because he had been asked by “the conservative kids.”
That came as a surprise to Maurice Isserman, a left-of-center historian who teaches at Hamilton College. When I saw him at a conference a few years ago, he seemed to have a little gray in his hair, and his last book, The Other American: The Life of Michael Harrington, was a biography of the founder of the Democratic Socialists of America. No doubt he’s been called all sorts of things over the years, but “conservative kid” is not one of them. And when Horowitz spoke at Hamilton a few years ago, it was as a guest lecturer in Isserman’s class on the 1960s.
As Isserman put it in the September/October issue of Academe: “Contrary to the impression he gave on "The O’Reilly Factor," Horowitz was, in fact, an official guest of Hamilton College in fall 2002, invited by a faculty member, introduced at his talk by the dean of the faculty, and generously compensated for his time.”
I will leave to you the pleasure and edification of watching Horowitz explain himself in the latest issue of Academe. But in short, he could not tell the truth because that would have been a lie, so he had to say something untrue in order to speak a Higher Truth.
My apologies for the pretzel-like twistiness of that paraphrase. It is all so much clearer in the original Newspeak: Thoughtcrime is doubleplus ungood.