Bait and Switch: The (Futile) Pursuit of the American Dream, published this week by Metropolitan Books, is a return to matters that Barbara Ehrenreich has written about in the past. And no, I don't just mean the world of economic hard knocks.
In obvious ways, the new book's narrative of trying to get a white-collar corporate job (say, as a public-relations person) is similar in method and tone to Nickel and Dimed (2001), her account of the lives of the working poor. Both are works of first-person reporting à la George Orwell's The Road to Wigan Pier -- treading the fine line between investigative journalism and participant-observer ethnography, with the occasional dash of satire thrown in.
But Ehrenreich's new book also revisits a world first explored in her early work on "the professional-managerial class" (often abbreviated as PMC). In papers written during the late 1970s with her first husband, John Ehrenreich, she worked out an exacting Marxist analysis of the PMC as "consisting of salaried mental workers who do not own the means of production" (hence aren't capitalists) but whose "major function in the social division of labor may be broadly described as the reproduction of capitalist culture and capitalist relations." Ehrenreich revisited the topic, in a more popular vein, with Fear of Falling: The Inner Life of the Middle Class (1989).
You don't hear any trace of sociological diction in Ehrenreich's latest book, in which she goes undercover as "Barbara Alexander," a homemaker with some work experience in writing and event-planning. (Alexander's resume is a more modest rewriting of Ehrenreich's own background as academic and journalist.) Her search for a new job puts her in competition with other casualties of downsizing and midlife unemployment. She spends her time reading Monster.com, not Louis Althusser.
But some of Ehrenreich's old theoretical concerns do pop up as she tries to land a gig on the lower rungs of the PMC hierarchy. More than a quarter century ago, she had written that the private life of the middle class "becomes too arduous to be lived in private: the inner life of the PMC must be continuously shaped, updated and revised by ... ever mounting numbers of experts." And so Barbara Alexander finds teams of "career consultants" ready to help her adjust her outlook to fit into the new corporate culture. How? Through the modern science of psychobabble.
After reviewing Bait and Switch for Newsday, I still had some questions about where the book fit into Ehrenreich's thinking. Happily, she was willing to answer them by e-mail.
Q: Nickel and Dimed has become a standard reading assignment for undergraduates over the past few years, and some of that audience must now be entering the white-collar job market you describe in Bait and Switch. Is there anything in the new book intended as guidance for readers who will be facing that reality?
A: I'd like to reach undergraduates with Bait and Switch before they decide on a business career. I'm haunted by the kid I met at Siena College, in N.Y., who told me he was really interested in psychology, but since that isn't "practical," he was going into marketing, which draws on psychology -- though, as this fellow sadly admitted, only for the purpose of manipulating people. Or the gal I met at University of Oregon who wants to be a journalist but is drifting toward PR so she can make a living.
Right now, business is the most popular undergraduate major in America, largely because young people believe it will lead to wealth or at least security. I want them to rethink that decision, or at least do some hard thinking about the uses to which they would like to apply their business skills.
There's not much by way of individual guidance in Bait and Switch, but I do want to get people thinking more about corporate domination, not only of the economy, but of our psyches. Generally speaking, the corporations have us by the short hairs wherever you look, and of course, one source of their grip is the idea that they are the only or the major source of jobs. I'm asking, what kind of jobs -- back-breaking low-wage jobs as in Nickel and Dimed, or transient, better-paid jobs that seem to depend heavily on one's ability to be a suck-up, as in Bait and Switch?
Q: The pages in Bait and Switch devoted to New Age-inflected business-speak are quite funny -- but in an angry way. How much do you think people really buy into this ideology? Do they take it seriously? Or is it just something you have to repeat, to be part of the tribe?
A: Well, someone must believe it, or there wouldn't be any market for all the business advice books spewed out by career coaches and management gurus. I had the impression that the job seekers I was mingling with usually thought they should believe it all, or at least should act as if they believe it all. There certainly seems to be a lot of fear of being different or standing out in any way.
Q: What's the relationship between the world you are describing in the new book and that of the professional-managerial class? Are business professionals fully fledged members of the PMC? Or are they clueless and self-deluding mimics of it? All of the above?
A: Sure, they're bona fide members of the PMC as John Ehrenreich and I defined it in the 70s; they are college-educated and they command others or at least determine the work that others will do. But your question makes me think that an update on the PMC is long overdue.
In the late 80s, when I wrote Fear of Falling, it looked like the part of the PMC employed as corporate operatives was doing pretty well compared to the more academic and intellectual end of the PMC, which was beginning to get battered by HMOs (in the case of physicians), budget cuts (in the case of college professors, social workers, and others), etc.
Starting in the late 80s, though -- and insufficiently noted by me at the time -- the corporate operative-types began to lose whatever purchase they had on stability. First there were the mergers and acquisitions of the 80s, which inevitably led to white collar job loss; then there was the downsizing of the 90s; and now of course the outsourcing of many business-professional functions. So no one is safe.
Q: Do people in this sphere have any way to win a degree of real control over their economic condition? If they don't have some regulation of the market for their labor via certification (i.e. real professionalization) and they find it unimaginable to be unionized, does that leave them any options?
A: No. As a blue collar union friend of mine commented: They bought the line, they never had any concept of solidarity, and now they're sunk.
Q: In reporting this book, you created an alter ego, "Barbara Alexander," who is not the same person as Barbara Ehrenreich. But she's not totally different, either. There is a degree of overlap in age, background, work experience, etc. The job search proves fairly humiliating for Barbara Alexander. Was it hard to keep some distance from the role? It felt like she might explode a few times.
A: Remember, "Barbara Alexander" was just my cover; I only distanced myself enough to be a fairly low-key observer/reporter. Hence no tantrums or crazed rants. So yes, a certain amount of self-control was necessary, and it did take its toll. I often felt extremely soiled, compromised and generally yucky about the whole venture.
By which I don't mean I'm too pure to be involved in the great corporate money-making machine (my books, after all, are published by a large corporation and I happily accept my royalties) but that I was trying to act like someone I'm not and that I suspect very few people are, i.e., the endlessly upbeat, compliant, do-with-me-what-you-will corporate employee.
Q: Some aspects of the labor market you describe in Bait and Switch sound comparable to trends emerging in parts of academe. Any thoughts on that score? Have you considered writing, say, Ivy and Adjunct?
A: You want me to go undercover as an adjunct? No way. First, I've been an adjunct, years ago, at both NYU and the College of New Rochelle, and I understand the pay hasn't improved since then. So sorry, that option is no more enticing than another stint at Wal-Mart.
Someone should write about it though. The condition of adjuncts, who provide the bulk of higher ed in this country, is an absolute scandal. I've met adjuncts who moonlight as maids and waitresses, and I've read about homeless ones. If the right is so worried about the academy being too left wing, they should do something about the treatment of adjuncts (and many junior faculty). There's something about hunger that has a way of turning people to the left.
My ambition to write a musical about the arrival of Lacanian theory in Tito-era Yugoslavia has always hinged on the zestiness of the intended title: Žižek! The music would be performed, of course, by Laibach, those lords of industrial-strength irony; and the moment of psychoanalytic breakthrough that Lacan called la Passe would be conveyed via an interpretative dance, to be performed by a high-stepping chorus of Slovenian Rockettes.
Alas, it was all a dream. (Either that, or a symptom.) The funding never came through, and now Astra Taylor has laid claim to the title for her documentary, shown recently at the Toronto Film Festival.
Žižek! is distributed by Zeitgeist, which also released the film Derrida. The company provided a screener DVD of Žižek! that I've now watched twice -- probably the minimum number of times necessary to appreciate the intelligence and style of Taylor's work. The director is 25 years old; this is her first documentary.
It's not just her willingness to let Slavoj Žižek be Slavoj Žižek -- responding bitterly to an orthodox deconstructionist in the audience at a lecture at Columbia University, for example, or revisiting some familiar elements of his early work on the theory of ideology. Nor is it even her willingness to risk trying to popularize the unpopularizable. The film ventures into an account of Žižek's claim of a parallel between Marx's concept of surplus value and Lacan's "objet petit a." (This is illustrated, you may be relieved to know, via a cartoon involving bottles of Coke.)
Beyond all that, Žižek! is very smart as a film. How it moves from scene to scene -- the playful, yet coherent and even intricate relationship between structure and substance -- rewards more than one viewing.
In an e-mail conversation with Taylor, I mentioned how surprising it was that Žižek! actually engaged with his theory. It would be much easier, after all, just to treat him as one wacky dude -- not that Žižek quite avoids typecasting himself.
"I wanted very much to make a film about ideas," she told me. "That said, I think the film betrays a certain fascination with Å½iÅ¾ek's personality. He's got this excess of character and charisma that can't be restrained, even when we would try to do an interview about 'pure theory.'"
Žižek! isn't a biography. (For that, you're probably better off reading Robert Boynton's profile from Lingua Franca some years ago.) Taylor says she started work with only a hazy sense of what she wanted the documentary to do -- but with some definite ideas about things she wanted to avoid. "I didn't want to make a conventional biopic," she recalls, "tracing an individual's trajectory from childhood, complete with old photographs, etc. It's not even that I have anything against that form in particular, it just didn't seem the right approach for a film about Žižek."
Her other rule was to avoid pretentiousness. "Especially when dealing in theory, which has quite a bad name on this front, one has to be careful," she says. "I decided to veer towards the irreverent instead of the reverential. Granted, this is fairly easy when you're working with Slavoj Žižek."
Fair enough: This is the man who once explained the distinctions between German philosophy, English political economy, and the French Revolution by reference to each nation's toilet design. (Žižek runs through this analysis in the film; it also appeared last year in an article in The London Review of Books.)
Just to be on the safe side, Taylor also avoided having talking heads on screen "instructing the audience in what to think about Žižek or how to interpret his work." The viewer sees Žižek interact with people at public events, including both an enormous left-wing conference in Buenos Aires and a rather more tragically hip one in New York. But all explanations of his ideas come straight from the source.
In preparing to shoot the film, Taylor says she came up with a dozen pages of questions for Žižek, but only ended up asking two or three of them. Having interviewed him by phone a couple of years ago, I knew exactly what she meant. You pose a question. Žižek then takes it wherever he wants to go at the moment. The trip is usually interesting, but never short.
One of the funniest moments in Žižek! is a video clip from a broadcast of a political debate from 1990, when he ran for president of Slovenia as the candidate of the Liberal Democratic Party. At one point, an old Communist bureaucrat says, "Okay, Žižek, we all know your IQ is twice that of everybody else here put together. But please, please let somebody else talk!"
Taylor says she soon realized that her role was less that of interviewer than traffic director, "giving positive or negative feedback, telling him when to stop or when he'd said enough, and directing the flow of the conversation as opposed to conducting a straightforward interview with stops and starts."
She kept a log throughout the various shoots, "summing up everything he said in what would eventually be a one hundred page Excel spreadsheet. That way, I knew what subjects had been addressed, in what setting, and if the material was useful or needed to be reshot." About halfway through the production, she and Laura Hanna, the film's editor, assembled a rough cut.
"At that point," Taylor recalls, "I began to choose various passages for the animated sequences. I knew there needed to be some recurring themes and a broader theoretical argument to underpin the film.... But that makes it sound too easy and rational. The majority of choices were more intuitive, especially at the beginning when we were trying to cut down eighty hours of raw footage. When you're editing a film it is as much about what feels right, what flows, as what makes sense logically."
One really inspired moment came when Taylor learned of Jacques Lacan's appearance on French educational television in the early 1970s. She obtained a copy of the program and sat down with Å½iÅ¾ek in his apartment to watch it.
The transcript of Lacan's enigmatic performance is available as the book Television: A Challenge to the Psychoanalytic Establishment (Norton, 1991). But to get the full effect, you really have to see Lacan in action: Self-consciously inscrutable, yet also suave, he utters short and gnomic sentences, looking for all the world like Count Dracula ready for a nap after a good meal.
The contrast with the stocky and plebeian Žižek (a bundle of energy and nervous tics) is remarkable; and so is the highly ambivalent way he responds to hearing his Master's voice. Žižek takes pride in being called a dogmatic Lacanian. But the video clearly bothers him.
"I think Å½iÅ¾ek reacts to the footage on different registers at once," as Taylor puts it, "which is what makes the scene so interesting. He's obviously disturbed by Lacan's delivery, which seems very staged and pompous. Yet he attempts to salvage the situation by discussing how the very idea of a 'true self' is ideological or by arguing that the substance of Lacan's work should not be judged by his style."
The scene is also plenty meta. We are watching footage in which the most psychoanalytic of philosophers watches a video of the most philosophical of psychoanalysts. And yet somehow it does not feel the least bit contrived. If anything, there is something almost voyeuristically fascinating about it.
Taylor told me that the sequence "evokes what I see as one of the film's central themes: the predicament of the public intellectual today, and Žižek's strategies for coping with it."
Early in the documentary -- and again at the end -- he denounces the fascination with him as an individual, insisting that the only thing that matters is his theoretical work. He gives a list of what he regards as his four really important books: The Sublime Object of Ideology, For They Know Not What They Do, The Ticklish Subject, and a work now in progress that he has provisionally titled The Parallax View (a.k.a. the sequel to Ticklish).
There is a clear hint that his other and more popular books are negligible by contrast; he speaks of wanting to kill his doppelganger, the wild-and-crazy guy known for obscene jokes and pop-culture riffs.
"And yet," as Taylor notes, "Å½iÅ¾ek, despite his frustrations, continues to put on a good show, albeit one quite different in demeanor from Lacan's." That is what makes the final images of Å½iÅ¾ek! so interesting.
I don't want to give the surprise ending away. Suffice it to say that it involves a spiral staircase, and makes explicit reference to Vertigo, Alfred Hitchcock's great meditation on Freud's Beyond the Pleasure Principle. (Whether or not Hitchcock ever actually read Freud is sort of beside the point, here.) The scene also harkens back to earlier comments by Å½iÅ¾ek -- and yet it really comes out of left field.
Taylor says they improvised it at the very last moment of shooting. She calls the scene "fantastically head-scratching," and not just for the audience.
“Over the last few months,” she says, “I have come up with all sorts of pseudo-theoretical justifications and interpretations of it, all the different layers of meaning and resonances with Žižek’s work and life and the intersections of the two. But all of these, I must admit, were created after the fact (après coup, as Lacan would say).”
So what are her theories? "I feel like I would be ruining the fun if I elaborated on them," she told me. "That is, after all, precisely what people are supposed to debate over a beer after seeing the movie."
For more on Žižek! -- including information about its availability and a clip from the film -- check out its Web site.
If not for the recent online buzz about whether or not President Bush has resumed drinking, most of us never would have heard the allegations. The story was, after all, broken by The National Enquirer – a paper not taken in this household, you may be sure. We are loyal to the Weekly World News instead.
The cover of the Enquirer is always full of the faces and first names of celebrities, very few of which I recognize -- while the reporters at the News do the kind of hard journalistic digging needed to reveal, for example, Saddam Hussein’s efforts to clone dinosaurs for use as weapons of mass destruction. Some years ago, there was an off-Broadway musical inspired by WWN coverage of the amazing saga of the half-human Bat Boy. I’m always keen to read updates about that brave little guy.
But a scoop is a scoop. More interesting than the Enquirer story itself has been the response to it -- not just its prime spot in Slate’s roundup of trash news, but the loud blog feedback, followed by the metacommentary by Jonathan Dresner at Cliopatria, which was remarkably sober. (Didn't see that one coming, did you?)
A decade has passed since the earliest syllabus was prepared for a course called “Tabloid Culture.” Now it’s a regular area of scholarly specialization (with conferences), so it’s hard to know how anybody keeps up with all the secondary literature, let alone the two-headed alien babies.
As it happens, the first paper on cultural studies by an American academic I ever came across -- more than 20 years ago, in fact -- was a pioneering study in the field of tabloid hermeneutics. Stephanie Greenhill presented “The National Enquirer: A Secret Method for the Mastery of Life” at the Southwest Graduate Student Conference in Comparative Literature, held in March 1982 at the University of Texas at Austin. The proceedings were published the following year by UT’s Comp Lit program -- using what appears to have been a very, very early desktop publishing program. The volume provides no information about the contributors. Nor is there any trace of Stephanie Greenhill’s subsequent career as a scholar available online. [See update below.]
But a vague memory of her argument has been at the back of my mind ever since the current Enquirer story began pinging around the blogosphere. It took some digging, but I’ve located my copy of the proceedings and reread Greenhill’s paper. And so, in the spirit of honoring a forgotten pioneer, here is a precis of her work, and an application of it to interpreting “Bush’s Booze Crisis.”
My recollection had it that Greenhill must have been one of the first American academics to draw on the first generation of cultural-studies scholars – the early theoretical work of Stuart Hall and others at the Birmingham Centre for Contemporary Cultural Studies in England. (The centre closed three years ago.) But in fact, rereading her paper now, I see that Greenhill was actually looking at the tabloid from within a completely different framework: that of folklore.
An interesting rewriting of things, given the subsequent fate of each discipline over the following two decades. By the 1990s, the American version of cultural studies was on the rise, while folklore programs were shutting down. If the stereotype had it that someone with a background in cultural studies wore hipster eyeglasses and a complicated haircut, the other field had a much less flattering icon: namely, the Comic Book Store Guy on "The Simpsons," responding to one of Bart’s pranks by saying, “I do not deserve this! I have a Ph.D. in folklore and mythology!”
With understandable frustration, some folklorists have insisted for years that they were doing cultural studies long before anybody thought to call it that. And rereading Greenhill’s paper after all these years, I’m inclined to think they have a case. Her analysis stresses how the Enquirer -- which is, arguably, as debased a piece of mass-produced junk as ever issued by a printing press -- actually replicates some features we associate with oral or traditional forms of culture.
Not that you’d notice it right away, of course. “It can be a very disturbing experience to read the Enquirer,” she writes. “The physical layout encourages the feeling of alienation. One’s eyes are forced to search up and down in order to find everything on the page. One cannot even look only at headline-sized materials to get an overview; there are a number of single-line quotations which force the eye constantly to refocus. Perhaps it is this format, rather than the content, which is the source of the subjective sense that the Enquirer is a fragmenting rather than a communal force.”
But the ads, and perhaps especially the articles, recycle many of the basic themes found in folklore. “Collections such as Flanders and Brown’s Folk-Songs from Vermont,” notes Greenhill, “deal with many of the subjects equally beloved of the Enquirer: illicit love, the bizarre, violence, death, satire, and religion.”
Furthermore, many of the stories in the tabloid’s pages lend themselves to exactly the kind of structuralist analysis that Claude Lévi-Strauss performed on myths gathered by anthropologists. In short, they are efforts to resolve binary oppositions such as that between nature and culture, male and female, life and death.
Consider, if you will, “Miracle Baby: Love Overcomes Incredible Odds for Paralyzed Wife and Her Gentle Giant” -- recounting how a very tall bodybuilder and his very small, paralyzed wife created their happy family. Their small child is, as Greenhill puts it, “obviously the symbolic synthesis of the two.... By creating a balance between these opposites, a state of normalcy will result.”
The classical balance typical of the Enquirer may explain my own preference for the rather more surreal landscape of the Weekly World News, in which normality is constantly menaced by (for example) self-replacing androids who “breed like flies.” And it was WWN that revealed that undergraduates aren’t the only ones spending all semester on drinking binges. So do 8 out of 10 of their professors!
And now our Commander in Chief is staggering down the same path. Or so we are told by the Lévi-Straussian structuralists at the Enquirer. I think Greenhill’s paper helps clarify some things about the response to this news (if that’s what it is) -- and, in fact, elucidates some things happening between the lines of the story itself.
Her analysis emphasizes that a folkloric work (song, legend, tabloid) serves to help hold a community together. And it can play that role whether or not everyone in the community quite believes it to be literally true. That point has been made more recently, with great force, by the Pennsylvania State University-Hazleton folklorist Bill Ellis, whose papers in the book Aliens, Ghosts, and Cults: Legends We Live (University Press of Mississippi, 2001) deserve to be better known.
To sum up the point as Ellis makes it: The important thing to understand about any form of contemporary folklore (for example, the urban legends constantly making the rounds of e-mail and conversation) is that the debate over its truth or falsity is part of how it circulates. Such folklore helps define the limits of what a community believes. Or rather, what it hopes or fears might be believable.
That certainly applies to the online conversation over “Bush’s Booze Crisis” – in which some very fine hairs have been split over the epistemological question of whether something might be true even if it’s in the Enquirer.
That story can be read as a criticism of the President, in keeping with a populist distrust of power that Greenhill finds operating throughout the tabloid. But it is also, at the same time, a negation of the image of him that has emerged in recent months as someone utterly out of touch with the news about Iraq and Katrina. It shows him, rather, as wounded to the core. What might look like ignorance and indifference are actually the signs that awesome responsibility has left him in unimaginable pain. He is human, all too human.
Does this have any political consequence at all, in the real world? I don’t know. But it does tend to confirm the basic insightfulness of Greenhill’s paper – tucked away in a scholarly publication now forgotten, probably, by everyone except the contributors. And maybe even by them. The final line of her essay sums it up nicely: “The attractiveness of the Enquirer could be that its readers can pick and choose both their tales and their morals from a certain range of possibilities, and know that others are doing the same thing.”
UPDATE: The mystery regarding the fate of the author of the 1982 paper has now been cleared up -- for it turns out that her name was actually Pauline (not Stephanie) Greenhill. I regret the error, without being quite responsible for it. As luck would have it, a glitch from 23 years ago has come back to haunt us.
In an e-mail note, Ms. Greenhill explains that she was indeed the folklore grad student at the University of Texas at Austin "who wrote the article on the National Enquirer, published in the student conference proceedings so many years ago." But she wasn't actually there to present her paper. "My friend STEPHANIE Kane gave it on my behalf, and somehow the editors mixed up our two names.... The conference organisers did put in an errata sheet correcting the mistake, but we all know what happens to those sorts of things."
Today "the not very mysterious" Pauline Greenhill, as she signs herself, is a professor of women's studies at the University of Winnipeg. A list of her scholarly publications since 1993 is available at this Web page.
On Thursday, by appointment, this column makes its rendezvous with the most recent output of the book trade, with university press offerings a speciality -- even though many of the catalogues crossing my desk seem all but divested of scholarly titles. More and more of them seem filled instead with listings for regional cookbooks, detective novels, and photographic albums devoted to the flowers visible from the bicycle trails of state parks, suitable for purchase in the gift shop.
Is this an exaggeration? No, friends, it is not. Serious books do, of course, appear; right now, I am reading several at one go, each of them at least 500 pages long. (No doubt, as Kant said, it was too much work to write a short one.) But even these ponderous tomes are sometimes manifestly worse for wear -- victims of the (rolling and interminable) crisis in academic publishing.
One book from last year is sinking ever-lower in my must-read pile simply because there has not been time to set up an appointment with my oculist: It was set in type just slightly larger than that used to list the side-effects of a new drug. The editor of the volume in question says he was startled when he first saw it. And the readers, if any, feel his pain -- a sharp throbbing sensation, after about 20 minutes.
Easier on the eyes, but no less appalling, are the latest reports from the Library and Information Statistics Unit at Loughborough University, in the UK. Every six months, LISU crunches the numbers regarding British and American academic book prices. Since 1987, the statisticians have been compiling and analyzing the prices of books in 64 subject categories that are "closely relevant to acquisitions librarians’ needs." Categories include law, engineering, medicine (both human and veterinary), the various arts, and several branches each of the humanities and the social sciences.
Thanks to LISU's analysis of how prices are varying, librarians in charge of research collections have some sense of how to plan their budgets. Claire Creaser, the deputy director and senior statistician for LISU, has kindly provided me with the latest reports, covering the first six months of this year. (A more extensive presentation of the material is available for sale in CD-ROM format, including masses of data in Excel spreadsheets, if that sort of thing does not terrify you.)
First, the bad news: From 2004 to 2005, the overall average price of an academic book from an American publisher rose 2.2 percent. "This compares to a slight fall in prices for UK books over the same period," as LISU notes, "and continues the recent trend for prices to rise rather more rapidly in the US than the UK." Indeed, the report focusing on British prices notes that they have gone down 4.8 percent over the past five years. The corresponding table for American academic titles shows prices increasing by 35 percent since 1999-2000.
Then, the rest of the bad news: "There is no consistency or pattern in the half-yearly price changes [for US titles] over recent years, which can only make budgets more difficult for librarians." Prices in a few areas have gone down. (The 761 philosophy titles from American presses between January and June were 2 percent cheaper than the previous lot.) But the vast majority of subject categories have shown an increase in price. A graph covering the past two decades shows that, sometime around mid-1996, the average cost of academic books began to shoot ahead of the retail price index. The gap between them is now wider than at any point in LISU’s record.
The figures are, in short, what you’d figure: Not only are scholarly books getting more expensive, but most of them are growing more expensive faster than other commodities. I asked Claire Creaser if her associates had tried to extrapolate from their (by now, enormous) data set. She declined, saying, "We do not analyze trends in any detail, and do not make forecasts." But she did point to a summary of budget trends for UK libraries. Up to a third of the budget for a British academic library may come from US publishers -- a reminder that the increasing price of our scholarly exports does have a global effect.
Meanwhile, closer to home, Intellectual Affairs faces the continuing balancing act of determining how to budget the scarcest of resources -- namely, reading time. It is a common enough problem, of course. But for a journalist, there is the added complication known as the university press publicist -- though, come to think of it, scholars encounter them as well, usually while roaming the exhibit hall at a convention.
Now, in my line of work, eavesdropping on such conversations sometimes actually counts as research. (The discussion may possess anthropological significance.) But actually having a face-to-face with a university press publicist is very often a maddening thing.
A handful of them are genuine book people. They are in love with a few of the new titles, reasonably well informed about the rest, and smart enough to have established sound lines of communication with the acquisitions editors -- so that, while talking to them, you just might find out what the press’s big titles for 2008 will be. Such publicists are rare.
As for the rest.... If you ask what the most important, interesting, or otherwise eyestrain-worthy things their press has to offer might be, they will pull out the catalog and -- please understand that I am not making this up -- begin to read it aloud to you, sometimes in a manner suggesting that this is the first time they have had occasion to pay so much attention to it.
The tedium of it for everyone is heartbreaking. The molars grind. The mind wanders. Mine, actually, finds itself trapped in an audiovisual center where someone is screening a 1974 video of Leonid Brezhnev giving a four-hour speech on the need to increase oatmeal production. Was the man happy? Was his audience? And just how many more pages are there in this catalog, anyway?
Isn't that a bit harsh? Aren't publicists rather low on the chain of command, the service workers of the publishing world? Yes, and yes, respectively. But the scene just described is (for all its satirical embellishment) a familiar one, and it does no good for anybody -- not for the press, not for its authors, and certainly not for the larger public.
Nor are there grounds for thinking the arrangement will correct itself. I don't intend to be mean-spirited about it -- but perhaps a frank expression of irritation and dismay would be better than pretending that the lackadaisical status quo has been anything but ridiculous.
It might be as simple as making sure that editors spend more time with publicists, and encouraging them to read some of the annual output. For that matter, people new to the publicist's trade could be encouraged to talk with the authors. My sense is that really good publicists do all of this anyway, just for the pleasure of it. But that doesn't mean the skills and habits can't be taught. It would be to everyone's advantage if they were.
There must be a better way. And one possibility that has emerged comes to my attention via Colleen Lanick, the publicity manager for MIT Press, who is one of the good ones. Earlier this week, she pointed out the new MIT PressLog and a similar blog run by Oxford University Press.
To find out if any other academic publishers had climbed aboard this particular bandwagon, I contacted Brenna McLaughlin, the communications manager for the Association of American University Presses. She mentioned the one maintained by Cork University Press, in Ireland. But otherwise, Oxford and MIT seem to be in the vanguard -- though McLaughlin says that AAUP itself now has a restricted-access blog for its members under development.
An interesting development, then -- if only for the timing, given the recent wave of anguish over the danger that a reputation for blogging might pose to an academic on the job market. (The rather hysterical tone prevailing in some quarters calls to mind what sociologists call "moral panic.") But the iron cage of bureaucracy is, after all, a strange thing: Today, there are timid souls who worry that a prospective colleague's blog might be a record of torrid threesomes indulged while plotting to assassinate the dean. Tomorrow they will be retired, or laughed off campus -- whereupon blogging might well become mandatory, rather than forbidden. Stranger things do happen.
With the shrinking space for coverage of books in the mainstream media, it’s understandable that academic presses might seize on the blog form as a venue to push their product. "We've been tracking the blog activities of our authors and various postings about our books for some time," Colleen Lanick of MIT told me. "So we thought it might be an interesting experiment to try this out in some form, ourselves.... We are encouraging our authors to write original pieces for the log about how their work relates to current events."
Sanford Thatcher, the director of Penn State University Press, sounds a lot more skeptical of the idea of blogs run by academic publishers. Joining us on the conference call was Tony Sanfillipo, the marketing and sales director for the PSU Press -- who, like Thatcher, is now in the midst of dealing with the Google Print Libraries Project. They aren’t technophobes, but that doesn’t make them blog boosters.
"When people think of blogging," said Thatcher, "they think for the most part of political blogs, of argument. The Oxford and MIT blogs look very, very commercial. Wrigley Spearmint Gum has its own blog, if I remember correctly, but I’m just not sure it makes commercial sense." He suggests that the Books for Understanding project, sponsored by the AAUP, might be a better example of how academic publishers can make the public aware of their titles. The project offers bibliographies of university-press titles that are relevant to current events.
"The blog entries [for MIT and Oxford] don’t look that different from catalog copy to me," said Sanfillipo. "They don’t really engage the reader the way a blog does. You don’t see any trackbacks or comments for the entries, or at least not many."
A fair point, though the sites are new and very much under development. Brenna McLaughlin from the Association of American University Presses says that the phenomenon is in its infancy. "It’s having its time," she said. "We’re just starting to realize the need to reimagine traditional publicity and communications work. That means you have to face the staffing issue: How much time do people have for this at a small press? But I do believe it will become more common."
With any luck, the chance to "reimagine traditional publicity and communications work" will include a careful evaluation of the tendency to read university press catalog copy out loud to functionally literate adults. (If so, I’m all for it.) Not that audio doesn’t have its place. Colleen Lanick mentions that MIT Press is now "working on developing a podcasting feature for the log where we can conduct interviews with authors and they can read segments of their books."
For a moment, you can envision a world in which more people become interested in academic press titles -- so they begin to sell better, and maybe the prices go down a bit, since the publishers don’t have so many copies returned from Borders.... And then comes news of a study in England of taxi drivers, pub landlords and hairdressers -- "often seen as barometers of popular trends" -- which found that nearly 90 percent had no idea what a podcast is and more than 70 percent had never heard of blogging. When asked about the latter term, many people thought the questioner was referring to "dogging" instead.
“Dogging,” as the Reuters news agency explains, “is the phenomenon of watching couples have sex in semi-secluded places such as out-of-town car parks. News of such events are often spread on Web sites or by using mobile phone text messages.”
Moral panic or no, you probably won’t get hired if the faculty search committee thinks you are dogging.
Two years ago, The Virginia Quarterly Review published an essay called "Quarterlies and the Future of Reading." The author, George Core, has been the editor of The Sewanee Review since 1973 -- at which time, it was already a venerable institution, one of the oldest publications of its kind in the United States.
By "publications of its kind," I mean the general-interest cultural quarterlies, usually published by universities or liberal-arts colleges. Unlike scholarly journals, they aren't focused on a particular field. Usually they offer a mixture of contemporary poetry and fiction with essays that are learned but nonspecialist.
It pays to be explicit about that, because so few people keep track of the university quarterlies now. Many of them are still around, with a modest if reliable subscriber base in the libraries. But it often seems as if they continue just by inertia.
It is always a sentimental gesture to speak of a golden age. But what the hell: The years between, say, 1925 and 1965 were a glorious time for such journals. Then, as now, their circulations were usually modest. But the worlds of publishing and of the university were smaller, and the quarterlies had a disproportionately large role to play. Truman Capote once mentioned in an interview how he knew he had "arrived" as a young writer in the 1940s: On the same day, he received two or three letters from the editors of quarterlies accepting his short stories for publication in their pages.
In his essay, two years ago, Core insisted that "the literary quarterly ... has been the linchpin of civilization since the 18th century." And if things did not look encouraging at the dawn of the 21st century ... well, so much the worse for what that says about civilization. "The average librarian these days, like many members of various departments in the humanities," wrote Core, "has become hostile to books and hostile to reading." Needless to say, technology is to blame.
It is hard to know what to make of the fact that Core's essay is now available online. I mean, sure, you can read it that way, but would he really want that?
But cantankerousness in defense of the quarterly is no vice. Nor, for that matter, is it a virtue to overstate how much the university-based general-interest periodical has declined. The situation may not be good, but it is not quite catastrophic.
Core's Sewanee Review is hopelessly out of touch with many trends in contemporary literary studies -- which is one reason it is still worth reading. But The Minnesota Review is very much in touch with developments in cultural theory, while also publishing a good deal of poetry, fiction, and personal essays. I've been reading it with interest for 25 years now (during which time it has never actually been published or edited in Minnesota). The Common Review, published by the Great Books Foundation, occupies a niche somewhere between the old-fashioned university quarterly and magazines such as Harper's and The Atlantic Monthly.
The list could go on -- and if it did, would have to include such non-academic, sui generis publications as N+1, which I've been urging upon the attention of startled bystanders ever since seeing the prototype pamphlet that appeared in advance of its debut, not quite two years ago. A third issue is now at newsstands. (See also the magazine's Web site.)
And if you want to see some interesting and successful experiments in updating the whole format, keep an eye out for Boston Review and The Virginia Quarterly Review.
The September/October issue of Boston Review marks its 30th anniversary. Coming out six times a year, and as a tabloid, BR might at first seem to bear little resemblance to the university quarterlies of any era. But those differences are superficial. The mixture of political, philosophical, and literary discussion calls to mind the early Partisan Review, the most agenda-setting of the quarterlies published at mid-century.
Boston Review's anniversary issue contains a 12-page anthology of poems and essays organized by year -- including work by (to give a partial list) John Kenneth Galbraith, Adrienne Rich, Rita Dove, Ralph Nader, George Scialabba, and Martha Nussbaum. (Somebody on that list is bound to interest or agitate you.) One of the selections for 1993 comes from an essay by Christopher Hitchens called "Never Trust Imperialists." It would have been interesting to hear the editorial discussion that resulted in that one being included.
As for The Virginia Quarterly Review, it has come under a new editor, Ted Genoways, who seems to have ignored entirely the worries expressed in George Core's ruminations on the state of the quarterly. In its new incarnation, VQR is colorful, topical, even a bit flashy -- with the latest issue offering a gallery of photographs from Vietnam as well as a pull-out comic book by Art Spiegelman, along with poetry, fiction, a play, and several essays.
Some of the once-vital quarterlies ended up becoming so polite and reserved that their audiences did not so much read them as search each issue for signs of a pulse. You certainly don't have that problem with Boston Review or VQR. Recent discussion of standards for "public scholarship" has emphasized the possibility of creating venues "at the interface of campus and community."
Arguably, that is what the quarterlies and reviews always were -- and their revitalization now can only be a good sign.
It is disagreeable to approach the cashier with a book called How to Read Hitler. One way to take the stink off would be to purchase one or two other volumes in the new How to Read series published by W. W. Norton, which also includes short guides to Shakespeare, Nietzsche, Freud, and Wittgenstein. But standing in line at a neighborhood bookstore a couple of weeks ago, I wasn't aware of those other titles. (The only thing mitigating the embarrassment was knowing that my days as a skinhead, albeit a non-Nazi one, are long over.) And anyway, the appearance of Adolf Hitler in such distinguished literary and philosophical company raises more troubling questions than it resolves.
"Intent on letting the reader experience the pleasures and intellectual stimulation in reading classic authors," according to the back cover, "the How to Read series will facilitate and enrich your understanding of texts vital to the canon." The series editor is Simon Critchley, a professor of philosophy at the New School in New York City, who looms ever larger as the guy capable of defending poststructuralist thought from its naysayers. Furthermore, he's sharp and lucid about it, in ways that might just persuade those naysayers to read Derrida before denouncing him. (Yeah, that'll happen.)
Somehow it is not that difficult to imagine members of the National Association of Scholars waving around the How to Read paperbacks during Congressional hearings, wildly indignant at Critchley's implicit equation of Shakespeare and Hitler as "classic authors" who are "vital to the canon."
False alarm! Sure, the appearance of the Fuhrer alongside the Bard is a bit of a provocation. But Neil Gregor, the author of How to Read Hitler, is a professor of modern German history at the University of Southampton, and under no illusions about the Fuhrer's originality as a thinker or competence as a writer.
About Mein Kampf, Gregor notes that there is "an unmistakably 'stream of consciousness' quality to the writing, which does not appear to have undergone even the most basic editing, let alone anything like polishing." Although Gregor does not mention it, the title Hitler originally gave to the book reveals his weakness for the turgid and the pompous: Four and a Half Years of Struggle against Lies, Stupidity and Cowardice. (The much snappier My Struggle was his publisher's suggestion.)
Incompetent writers make history, too. And learning to read them is not that easy. The fact that Hitler had ideas, rather than just obsessions, is disobliging to consider. Many of the themes and images in his writing reflect an immersion in the fringe literature of his day -- the large body of ephemeral material analyzed by Fritz Stern in his classic study The Politics of Cultural Despair: The Rise of the Germanic Ideology.
But Gregor for the most part ignores this influence on Hitler. He emphasizes, instead, the elements of Hitler's thinking that were, in their day, utterly mainstream. He could quote whole paragraphs of Carl von Clausewitz on strategy. And his racist world view drew out the most virulent consequences of the theories of Arthur de Gobineau and Houston Stewart Chamberlain. (While Hitler was dictating his memoirs in a prison following the Beer Hall Putsch, he could point with admiration to one effort to translate their doctrines into policy: the immigration restrictions imposed in the United States in the 1920s.)
Gregor's method is to select passages from Mein Kampf and from an untitled sequel, published posthumously as Hitler's Second Book. He then carefully unpacks them -- showing what else is going on within the text, beneath the level of readily paraphrasable content. With his political autobiography, Hitler was not just recycling the standard complaints of the extreme right, or indulging in Wagnerian arias of soapbox oratory. He was also competing with exponents of similar nationalist ideas. He wrote in order to establish himself as the (literally) commanding figure in the movement.
So there is an implicit dialogue going on, disguised as a rather bombastic monologue. "Long passages of Hitler's writings," as Gregor puts it, "take the form of an extended critique of the political decisions of the late nineteenth century.... Hitler reveals himself not only as a nationalist politician and racist thinker, but -- this is a central characteristic of fascist ideology -- as offering a vision of revitalization and rebirth following the perceived decay of the liberal era, whose failings he intends to overcome."
The means of that "overcoming" were, of course, murderous in practice. The vicious and nauseating imagery accompanying any mention of the Jews -- the obsessive way Hitler constantly returns to metaphors of disease, decay, and infestation -- is the first stage of a dehumanization that is itself an incipient act of terror. The genocidal implications of such language are clear enough. But Gregor is careful to distinguish between the racist stratum of Hitler's dogma (which was uncommonly virulent even compared to the "normal" anti-Semitism of his day) and the very widespread use of militarized imagery and rhetoric in German culture following World War I.
"Many of the anti-Semitic images in Hitler's writing can be found in, say, the work of Houston Stewart Chamberlain," writes Gregor. "Yet when reading Chamberlain's work we hardly sense that we are dealing with an advocate of murder. When reading Hitler, by contrast, we often do -- even before we have considered the detail of what he is discussing. This is because the message is not only to be found in the arguments of the text, but is embedded in the language itself."
How to Read Hitler is a compact book, and a work of "high popularization" rather than a monograph. The two short pages of recommended readings at the end are broad, pointing to works of general interest (for example, The Coming of the Third Reich by Richard Evans) rather than journal articles. It will find its way soon enough into high-school and undergraduate history classrooms -- not to mention the demimonde of "buffs" whose fascination with the Third Reich has kept the History Channel profitable over the years.
At the same time, Gregor's little book is an understated, but very effective, advertisement for the "cultural turn" in historical scholarship. It is an example, that is, of one way historians go about examining not just what documents tell us about the past, but how the language and assumptions of a text operated at the time. His presentation of this approach avoids grand displays of methodological intent. Instead the book just goes about its business -- very judiciously, I think.
But there is one omission that is bothersome. Perhaps it is just an oversight, or, more likely, a side effect of the barriers between disciplines. Either way, it is a great disservice that How to Read Hitler nowhere mentions the pioneering effort in English to analyze the language and inner logic of Mein Kampf -- the essay by Kenneth Burke called "The Rhetoric of Hitler's 'Battle,' " published in The Southern Review in 1939. (In keeping with my recent enthusing over the "golden age" of the academic literary quarterly, it is worth noting that the Review was published at Louisiana State University and edited by a professor there named Robert Penn Warren.)
Burke's essay was, at the time, an unusual experiment: An analysis of a political text using the tools of literary analysis that Burke had developed while studying Shakespeare and Coleridge. He had published the first translations of Thomas Mann's Death in Venice and of portions of Oswald Spengler's Decline of the West -- arguably a uniquely suitable preparation for the job of reading Hitler. And just as various German émigrés had tried to combine Marx and Freud in an effort to grasp "the mass psychology of fascism" (as Wilhelm Reich's title had it), so had Burke worked out his own combination of the two in a series of strange and brilliant writings published throughout the Depression.
But he kept all of that theoretical apparatus offstage, for the most part, in his long review-essay on a then-new translation of Mein Kampf. Instead, Burke read Hitler's narrative and imagery very closely -- showing how an "exasperating, even nauseating" book served to incite and inspire a mass movement.
This wasn't an abstract exercise. "Let us try," wrote Burke, "to discover what kind of 'medicine' this medicine man has concocted, that we may know, with greater accuracy, exactly what to guard against, if we are to forestall the concocting of similar medicine in America."
Burke's analysis is a tour de force. Revisiting it now, after Gregor's How to Read volume, it is striking how much they overlap in method and implication. In 1941, Burke reprinted it in his collection The Philosophy of Literary Form, which is now available from the University of California Press. You can also find it in a very useful anthology of Burke's writings called On Symbols and Society, which appears in the University of Chicago Press's series called "The Heritage of Sociology."
"Above all," wrote Burke in 1939, "I believe we must make it apparent that Hitler appeals by relying upon a bastardization of fundamentally religious patterns of thought. In this, if properly presented, there is no slight to religion. There is nothing in religion proper that requires a fascist state. There is much in religion, when misused, that does lead to a fascist state. There is a Latin proverb, Corruptio optimi pessima, 'the corruption of the best is the worst.' And it is the corruptors of religion who are a major menace to the world today, in giving the profound patterns of religious thought a crude and sinister distortion."
As a technology specialist working in the humanities, I sometimes find it difficult not to feel like a walking contradiction.
I'm accustomed, as an academic blogger of more than two years' standing, to near-instantaneous feedback on my writing and to networked conversations that can spring up overnight and disappear just as quickly. As an academic bound to certain research expectations, though, I work in conversational cycles that still move at the speed of print, unfolding in some cases over a period of years. Blogging is less about occupying a different space than about working to a different rhythm, a difference that can be difficult to explain to those who don't do it.
As a blogger, I subscribe to more than 100 RSS feeds, syndicated content from the blogs of friends and acquaintances, news sites, and filter sites for particular topics. As an academic, though, I struggle to keep current with a much smaller number of journals in my field (rhetoric and composition), and this even though most of them only publish two to four times a year. Undoubtedly, this is in part because of the difference between "work" reading and "play" reading; it's far easier (and usually more pleasant) to skim a handful of blog entries than it is to give sustained attention to a journal article, after all.
But it's more than that. Although the quantity and quality of writing that I read online almost certainly differs from the scholarly reading I do, I would argue that the biggest change is that I practice reading differently. And this is a truth that, traditionally, disciplines in the humanities have been slow to accept. We are still prone to thinking of technology as something added to what are already substantial professional duties, instead of conceiving of it as a way of approaching those duties differently.
I've had opportunity in recent months to reflect on my various reading practices. In the spring of 2005, I was named the associate editor of the flagship journal in my field (College Composition and Communication, or CCC) and given the task of rethinking and redesigning its Web site, a task that would potentially bring together these separate domains. Most of the journals in my field have Web sites, and their quality tends to vary even more widely than the journals themselves.
What almost all of these sites have in common, though, is their central mission, which is to mirror their respective print journals. In other words, most of these sites provide little content beyond what is already available in the journals themselves (and in some cases, much less or after a significant lag).
The fact of the matter is that I don't think of these mirror sites as part of my online reading. There are times when it is convenient to look up a piece of information online, but only rarely do I end up at a journal Web site when I do. Although I may not be typical in this regard, I use these sites so infrequently that they seem little more than a cursory obligation added to the workload of the journals' editors. As I began the task of rethinking CCC Online, then, one of my chief motivations was to practice what I've preached above, to conceive of the site as a way for my colleagues to experience the journal in different and (ideally) productive ways, rather than as a mere repository mirroring the print version.
All three members of the development team (myself and two research assistants) are active academic bloggers, and ultimately, it has been our experience with blogging, and with social software more generally, that has helped us in this process. In discussions of blogs, the tendency is to focus on content, on the identities of bloggers, and on the networking that occurs within and among these sites. In a more formal sense, though, blog platforms are simply content management systems, databases designed with an eye toward the kinds of publications that we now think of as blogs. Most important, perhaps, they are scaled for personal use.
In the case of CCC Online, we use Movable Type for the site's infrastructure, which allows us to duplicate a number of the features typical to large-scale databases. We provide a dedicated search engine, for instance, and can notify subscribers about updates to the site, over e-mail and RSS.
While we have taken advantage of some of the generic features of blogs, we haven't turned the journal's Web site into a blog. The problem we're addressing isn't a dearth of information, but an overload. With the publication of a new issue of CCC, we add an entry for each article, and that entry contains metadata about the article: abstract, keywords, bibliography, and a permalink for the article itself (which remains password-protected for subscribers only). At the same time, we are slowly adding similar data for the journal's back issues.
Building a centralized archive for the journal's metadata allows us to do a number of things:
It makes the contents of the journal available not only to search engines, but to bookmarking services like del.icio.us, CiteULike and others.
It allows us to include internal links to the articles, so that users can follow up on citations of other CCC articles.
Through the use of bi-directional trackback links, we also include "Works Citing," links to essays that have taken up (and cited) a particular article.
Finally, we use del.icio.us, a social bookmarking application, to offer users an associative interface for browsing the journal.
This last feature deserves a little more explanation. Providing a static page for each journal article allows our colleagues to bookmark those pages (and to tag them with their own keywords) in an application like del.icio.us. What we've done is to open our own del.icio.us account, and to bookmark each article, tagging them with both author-supplied and textually-generated keywords. The result is an associative network that goes beyond citation to connect articles on similar topics.
In the case of each of these features, our focus is not on adding new content, but rather on devising ways for users to approach and/or manage the content we are already generating as a discipline. The principle behind each of these features is the same: we've tried to focus site development around the question of how the site might be used instead of what information the site contains.
CCC Online is in many ways still a mirror site, but it's a mirror that can be manipulated in a variety of ways, offering our colleagues different perspectives on the journal's content, perspectives that are impossible to duplicate in print. We've worked to make the site as productive as possible, integrating more efficient management of the journal's content with opportunities for exploration and invention.
Depending on the discipline or profession where you find yourself, our site may seem either painfully obvious or distressingly opaque. For our colleagues in the humanities, though, we hope it serves as a model of what we might accomplish by turning to social software to inform (and improve) our practices as readers and writers.
In addition to managing CCC Online, Collin Brooke is an assistant professor of rhetoric and writing at Syracuse University, where he also directs the composition and cultural rhetoric doctoral program.
Jerome Karabel's The Chosen is the big meta-academic book of the season -- a scholarly epic reconstructing "the hidden history of admission and exclusion at Harvard, Yale, and Princeton," as the subtitle puts it. Karabel, who is a professor of sociology at the University of California at Berkeley, has fished documents out of the archive with a muckraking zeal worthy of an investigative journalist. And his book, published this month by Houghton Mifflin, is written in far brisker narrative prose than you might expect from somebody working in either sociology or education. That's not meant as a dis to those worthy fields. But in either, the emphasis on calibrating one's method does tend to make storytelling an afterthought.
For Karabel really does have a story to tell. The Chosen shows how the gentlemanly anti-Semitism of the early 20th century precipitated a deep shift in how the country's three most prestigious universities went about the self-appointed task of selecting and grooming an elite.
It is (every aspect of it, really) a touchy subject. The very title of the book is a kind of sucker-punch. It alludes, of course, to Jehovah's selection of the Jews as the Chosen People -- and it is also a term sometimes used, with a sarcastic tone, as an ethnic slur. But Karabel turns it back against the WASP establishment itself -- in ways too subtle, and certainly too well-researched, to be considered merely polemical. (I'm going to highlight some of the more rancor-inspiring implications below, but that is due to my lack of Professor Karabel's good manners.)
The element of exposé pretty much guarantees the book a readership among people fascinated or wounded by the American status system. Which is potentially, of course, a very large readership indeed. But The Chosen is also interesting as an example of sociology being done in almost classical vein. It is a study of what, almost a century ago, Vilfredo Pareto called "the circulation of elites" -- the process through which "the governing elite is always in a state of slow and continuous transformation ... never being today what it was yesterday."
In broad outline, the story goes something like this. Once upon a time, there were three old and distinguished universities on the east coast of the United States. The Big Three were each somewhat distinctive in character, but also prone to keeping an eye on one another's doings.
Harvard was the school with the most distinguished scholars on its faculty -- and it was also the scene of President Charles Eliot's daring experiment in letting undergraduates pick most of their courses as "electives." There were plenty of the "stupid young sons of the rich" on campus (as one member of the Board of Overseers put it in 1904), but the student body was also relatively diverse. At the other extreme, Princeton was the country club that F. Scott Fitzgerald later described in This Side of Paradise. (When asked how many students there were on campus, a Princeton administrator famously replied, "About 10 percent.")
Finally, there was Yale, which had crafted its institutional identity as an alternative to the regional provincialism of Harvard, or Princeton's warm bath of snobbery. It was "the one place where money makes no difference ... where you stand for what you are," in the words of the then-beloved college novel Stover at Yale, about a clean-cut and charismatic Yalie named Dink Stover.
But by World War One, something was menacing these idyllic institutions: Namely, immigration in general and "the Hebrew invasion" in particular. A meeting of New England deans in the spring of 1918 took this on directly. A large and growing percentage of incoming students were the bright and driven children of Eastern European Jewish immigrants. This was particularly true at Harvard, where almost a fifth of the freshman class that year was Jewish. A few years later, the figure would reach 13 percent at Yale -- and even at Princeton, the number of Jewish students had doubled its prewar level.
At the same time, the national discussion over immigration was being shaped by three prominent advocates of "scientific" racism who worried about the decline of America's Nordic stock. They were Madison Grant (Yale 1887), Henry Fairfield Osborn (Princeton 1877), and Lothrop Stoddard (Harvard 1905).
There was, in short, an air of crisis at the Big Three. Even the less robustly bigoted administrators worried about (as one Harvard official put it) "the disinclination, whether justified or not, on the part of non-Jewish students to be thrown into contact with so large a proportion of Jewish undergraduates."
Such, then, was the catalyst for the emergence, at each university, of an intricate and slightly preposterous set of formulae governing the admissions process. Academic performance (the strong point of the Jewish applicants) would be a factor -- but one strictly subordinated to a systematic effort to weigh "character."
That was an elusive quality, of course. But administrators knew when they saw it. Karabel describes the "typology" that Harvard used to make an initial characterization of applicants. The code system included the Boondocker ("unsophisticated rural background"), the Taconic ("culturally depressed background," "low income"), and the Krunch ("main strength is athletic," "prospective varsity athlete"). One student at Yale was selected over an applicant with a stronger record and higher exam scores because, as an administrator put it, "we just thought he was more of a guy."
Now, there is a case to be made for a certain degree of flexibility in admissions criteria. Given our reflexive tendency to treat diversity as an intrinsic good, it seems almost counterintuitive to suggest otherwise. Still, there might be some benefit to the devil's-advocate exercise of trying to imagine the case for strictly academic standards.
But Karabel's meticulous and exhaustive record of how the admissions process changed is not presented as an argument for that sort of meritocracy. For one thing, such a meritocracy never prevailed in the first place.
A certain gentlemanly disdain for mere study was always part of the Big Three ethos. Nor had there ever been any risk that the dim sons of wealthy alumni would go without the benefits of a prestigious education.
What the convoluted new admissions algorithms did, rather, was permit the institutions to exercise a greater -- but also a more deftly concealed -- authority over the composition of the student body.
"The cornerstones of the new system were discretion and opacity," writes Karabel; "discretion so that gatekeepers would be free to do what they wished and opacity so that how they used their discretion would not be subject to public scrutiny.... Once this capacity to adapt was established, a new admissions regime was in place that was governed by what might be called the 'iron law of admissions': a university will retain a particular admissions policy only so long as it produces outcomes that correspond to perceived institutional interests."
That arrangement allowed for adaptation to social change -- not just by restricting Jewish applicants in the 1920s, but by incorporating underrepresented students of other backgrounds later. But Karabel's analysis suggests that this had less to do with administrators being "forward-looking and driven by high ideals" than it might appear.
"The Big Three," he writes, "were more often deeply conservative and surprisingly insecure about their status in the higher education pecking order.... Change, when it did come, almost always derived from one of two sources: the continuation of existing policies was believed to pose a threat either to vital institutional interests (above all, maintaining their competitive positions) or to the preservation of the social order of which they were an integral -- and privileged -- part."
Late in the book, Karabel quotes a blistering comment by the American Marxist economist Paul Sweezy (Exeter '27, Harvard '31, Harvard Ph.D. '37) who denounced C. Wright Mills for failing to grasp "the role of the preparatory schools and colleges as recruiters for the ruling class, sucking upwards the ablest elements of the lower classes." Universities such as the Big Three thus performed a double service to the order by "infusing new brains into the ruling class and weakening the potential leadership of the working class."
Undoubtedly so, once upon a time -- but today, perhaps, not so much. The neglect of their duties by the Big Three bourgeoisie is pretty clear from the statistics.
"By 2000," writes Karabel, "the cost of a year at Harvard, Yale, and Princeton had reached the staggering sum of more than $35,000 -- an amount that well under 10 percent of American families could afford.... Yet at all three institutions, a majority of students were able to pay their expenses without financial assistance -- compelling testimony that, more than thirty years after the introduction of need-blind admissions, the Big Three continued to draw most of their students from the most affluent members of society." The number of students at the Big Three coming from families in the bottom half of the national income distribution averages out to about 10 percent.
All of which is (as the revolutionary orators used to say) no accident. It is in keeping with Karabel's analysis that the Big Three make only as many adjustments to their admissions criteria as they must to keep the status quo on track. Last year, in a speech at the American Council on Education, Harvard's president, Larry Summers, called for preferences for the economically disadvantaged. But in the absence of any strong political or social movement from below -- an active, noisy menace to business as usual -- it's hard to imagine an institutionalized preference for admitting students from working families into the Big Three. (Such a preference would have to include vigorous and fairly expensive campaigns of recruitment and retention.)
As Walter Benn Michaels writes in the latest issue of N+1 magazine, any discussion of class and elite education now is an exercise in the limits of the neoliberal imagination. (His essay was excerpted last weekend in the Ideas section of The Boston Globe.)
"Where the old liberalism was interested in mitigating the inequalities produced by the free market," writes Michaels, "neoliberalism -- with its complete faith in the beneficence of the free market -- is interested instead in justifying them. And our schools have a crucial role to play in this. They have become our primary mechanism for convincing ourselves that poor people deserve their poverty, or, to put the point the other way around, they have become our primary mechanism for convincing rich people that we deserve our wealth."
How does this work? Well, it's no secret that going to the Big Three pays off. If, in theory, the door is open to anyone smart and energetic, then everything is fair, right? That's equality of opportunity. And if students at the Big Three then turn out to be drawn mainly from families earning more than $100,000 per year....
Well, life is unfair. But the system isn't.
"But the justification will only work," writes Michaels, if "there really are significant class differences at Harvard. If there really aren't -- if it's your wealth (or your family's wealth) that makes it possible for you to go to an elite school in the first place -- then, of course, the real source of your success is not the fact that you went to an elite school but the fact that your parents were rich enough to give you the kind of preparation that got you admitted to the elite school. The function of the (very few) poor people at Harvard is to reassure the (very many) rich people at Harvard that you can't just buy your way into Harvard."
The rule of law is to the routines of an ordinary, civilized existence roughly what oxygen is to long division. It's not actually part of the equation, but you can't make the calculations without it. Not that it guarantees justice. There may be lapses, omissions, misfires; and if the laws themselves are bad, then the rule of law can bring misery. And there's more to life than regularity. Profound truths may be revealed by transgression, charismatic authority, and ecstatic excesses embodying the creative and destructive dimensions of poetry, mystery, and the sacred. (That said, I'd still rather live in a city with zoning ordinances.)
In The Law in Shambles -- just published in the Prickly Paradigm series, distributed by the University of Chicago Press -- Thomas Geoghegan offers an incisive criticism, from the left, of the idea that the expression "rule of law" is at all appropriate to the way we live now. His booklet is conversational, wide-ranging, and absolutely terrifying. It deserves a wide readership.
In saying that Geoghegan's perspective comes "from the left," I've made room for misunderstandings that should be cleared up right away. First of all, he's not denouncing the whole concept of rule of law as a more or less streamlined way of carrying out the "golden rule" of capitalism, that he with the gold makes the rules. (That's the paleo-Marxist position. Some of the International Socialist Organization activists on your campus might make this argument.) Nor is Geoghegan criticizing actually existing constitutional democracy (as we might call it) from the vantage point of some "original position" of fairness, A Theory of Justice-style.
The author is a labor lawyer (though he has also been a fellow at the American Academy in Berlin). He's arguing from his own court cases, and from perceived trends -- not from first principles. He once loved the work of John Rawls, and the dream is not quite dead; but really, that was a long time ago. "Ever since he wrote that book," Geoghegan says, "it's as if someone with a voodoo doll put a hex on his whole approach."
No, Geoghegan's criticism is less abstract, more crunchy. It's not just that respect for the awesome majesty of the law is now largely pro forma. Rather, in important regards the whole edifice has been gutted; before long, there won't even be any nails holding the facade together.
The increasingly robust and strident contempt for the judiciary expressed by the American right is only part of it, if the most bewildering for anybody who remembers the old conservative motto of "law and order." Now the emphasis is just on order, plain and simple. And not in the sense conveyed by Jack Webb's no-nonsense demeanor on Dragnet. More like Joseph de Maistre's rhapsody over the hangman's role as cornerstone of civilization.
Which is worrisome, no doubt about it. But Geoghegan is more concerned about the low-key, day-to-day degradations of the rule of law. Consider, for example, the case of the rat turds. Geoghegan worked on a brief on behalf of workers who had lost their jobs when a chicken-processing plant closed -- suing under the Worker Adjustment and Retraining Notification (WARN) Act, which requires that a factory owner give employees 60 days' notice that a plant will be shut down. To this, the chicken-processing guy had a ready answer: He had been shut down by the Department of Agriculture for health-code violations -- an unforeseen contingency, he said.
He had an argument, says Geoghegan: "Yes, he may have done bad things, and let rats run wild, and let rats shit on the chicken meat. And yes, it is even true that the inspectors of the Department of Agriculture gave him 'write-ups.' But here is the issue: Was it reasonable for the owner to foresee that the DOA would enforce its own regulation?" After all, everybody in the business knows that you get the write-up and pay the fine.
"That," as Geoghegan says, "was his claim: DOA is more or less a joke. Under Bush I, then Clinton, and then Bush II, it's gotten worse.... Now comes the ruling of the district judge, who is a liberal, a Clinton appointee: Yes, he says, it was unforeseeable. It was as if he took judicial notice that, as a matter of common knowledge, the government does not enforce the laws.... In other words, the application of the rule of law is the equivalent of an 'act of God.' Like a hurricane."
Behind such incidents, the author sees at work an intricate play of mutually reinforcing tendencies. One is the weak and shrinking labor movement. Another is the long-term decline in voter turnout (thereby eroding any checks on wingnuttery that may exist within the ideological sphere). And then there is the systematic underfunding of enforcement for whatever regulations of the economic sphere are still on the books. The legal center of gravity of civil society shifts from contract (with its schedule of benefits and obligations) to tort (so that the only right that matters is the one to collect on damages).
Some of this is familiar, but ne'er so well expressed. Were there a serious movement in this country to challenge the present course of things, The Law in Shambles would be available in a grubby newsprint version selling for 10 cents, and distributed by the hundreds of thousands. Instead, you have to pay 10 bucks for it. This is not good.
Sometimes our tools are our politics, and that’s not always a good thing. Last week, the Copyright Clearance Center announced that it would integrate a “Copyright Permissions Building Block” function directly into Blackboard’s course management tools. The service automates the process of clearing copyright for course materials by incorporating it directly into the Blackboard tool kit; instructors post materials into their course space, and then tell the application to send information about those materials to CCC for clearance.
For many, this move offers welcome relief from the confusion currently surrounding the issue of copyright. Getting clearance for the materials you provide to your students, despite the help of organizations like CCC, is still a complicated and opaque chore. Instructors either struggle through the clumsy legal and financial details or furtively dodge the process altogether and hope they don't get caught. With the centralization offered by CCC and now the automation offered by this new Blackboard add-on, the process will be more user-friendly, comprehensive, and close at hand. As Tracey Armstrong, executive vice president for CCC, put it, "This integration is yet another success in making the 'right thing' become the 'easy thing.'"
Certainly, anything that helps get intellectual resources into the hands of students in the format they find most useful is a good thing. I have no doubt that both the CCC and Blackboard genuinely want the practical details of getting course materials together, cleared, and to the student to be less and less an obstacle to actually teaching with those materials. But I’m skeptical of whether this “easy thing” actually leads to the “right thing.” Making copyright clearance work smoothly overlooks the question of whether we should be seeking clearance at all -- and what should instead be protected by the copyright exception we’ve come to know as “fair use.”
Fair use has been the most important exception to the rules of copyright, especially for educators, since long before it was codified into law in 1976. For those uses of copyrighted materials that would otherwise be considered an infringement, the fair use doctrine offers some leeway when the use is limited and serves socially beneficial ends.
What ends are protected can vary, but the law explicitly includes education and criticism -- including a specific reference to "multiple copies for classroom use." It's what lets us quote other research in our own without seeking permission, or put an image we found online in our PowerPoint presentations, or play a film clip in class. All of these actions are technically copyright violations, but they would enjoy fair use protection were they ever to go to court.
But there is a dispute, among those who dispute these kinds of things, about exactly why it is we need fair use in such circumstances. Some have argued that fair use is a practical solution for the complex process of clearing permission. If I had to clear permission every single time I quoted someone else’s research or Xeroxed a newspaper article for my students -- figuring out who owns the copyright and how to contact them, then gaining permission and (undoubtedly) negotiating a fee -- I might be discouraged from doing so simply because it’s difficult and time-consuming. In the absence of an easy way to clear copyright, we have fair use as a way to “let it slide” when the economic impact is minimal and the social value is great.
Others argue that fair use is an affirmative protection designed to ensure that copyright owners don't exploit their legal power to squelch the reuse of their work, especially when it might be critical of their ideas. If I want to include a quote in my classroom slides in order to demonstrate how derivative, how racist, or maybe just how incompetent the writer is, and copyright law compelled me to ask the writer's permission to do it, he could simply say no, limiting my ability to powerfully critique the work. Since copyright veers dangerously close to a regulation of speech, fair use is a kind of First Amendment safety valve, ensuring that speakers can't be silenced, by way of copyright, by those they criticize.
This distinction was largely theoretical until organizations like CCC came along. With the help of new database technologies and the Internet, the CCC has made it much easier for people to clear copyright, solving some of the difficulty of locating owners and negotiating a fair price by doing it for us. The automatic mechanism being built into Blackboard goes one step further, making the process smooth, user-friendly, and automatic. So, if fair use is merely a way to account for how difficult clearing copyright can be, then the protection is growing less and less necessary. Fair use can finally be replaced by what Tom Bell called “fared use” -- clear everything easily for a reasonable price.
If, on the other hand, fair use is a protection of free speech and academic freedom that deliberately allows certain uses without permission, then the CCC/Blackboard plan raises a significant problem.
The fact that the fair use doctrine explicitly refers to criticism and parody suggests that it is not just for when permission is difficult to obtain, but for when we shouldn't have to ask permission at all. The Supreme Court said as much in Campbell v. Acuff-Rose (1994), when Justice Kennedy noted in a concurring opinion that fair use "protects works we have reason to fear will not be licensed by copyright holders who wish to shield their works from criticism." Even in a case in which permission was requested and denied, the court did not take this as a sign that the use was presumptively unfair. Fair use is much more than a salve for the difficulty of gaining permission.
Faculty and their universities should be at the forefront of the push for a more robust fair use, one that affirmatively protects “multiple copies for classroom use” when their distribution is noncommercial, especially as getting electronic readings to students is becoming ever cheaper and more practical.
Automating the clearance process forecloses the possibility of invoking fair use -- and, more important, of challenging its slow disintegration. Even if the Blackboard mechanism allows instructors simply not to send their information to CCC for clearance (and it is unclear whether it is, or could eventually become, a compulsory mechanism), the simple fact that clearance is becoming a technical default means that more and more instructors will accept it rather than invoking fair use.
The power of defaults is that they demarcate the “norm”; the protection of pedagogy and criticism envisioned in fair use will increasingly deteriorate as automatic clearance is made easier, more obvious, and automatic. This concern is only intensified as Blackboard, recently merged with WebCT, continues to become the single, dominant provider of course management software for universities in the United States.
Technologies have politics, in that they make certain arrangements easier and more commonplace. But technologies also have the tendency to erase politics, rendering invisible the very interests and efforts currently working to establish “more copyright protection is better” as the accepted truth, when it is far from it.
As educators, scholars, librarians, and universities, we are in a rarefied position to fight for a more robust protection of fair use in the digital realm, demanding that making "multiple copies for classroom use" means posting materials into Blackboard without needing to seek the permission of the copyright owners to do so.
The automation of copyright clearance now being deployed will work against this, continuing to shoehorn scholarship into the commercial model of information distribution and erasing the very question of what fair use was for -- not by squelching it, but simply by making it easier not to fight for it and harder even to ask if there's an alternative.
Tarleton Gillespie is an assistant professor in the Department of Communication at Cornell University, and a Fellow with the Stanford Law School Center for Internet and Society.