A professor -- grant money in hand, spouse and child off on vacation -- goes to Berlin to work on his long-gestating book about the painter Titian. He plans to focus (perhaps New Historicist-style) on an anecdote in which Charles V stooped to pick up the artist's paintbrush.
As often happens with a writing project, the scholar gets bogged down on a minor point. Days go by, turning into weeks. All he writes are the first two words of the book. Meanwhile, he has agreed to take care of a neighbor's plants, and he procrastinates about doing that, too. When he finally gets around to watering them, he lingers a while in front of the television, slipping into the narcotic trance of the total couch potato....
At this point, some of you are thinking, "I know that guy. In fact, I know him a little too well."
The era when any self-respecting academic would do the standard "I do not own a television machine" bit is now as distant and implausible as, say, Ozzie and Harriet. It may be that the turning point is recorded by the cultural commentator John Leonard, in his account of a discussion with Lionel Trilling in the early 1970s. After denying that he actually used his television machine very much, the sage of the Columbia University English department admitted to watching quite a bit of basketball, and also to having a certain weakness for Kojak.
Actually, the Titian scholar with the square eyeballs is the narrator of Jean-Philippe Toussaint's Television, published in translation last year by the Dalkey Archive Press. It's the sort of novel in which dry irony is the real hero. As the drive-in movie critic Joe Bob Briggs used to say, there isn't a lot of plot to get in the way of the story.
But it seems like the right book to be reading now, during national TV Turnoff Week. Not because the unnamed European professor in Toussaint's book is an example of what happens to someone who succumbs to the tube. Quite the contrary: Television is a book about how pride in not watching can render you even more obsessed.
The narrator (sounding a little like Trilling) announces that he seldom turned the box on: "Apart from major sporting events, which I always watched with pleasure, and of course the news and the occasional election-night special, I never watched much of anything on television." He says he avoided seeing movies there, for the same reason he never read books in Braille.
"Although I never tried it," he continues, "I was always quite sure I could give up watching television anytime, just like that, without suffering in the least, without suffering the slightest ill effect -- in short, that there was no way I could be considered dependent."
And yet, from time to time, he slips into "a slight deterioration of my day-to-day habits." He finds himself barefoot and unshaven, "half-reclining on the couch, taking it easy ... my hand cradling my privates." (Note to anthropologists and psychoanalysts: The latter gesture, possibly universal, requires cross-cultural interpretation. See also Slavoj Žižek's proposal that the tendency of men to dominate the remote control is a symptom of castration anxiety.)
"Most of these afternoons I was alone in the apartment," the narrator recalls, "but sometimes the cleaning woman was there too, ironing my shirts beside me in the living room, mute with contained indignation."
He resolves, more than once, to quit for good. And yet television is everywhere. Looking out the window of his temporary lodgings, he sees the blue glow in apartment after apartment across the way. Visiting the museum to do research for his monograph (research that merely amounts, at this point, to another form of procrastination), he wanders into the security station, where guards keep watch on the gallery through a bank of surveillance cameras: "After studying the monitors for some time, I finally recognized a painting that had been a starting point for my study, the portrait of Emperor Charles V...."
That is the trouble with procrastination. It is hard to make any progress, no matter how hard you work at it -- and any halfway serious bout of procrastination is, of course, quite exhausting. The obligation you try to escape keeps returning.
Likewise with the effort of Toussaint's narrator to avoid television. The solemn (if never very firm) vow to keep the machine turned off becomes just another stage of immersion in "its essential immediacy, its ever-evolving, always-in-progress superficiality."
Resistance is futile. So Television shows, tongue in cheek. But for the most articulate terms of surrender, we have to turn to another professor.
"TV offers incredible amounts of psychic data," says Murray J. Siskind, a visiting lecturer at College-on-the-Hill. "It opens ancient memories of world birth, it welcomes us into the grid, the network of little buzzing dots that make up the picture patterns...."
(At this point, it is probably worth mentioning that both Siskind and the College appear in Don DeLillo's novel White Noise, first published 20 years ago. In 1985, the book was clearly a satire. Now I'm not so sure. It's probably turned into a fairly straightforward picture of the way we live now.)
"Look at the wealth of data concealed in the grid," says Siskind, "in the bright packaging, the jingles, the slice of life commercials, the products hurtling out of darkness, the coded messages and endless repetitions, like chants, like mantras. 'Coke is it, Coke is it, Coke is it.' The medium practically overflows with sacred formulas if we can remember how to respond innocently and get past our irritation, weariness, and disgust."
Then again, all of this -- Toussaint's fiction and DeLillo's alike -- does seem a little out of date. The locus of procrastination has now shifted.
It's moved to another screen ... a different grid ... offering infinitely more information ... white noise that is louder and blurrier. The distractions range from the sublime to the barely legal.
Going online, and resolving to stay offline, will require another kind of obsessive narrative. One to be read, perhaps, in August, during PC Turnoff Week.
The publication, 100 years ago, of The Jungle, by Upton Sinclair, in the popular American socialist newspaper Appeal to Reason had an enormous effect -- if not quite the one that its author intended. “I aimed at the public’s heart,” Sinclair later said, “and by accident I hit it in the stomach.”
Drawing on interviews with workers in Chicago and his own covert explorations of the city’s meat-processing factories, Sinclair intended the novel to be an exposé of brutal working conditions. By the time it appeared as a book the following year, The Jungle’s nauseating revelations were the catalyst for a reform movement culminating in the Pure Food and Drug Act. In portraying the life and struggles of Jurgis Rudkus, a Lithuanian immigrant, Sinclair wanted to write (as he put it) “the Uncle Tom’s Cabin of wage slavery,” thereby ushering in an age of proletarian emancipation. Instead, he obliged the bourgeoisie to regulate itself -- if only to keep from feeling disgust at its breakfast sausages.
In his introduction to a new edition of The Jungle, just published by Bedford/St. Martin’s, Christopher Phelps traces the origins and effects of Sinclair’s novel. Phelps, an associate professor of history at Ohio State University in Mansfield, is currently on a Fulbright fellowship in Poland, where he occupies a distinguished chair in American studies and literature at the University of Lodz. The following is the transcript of an e-mail interview conducted this month.
Q: At one of the major chain bookstores the other day, I noticed at least four editions of The Jungle on the shelf. Yours wasn’t one of them. Presumably it's just a matter of time. What’s the need, or the added value, of your edition? Some of the versions available are pretty cheap, after all. The book is now in the public domain.
A: Yes, it’s even available for free online these days, if all you want is the text. This new edition is for readers seeking context. It has a number of unique aspects. I’m pleased about the appendix, a report written by the inspectors President Theodore Roosevelt dispatched to Chicago to investigate Upton Sinclair’s claims about the meatpacking industry. In one workplace, they watch as a pig slides off the line into a latrine, only to be returned to the hook, unwashed, for processing. No other version of The Jungle includes this report, which before now had lapsed into obscurity. The new edition also features an introduction in which I survey the scholarship on the novel and provide findings from my research in Sinclair’s papers held by the Lilly Library at Indiana University. Finally, there are a lot of features aimed at students, including a cartoon, a map, several photographs, a bibliography, a chronology of Sinclair’s life, and a list of questions for discussion. So it doubles as scholarly edition and teaching edition.
Q: Let me ask about teaching the book, then. How does The Jungle go over in the classroom?
A: Extremely well. Students love it. The challenge of teaching history, especially the survey, is to get students who think history is boring to imagine the past so that it comes alive for them. The Jungle has a compelling story line that captures readers’ attention from its very first scene, a wedding celebration shaded in financial anxiety and doubts about whether Old World cultural traditions can survive in America. From then on, students just want to learn what will befall Jurgis and his family. Along the way, of course, Sinclair injects so much social commentary and description that teachers can easily use students’ interest in the narrative as a point of departure for raising a whole range of issues about the period historians call the Progressive Era.
Q: As you've said, the new edition includes a government report that appeared in the wake of the novel, confirming the nauseating details. What are the grounds for reading and studying Sinclair's fiction, rather than the government report?
A: Well, Teddy Roosevelt’s inspectors had the singular mission of determining whether the industry’s slaughtering and processing practices were wholesome. Sinclair, for his part, had many other concerns. What drew him to write about the meatpacking industry in the first place was the crushing of a massive strike of tens of thousands of workers led by the Amalgamated Meat Cutters and Butcher Workmen of North America in 1904. In other words, he wanted to advance the cause of labor by exposing the degradation of work and exploitation of the immigrant poor.
When The Jungle became a bestseller, Sinclair was frustrated that the public furor centered almost exclusively on whether the companies were grinding up rats into sausage or disguising malodorous tinned beef with dyes. These were real concerns, but Sinclair cared most of all about the grinding up of workers. I included this government report, therefore, not only because it confirms Sinclair’s portrait of unsanitary meat processing, but because it exemplifies the constriction of Sinclair’s panorama of concerns to the worries of the middle-class consumer.
It further shows how Sinclair’s socialist proposal of public ownership was set aside in favor of regulatory measures like the Pure Food and Drug Act and Meat Inspection Act of 1906. Of course, that did not surprise Sinclair. He was proud, rightly so, of having been a catalyst for reform. Now, just as the report must be read with this kind of critical eye, so too the novel ought not be taken literally.
Q: Right. All kinds of problems come from taking any work of literature, even the most intentionally documentary, as giving the reader direct access to history.
A: Nowadays The Jungle is much more likely to be assigned in history courses than in literature courses, and yet it is a work of fiction. You point to a major problem, which we might call the construction of realism. I devote a good deal of attention to literary form and genre in my introduction, because I think they are crucial and should not be shunted aside. I note the influence upon The Jungle of the sentimentalism of Harriet Beecher Stowe, of naturalist and realist writers like William Dean Howells and Frank Norris, and of the popular dime novels of Horatio Alger. Sinclair was writing a novel, not a government report. He fancied himself an American Zola, the Stowe of wage slavery.
A good teacher ought to be able to take into account this status of the text as a work of creative literature while still drawing out its historical value. We might consider Jurgis, for example, as the personification of a class. He receives far more lumps in life than any single worker would in 1906, but the problems he encounters, such as on-the-job injury or the compulsion to make one’s children work, were in fact dilemmas for the working class of the time.
In my introduction, I contrast the novel with what historians now think about immigrant enclaves, the labor process, gender relations, and race. There is no determinate answer to the question of how well The Jungle represented such social realities. Many things it depicted extremely well, others abominably, race being in the latter category. If we keep in mind that realism is literary, fabricated, we can see that Sinclair’s background afforded him a discerning view of many social developments, making him a visionary, even while he was blind in other ways. Those failings are themselves revelatory of phenomena of the period, such as the racism then commonplace among white liberals, socialists, and labor activists. It’s important that we read the novel on all these levels.
Q: Sinclair wrote quite a few other novels, most of them less memorable than The Jungle. Well, OK, to be frank, what I've heard is that they were, for the most part, awful. Is that an unfair judgment? Was The Jungle a case of the right author handling the right subject at the right time?
A: That's precisely it, I think. Sinclair was uniquely inspired at the moment of writing The Jungle. I've been reading a lot of his other books, and although some have their moments, they sure can give you a headache. Many of them read like failed attempts to recapture that past moment of glory. He lived to be ninety and cranked out a book for every year of his life, so it's a cautionary tale about allowing prolixity to outpace quality. The book of his that I like best after The Jungle is his 1962 autobiography, a book that is wry and whimsical in a surprising and attractive, even disarming, way.
Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.
There’s a wonderful scene in the 1979 film Manhattan that is parody but, as in most satire, perilously close to reality. Ike (Woody Allen) and Mary (Diane Keaton) are strolling in the Guggenheim Museum when Mary starts rattling off the names of members of what she calls the "Academy of the Overrated." Among the academy’s charter members: Norman Mailer, Gustav Mahler, Carl Jung, F. Scott Fitzgerald, Lenny Bruce, Walt Whitman, Vincent Van Gogh and Ingmar Bergman.
Woody is beside himself. He can’t believe anyone would trash those so close to his heart.
Flash-forward to a meeting I attended recently. The journalism school at the University of Iowa is deservedly getting a new building, a technological and architectural marvel dedicated to teaching the wonders of communication to would-be 21st-century journalists. A colleague and I were selected to coordinate a day-long dedication for the new school, and through the benevolence of a benefactor, have a small pot of money to spend to attract a big-name speaker or two.
As in everything academic, the decision won’t be mine alone. The j-school will be sharing its new space with a hybrid, the Department of Cinema and Comparative Literature, and because universities like to act democratically, representatives from the two disciplines needed to agree on who the speakers would be.
On the j-school’s list were such luminaries as Donald Barlett, James Fallows, Donald Graham, Bill Kovach, Daniel Okrent, James Steele and Bob Woodward.
Just as I finished circulating this A-list of names, a young professor from Cinema and Comparative Literature sneered. "Well, I'd hope we wouldn’t invite Woodward!" She was almost spitting.
"What's wrong with Woodward?" I asked, my blood pressure beginning to spike.
"Well, I just don’t think he’s a very good journalist!" the professor snarled.
A momentary pause for anyone who’s been living in a cave: Bob Woodward has taken us into the lives of Americans as diverse as the two George Bushes, Bill Clinton, John Belushi, the former CIA chief spy William Casey, the Supreme Court justices, Colin Powell and Alan Greenspan. With help from Carl Bernstein, he was responsible for showing Richard Nixon the White House door. Woodward has been one of America’s most gifted newspapermen for more than 35 years. He has changed how Americans look at our country and how journalists write about it.
Considering all the above, I stared at this Judas in my midst, my mouth forming an O-shape. I looked around the table for a nibble of support but got none. Just as I was about to jump on the table to protest, my own colleague from the journalism school joined Judas, voicing her assessment of Woodward as an opportunistic sellout.
The emboldened professor from Cinema and Comparative Literature hopped on the thread. "We definitely wouldn’t want Woodward," she said now with finality.
"But then who?" I asked.
"Well, I could see inviting Sy Hershman."
This cinema-and-comparative-literature professor was so chummy with the investigative reporter and New Yorker political writer Seymour Hersh, who broke the Abu Ghraib Prison scandal story, that she was comfortable enough calling him Sy, but somehow couldn’t get his last name right.
The rest of the discussion, as far as I could follow, involved how corrupt journalism is and how complicit the school is in taking money from giants like Gannett, Lee Enterprises and other models of corporate greed.
After gathering my wits, I suggested that we ought to have two separate days of dedication -- one where academics could trash the corporate model of journalism, and another where professional journalists could talk about ways to enhance and improve American journalism.
Absolutely not, the professors around me railed. There should be one and only one program. The journalists (well, maybe not Woodward) should be invited to the dedication to learn from the academics. We need to publicly humiliate, flog and pummel these propagandists. Lock the doors so the lapdogs can’t escape. Call C-SPAN to document the bloodbath.
I’m not making this up.
What’s the lesson? Just another case of academic elitism at its most basic and sniveling core?
What happened is not new or different from how the academy has historically looked at anything popular or successful. Popularity means corrupt, and corrupt means without merit, worthy of scorn -- a ticket into the Academy of the Overrated.
That recent incident recalled a similar instance of incorrigible academic elitism I experienced when I was an untenured professor and about to submit a book proposal to a trade publisher. A tenured faculty member told me, point blank, that if a trade publishing house were ever to publish my book, I should be prepared to kiss tenure goodbye. Naïve and new to the job, I couldn't believe what I was hearing.
"You mean to say that if a reputable publisher, a place like Knopf, Doubleday or Harcourt, were to publish the book, and if it were to get positive reviews in places like The New York Times and The Washington Post, and a great number of people were to read the book, I wouldn’t get tenure?"

"That’s right," came the acid response from the full professor. "Trade publishers will print anything that’ll sell."
As though writing a book that the lay people read would be bad.
I had never heard of anything so undemocratic in my life. Almost a decade later, I still feel the same way. I understand that there is a place for serious scholarship, which by nature has a limited audience. But I was a journalist, teaching in a journalism school. The definition of good journalism is to break new ground, and in doing so, reach as large an audience as possible. The idea is to discover and inform -- not really so different from the role of a university professor.
I’m glad to report that the full professor soon left the university, the book came out, I got tenure, was promoted, and life has been rosy ever since. But the professor’s elitist drivel still sticks in my craw because his snobbery runs so rampant in the academy today -- as in what I experienced with the dopey professor from the Department of Cinema and Comparative Literature.
Frankly, I doubt whether Bob Woodward would even want to come to Iowa in the first place. The real action these days when it comes to improving journalism isn’t in the critical-cultural halls of academe. No surprise. It lies with smart, savvy reporters and editors pushing the limits of corporate media ownership by producing the kind of journalism that demands to be disseminated and read, stuff so good that no one can ignore it.
It’s hard to be a journalist today given economic constraints, not to mention a surging patriotic mandate from a large part of this nation that dictates that to be critical of the government is to be un-American. In my mind, to do journalism well today is a form of heroism.
For more than a century, the credo of millions of American journalists used to be “Comfort the afflicted and afflict the comfortable." That magnificent credo still flies proudly at several rarefied media outlets. God knows, such journalism is needed today. The way journalism is practiced today at many newspapers and electronic outlets is mediocre, often embarrassing. For many reasons, much mainstream journalism has entered a new kind of Dark Age.
But journalists shouldn’t -- and won’t -- put up with ivory-tower snipers pointing AK-47s at their real-world heads. Few newly minted journalism/mass communication Ph.D.s today have any familiarity with the great journalists of our times -- Tom Wolfe, Gay Talese, John McPhee, Hunter S. Thompson, David Halberstam, Bob Woodward and Seymour Hersh, to name a few. Mention John Hersey, Rachel Carson, James Agee, Lincoln Steffens, H.L. Mencken, Hannah Arendt, Ida Tarbell and you’re likely to get blank stares. Doctoral students today receive few incentives to study journalists. Today’s graduate students in the field study critical-cultural theoretical icons who, I’m afraid to say, have little real understanding of today’s working press.
It comes as no surprise, then, that there’s so little scholarship that has contributed to improving the quality of journalism. I doubt whether scholars really want to do that, anyway. For most scholars, such activity would be considered beneath them — sort of like publishing a book that people could actually understand.
Stephen G. Bloom
Stephen G. Bloom is professor of journalism and mass communication at the University of Iowa and author of Postville: A Clash of Cultures in Heartland America and Inside the Writer’s Mind: Writing Narrative Journalism. He has worked as a reporter for the Los Angeles Times, Dallas Morning News, and San Jose Mercury News, and is co-founder of the Iowa Journalists Oral History Project (http://www.uiowa.edu/~acadtech/journalists/index2.htm).
Right after 9/11, the obituaries started to appear: Irony, the reports said, was dead. Either that or in really bad condition.
It had been a very 1990s thing, this irony. Never before in human history had so many people so often used that two-handed gesture to inscribe quotation marks in the air. Or pronounced the word really with an inflection conveying the faux enthusiasm that doubled as transparent contempt (as in: "I really like that new Britney Spears single"). The manner had been forged in earlier times -- by pioneers at the Harvard Lampoon, for example. But it really caught on during the cold peace that followed the Cold War. Suddenly, irony became available to everyone, on the cheap. It was the wit of the witless, the familiar smirk beneath the perpetually raised eyebrow.
And then it died. Hard realities broke through the callow veneer of detachment. Everybody became very earnest. And then America entered its present golden age of high seriousness.
Oh, no, wait.... That last bit never actually happened. The rest of the story is familiar enough, though. So much so, in fact, that I am reluctant to note my own recent suspicion that, after all, it's more or less true. Irony really is dead.
It's not just that irony is a much richer notion than sarcasm. Broadly defined, it means the coexistence of two radically counterposed (even mutually contradictory) meanings within the same utterance. The simplest case would be saying, "What a beautiful day!" in the middle of a hurricane. But the subtler kinds spin out into infinity....
There is the irony of Plato's dialogues, where men who are very sure of their own competence try to explain things to Socrates (who says that he knows nothing, yet quickly, through simple questions, ties their arguments into the Athenian equivalent of pretzels). There is dramatic irony, in which action on stage means one thing for the characters and something very different for the audience. And let's not even get started on where the German philosophers went with it -- beyond noting that it turned into something like the essence of art, consciousness, and human existence.
I'm not saying that there is no connection at all between the Philosophical Fragments of Friedrich Schlegel and the camp value of listening to The Carpenters' Greatest Hits. Actually, they go together pretty well, if you're in the right mood. (As Schlegel put it: "For a man who has achieved a certain height and universality of cultivation, his inner being is an ongoing chain of the most enormous revolutions." So you might start out feeling all ironic about Karen Carpenter, then end up overwhelmed by her voice.)
But that just makes it all the more sad to realize that the rumors are true. Irony is now extinct, or at least in a coma. I got the evidence last week and have been bummed ever since.
The proverbial lightbulb over the head went on while reading American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer by Kai Bird and Martin J. Sherwin, published last month by Alfred A. Knopf. It is a massive book, the product of about a quarter century of research into the physicist's ascent to power and his martyrdom under the original wave of McCarthyism.
It is an absorbing book. The authors are both distinguished, and the story they tell is almost unnerving in its contemporary resonances. My reviewer's copy now has the usual marks in the margin to highlight various passages that made a strong impression.
But flipping back through it now, I find the record of another kind of readerly response. At some point, the authors begin applying the word "ironic," in its various forms, to situations occurring in Oppenheimer's life. And astonishingly enough, it seems that the authors never manage to use it in a meaningful way.
For example, in the spring of 1934, Oppenheimer earmarks three percent of his salary to help German physicists who are fleeing the Nazis. "Ironically," write Bird and Sherwin, "one of the refugees who may have been assisted by this fund was [Oppenheimer's] former professor in Göttingen, Dr. James Franck." (Here, it appears that they think "irony" means either "coincidentally" or "oddly enough.")
During the Depression, Oppenheimer's wife had been a Communist who, "ironically, survived on government relief checks of $12.50." (Ayn Rand living on welfare -- now that would be ironic. But an unemployed left-winger?)
Other examples could be offered. In no case that I recall do Bird and Sherwin use the word in anything like an appropriate way. Which is all the more striking because Oppenheimer's story is thick with ironies. For example, during the McCarthy years, his effort to rebuff a Soviet agent's attempt to recruit him as a spy in the early 1940s turns into the "proof" that he was disloyal. It is a reversal worthy of Sophocles -- a situation that is profoundly ironic.
Not that the authors ever use the word in that (for once, appropriate) way. Instead, we stay trapped in that Alanis Morissette song from the mid-1990s:
It's like rain on your wedding day / It's a free ride when you've already paid / It's the good advice that you just didn't take... / And isn't it ironic ... don't you think?
To which the answer is, of course, "No." Such things are not ironic in any sense. (Inconvenient, yes. Ironic, no.)
Now, it could be that I'm overreacting. Maybe the fact that two intelligent and capable writers -- in a major book, on an important topic, published by one of the country's top presses -- end up sounding like Alanis Morissette is not as worrisome as it seems. Perhaps irony is not dead after all?
Either that, or we need to define the word in a new way. "Ironic, adj., of or pertaining to a situation involving no irony whatsoever."
People who met Aldous Huxley would sometimes notice that, on any given day, the turns of his conversation would follow a brilliant, unpredictable, yet by no means random course. The novelist might start out by mentioning something about Plato. Then the discussion would drift to other matters -- to Poe, the papacy, and the history of Persia, followed by musings on photosynthesis. And then, perhaps, back to Plato.
So Huxley's friends would think: "Well, it's pretty obvious which volume of the Encyclopedia Britannica he was reading this morning."
Now, it's a fair guess that whoever recounted that story (to the author of whichever biography I read it in) meant to tell it at Huxley's expense. It's not just that it makes him look like an intellectual magpie, collecting shiny facts and stray threads of history. Nor even that his erudition turns out to be pre-sorted and alphabetical.
Rather, I suspect the image of an adult habitually meandering through the pages of an encyclopedia carries a degree of stigma. There is a hint of regression about it -- if not all the way back to childhood, at least to preadolescent nerdishness.
If anything, the taboo would be even sterner for a fully licensed and bonded academic professional.
Encyclopedia entries are among the lowest forms of secondary literature. Very rare exceptions can be made for cases such as Sigmund Freud's entry on "Psychoanalysis" in the 13th edition of the Britannica, or Kenneth Burke's account of his own theory of dramatism in The International Encyclopedia of the Social Sciences. You get a certain amount of credit for writing for reference books -- and more for editing them. And heaven knows that the academic presses love to turn them out. See, for example, The Encyclopedia of Religion in the South (Mercer University Press), The Encyclopedia of New Jersey (Rutgers University Press) and The International Encyclopedia of Dance (Oxford University Press), not to mention The Encyclopedia of Postmodernism (Routledge).
It might be okay to "look something up" in an encyclopedia or some other reference volume. But read them? For pleasure? The implication that you spend much time doing so would be close to an insult -- a kind of academic lèse-majesté.
At one level, the disdain is justified. Many such works are sloppily written, superficial, and/or hopelessly unreliable. The editors of some of them display all the conscientiousness regarding plagiarism one would expect of a failing sophomore. (They grasp the concept, but do not think about it so much as to become an inconvenience.)
But my hunch is that social pressure plays a larger role in it. Real scholars read monographs! The nature of an encyclopedia is that it is, at least in principle, a work of popularization. Probably less so for The Encyclopedia of Algebraic Topology, assuming there is one. But still, there is an aura of anti-specialization and plebeian accessibility that seems implicit in the very idea. And there is something almost Jacobin about organizing things in alphabetical order.
Well then, it's time. Let me confess it: I love reading encyclopedias and the like, at least in certain moods. My collection is not huge, but it gets a fair bit of use.
Aside from still-useful if not cutting-edge works such as the four-volume Encyclopedia of Philosophy (Macmillan, 1967) and Eric Partridge's indispensable Short Etymological Dictionary of Modern English Origins (Macmillan, 1958), I keep at hand any number of volumes from Routledge and Blackwell offering potted summaries of 20th-century thinkers. (Probably by this time next year, we'll have the 21st-century versions.)
Not long ago, for a ridiculously small price, I got the four paperbound volumes of the original edition of the Scribner's Dictionary of the History of Ideas, first published in 1973 -- the table of contents of which is at times so bizarre as to seem like a practical joke. There is no entry on aesthetics, but there is one called "Music as a Demonic Art" and another called "Music as a Divine Art." An entry called "Freedom of Speech in Antiquity" probably ought to be followed by something that brings things up to more recent times -- but no such luck.
The whole thing is now available online, with its goofy mixture of the monographic ("Newton's Opticks and Eighteenth Century Imagination") and the clueless (no entries on Aristotle or Kant, empiricism or rationalism). But somehow the weirdness is more enjoyable between covers.
And then, of course, there is the mother of them all: the Encyclopedia, or Rational Dictionary of the Sciences, Arts, and Crafts, which Denis Diderot and friends published in the 1750s and '60s. Aside from a couple of volumes of selections, I've grabbed every book by or about Diderot in English that I've ever come across.
Diderot himself, appropriately enough, wrote the entry for "Encyclopedia" for the Encyclopedia.
The aim of such a work, he explained, is "to collect all the knowledge scattered over the face of the earth, to present its general structure to the men with whom we live, and to transmit this to those who will come after us, so that the work of past centuries may be useful to the following centuries, that our children, by becoming more educated, may at the same time become more virtuous and happier, and that we may not die without having deserved well of the human race."
Yeah! Now that's something to shoot for. It even makes reading encyclopedias seem less like a secret vice than a profound obligation.
And if, perchance, any of you share the habit -- and have favorite reference books that you keep at hand for diversion, edification, or moral uplift -- please pass the titles along below....
Some months back, one of the cable networks debuted a movie -- evidently the pilot for a potential show -- that inspired brief excitement in some quarters, though it seems not to have caught on. Its central character was someone whose grasp of esoteric knowledge allowed him or her (I'm not sure which, never having seen it) to command the awesome mysterious forces of the universe. Its title was The Librarian.
The program was, it seems, a reworking of a similar figure in Buffy the Vampire Slayer. That's in keeping with the fundamental law of the entertainment industry once defined by Ernie Kovacs, the great American surrealist TV pioneer: "Find something that works, then beat it to death."
At another level, though, the whole concept derived from a tradition that is pre-television, indeed, almost pre-literate. The idea that a command of books provides access to secret forces, the equation of the scholar with the magus, was already well established before Faust and Prospero worked their spells. The linkage has also left its trace at the level of the signifier. Both glamor, originally meaning a kind of witchy sex appeal, and grimoire, the sorcerer's reference book, derive from the word grammar -- one of the foundational disciplines of medieval learning, hence a source of power.
Today, it's much rarer to find the whole knowledge/power nexus treated in such explicitly occultic terms, at least outside pop culture. As for librarians, they are usually regarded as professionals working in the service sector of the information economy, rather than as full-fledged participants in contemporary intellectual life. That is, arguably, an injustice. But the division of labor and the logic of hierarchical distinctions have changed a lot since the day when Gottfried Leibniz (philosopher, statesman, inventor of calculus and the computer, and overall polymathic genius) held down his day job running a library.
The most persistent aspect of the old configuration is probably the link between glamor and grammar -- the lingering aura of bookish eroticism. At least that's what the phenomenon of librarian porn would suggest. The topic deserves more scholarly attention, though an important start has been made by Daniel W. Lester, the network information coordinator for Boise State University in Idaho. His bibliography of pertinent livres lus avec une seule main ("books read with one hand") is not exhaustive, but the annotations are judicious. About one such tale of lust in the stacks, he writes: "Most of the library and librarian descriptions are reasonable, except for the number of books on a book cart."
But the role librarians play at the present time brings them closer to the most pressing issues in American cultural life than any cheesy TV show (or letter to Penthouse, for that matter) could possibly convey.
Their work constitutes the real intersection of knowledge and power -- not as concepts to be analyzed, but at the level of almost nonstop practical negotiation. It is the cultural profession most involved, from day to day, with questions concerning public budgets, information technology, the cost of new publications, and intellectual freedom. (On the latter, check out the American Library Association's page on the Patriot Act.)
Given all that, I've been curious to find out about discussions by academic librarians regarding current developments in their profession, in the university, and in the world outside. A collection of essays called The Successful Academic Librarian is due out this fall from Information Today, Inc. Its emphasis seems to fall on guidance in facing career demands. But how can an outsider keep up with what academic librarians are thinking about other issues?
Well, the first place to start is The Kept-Up Academic Librarian, the blog of Steven Bell, who is director of the Gutman Library at Philadelphia University. Bell provides a running digest of academic news, but for the most part avoids the kind of reflective and/or splenetic mini-essays one associates with blogdom.
My own effort to track down something more ruminative turned up a few interesting blogs lus avec une seule main run by librarians, such as this one. But this, while stimulating, was not quite on topic. So in due course I contacted Steven Bell, on the assumption that he was as kept-up as an academic librarian could be. Could he please name a few interesting blogs by academic librarians?
His answer came as a surprise: "When you ask specifically about blogs maintained by academic librarians," Bell wrote earlier this week, "the list would be short or non-existent."
He qualified the comment by noting the numerous gray areas. "There may be some academic librarians out there with an interesting blog, but in some cases I think the blogger is doing it anonymously and you don't really even know if the person is an academic librarian. For example, take a look at Blog Without a Library. I can't tell who this blogger is though I think he or she might be an academic librarian. On the other hand Jill Stover's Library Marketing blog is fairly new and pretty good, and she is an academic librarian -- but the blog really isn't specific to academic libraries.... Bill Drew of one of the SUNY libraries has something he calls BabyBoomer Librarian but it isn't necessarily about academic librarianship -- sometimes yes, but more often not."
Bell listed a few other blogs, including Humanities Librarian from the College of New Jersey. But very few of his suggestions were quite what I had in mind -- that is, public spaces devoted to thinking out loud about topics such as the much-vaunted "crisis in academic publishing." It was a puzzling silence.
"I can't say any individual has developed a blog that has emerged as the 'voice of academic librarianship,' " noted Bell in response to my query. "Why? If I had to advance a theory I'd say that as academic librarians we are still geared towards traditional, journal publishing as the way to express ourselves. I know that if I have something on my mind that I'd like to write about to share my thoughts and opinions, I'm more likely to write something for formal publication (e.g., see this piece.) Perhaps that is why we don't have a 'juicy' academic librarian out there who is taking on the issues of the day with vocal opinions."
And he added something that makes a lot of sense: "To have a really great blog you have to be able to consistently speak to the issues of the day and have great (or even good) insights into them -- and it just doesn't seem like any academic librarian out there is capable of doing that. I think there are some folks in our profession who might be capable of doing it. But if so they haven't figured out yet that they ought to be blogging, or maybe they just don't have the time or interest."
Now, that diagnosis may contain the elements of a solution. The answer might be the creation of a group blog for academic librarians -- some prominent in their field, others less well-known, and perhaps even a couple of them anonymous. No one participant would be under pressure to generate fresh insights every day or two. By pooling resources, such a group could strike terror in the hearts of budget-cutting administrators, price-gouging journal publishers, and even the occasional professor prone to associating academic stardom with aristocratic privilege.
Full disclosure: I am married to a librarian, albeit a non-academic one, who knew about the World Wide Web (and the proper grammar for using various search engines) long before most people did. She has proven to me, time and again, that librarians do indeed possess amazing powers. They also tend to have a lot to say about the bureaucracies that employ them -- and the patrons who patronize them.
An outspoken, incisive, and timely stream of commentary on the problems and possibilities facing academic libraries would enliven and enrich the public discourse. If anything, it's long overdue.
Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. One of his previous columns was on the pleasures of reading encyclopedias.
Pierre Bourdieu had a way of getting under one's skin. I don't mean his overtly polemical works -- the writings against globalization and neoliberalism that appeared toward the end of his life, for example. He was at his most incisive and needling in works that were much less easy for the general public to read. Bourdieu's sociological research kept mapping out the way power, prestige, and exclusion work in the spheres of politics, economics, and culture.
He was especially sharp (some thought brutal) in analyzing the French academic world. At the same time, he did very well in that system; very well indeed. He was critical of the way some scholars used expertise in one field to leverage themselves into positions of influence having no connection with their training or particular field of competence. It could make him sound like a scold. At the same time, it often felt like Bourdieu might be criticizing his own temptation to become an oracle.
In the course of my own untutored reading of Bourdieu over the years, there came a moment when the complexity of his arguments and the aggressiveness of his insights suddenly felt like manifestations of a personality that was angry on the surface, and terribly disappointed somewhere underneath. His tone registered an acute (even an excruciating) ambivalence toward intellectual life in general and the educational system in particular.
Stray references in his work revealed glimpses of Bourdieu as a "scholarship boy" from a family that was both rural and lower-middle class. You learned that he had trained to be a philosopher in the best school in the country. Yet there was also the element of refusal in even his most theoretical work -- an almost indignant rejection of the role of Master Thinker (played to perfection in his youth by Jean-Paul Sartre) in the name of empirical sociological research.
There is now a fairly enormous secondary literature on Bourdieu in English. Of the half-dozen or so books on him that I've read in the past few years, one has made an especially strong impression, Deborah Reed-Danahay's recent study Locating Bourdieu (Indiana University Press, 2005). Without reducing his work to memoir, she nonetheless fleshes out the autobiographical overtones of Bourdieu's major concepts and research projects. (My only complaint about the book is that it wasn't published 10 years ago: Although it is a monograph on his work rather than an introductory survey, it would also be a very good place for the new reader of Bourdieu to start.)
Reed-Danahay is a professor of anthropology at the University of Texas at Arlington. She recently answered a series of questions by e-mail.
Q: Bourdieu published sociological analyses of the Algerian peasantry, the French academic system, the work of Martin Heidegger, and patterns of attendance at art galleries -- to give only a very incomplete list. Yet his work seems much more focused and coherent than a catalog of topics would suggest. Can you sum up the gist of his work, or rather how it all holds together?
A: Yes, I agree that, at first glance, Bourdieu's work seems to cover a seemingly disparate series of studies. When I read Bourdieu's work on education in France after first being exposed to his Algerian peasant studies in my graduate work in anthropology, I wondered if this was the same person. But when the entire corpus is taken together, and when one carefully reads Bourdieu's many texts that returned to themes brought up earlier in his work, one can see several underlying themes and recurring intellectual questions.
One way to get a handle on his work is to realize that Bourdieu was interested in explaining social stratification, and the hierarchy of social values, in contemporary capitalist societies. He wanted to study systems of domination in a way that held some room for social agency but without a notion of complete individual freedom. Bourdieu often evoked Althusser as an example of a theorist who had too mechanical a view of internalized domination, while Sartre represented the opposite extreme of a philosopher who posited free will.
Bourdieu believed that we are all constrained by our internalized dispositions (our habitus), deriving from the milieu in which we are socialized, which influence our world view, values, expectations for the future, and tastes. These attributes are part of the symbolic or cultural capital of a social group.
In a stratified society, a higher value is associated with the symbolic capital of members of the dominant sectors versus the less dominant and "controlled" sectors of society. So that people who go to museums and like abstract art, for instance, are expressing a form of symbolic capital that is more highly valued than that of someone who either rarely goes to museums or who doesn't like abstract art. The person feels that this is "just" a matter of taste, but this can have important consequences for children at school who have not been exposed to various forms of symbolic capital by their families.
Bourdieu studied both social processes (such as the French educational system, Algerian postcolonial economic dislocations, or the rural French marriage system), and individual figures and their social trajectories -- including Heidegger, Flaubert, and an Algerian worker. Bourdieu was trying to show how the choices these people made (and he often wrote of choices that were not really choices) were expressions of the articulation of habitus and the social field in which it is operating.
Q: Something about his career always seemed paradoxical. Sartre was always his worst-case reference, for example. But by the time of his death in 2002, Bourdieu was regarded as the person who had filled Sartre's shoes. Has your work given you a sense of how to resolve this seeming contradiction?
A: There is a lot of silence in Bourdieu's work on the ways in which he acquired power and prestige within the French academic system or about how he became the most famous public intellectual in France at the time of his death. He was more self-reflexive about earlier periods in his life. I have trouble defending Bourdieu in this contradiction about his stance toward the public intellectual, even though I applaud his political engagements.
I think that Bourdieu felt he had more authority to speak to some of these issues than did other academics, given his social origins and empirical research in Algeria and France among the underclass. Bourdieu repeatedly positioned the sociologist (and one can only assume he meant himself here, too) as having a privileged perspective on the "reality" of systems of domination.
Bourdieu was very critical of Sartre for speaking out about the war in Algeria and for championing a sort of revolutionary spirit among Algerians. Bourdieu accused him of trying to be a "total intellectual" who could speak on any topic and who did not understand the situation in Algeria as profoundly as did the sociologist-ethnologist Bourdieu. When Bourdieu himself became more visible in his own political views (particularly in attacks against globalization and neo-liberalism), he does seem to have acted like the "journalist"-academics he lampooned in Homo Academicus. Nevertheless, when he was criticizing (in his essay On Television) what he saw as the necessity for "fast thinking" on television talk shows in France, where talking heads must quickly have something to say about anything, Bourdieu did (in his defense) refrain from pontificating about any and everything.
There is still a huge controversy raging in France about Bourdieu's political engagements. His detractors vilify him for his attacks against other intellectuals and journalists while he became a public intellectual himself. His defenders have published a book of his political writings (Interventions, 1961-2001) seeking to show his long-standing commitments, and they continue to guard his reputation beyond the grave.
I cannot help but think that Bourdieu's public combative persona, and his (in his own terms) refusals and ruptures, helped rather than thwarted his academic career. The degree to which this was calculated or (as he claimed) was the result of the "peasant" habitus he acquired growing up in southwestern France, is unknown.
Q: So much of his analysis of academic life is focused on the French university system that there is always a question of how well it could apply elsewhere. I'm curious about your thoughts on this. What's it been like to move between his concepts and models and your own experience as an American academic?
A: I see two ways to answer your question. Certainly, in the specifics, French academia is very different. I have experienced that directly. My own American cultural values of independence (which may, I am aware, be a total illusion) conflict with those of many French academics.
When I first arrived in France to do my dissertation fieldwork, I came with a grant that opened some doors to French academia, but I had little direct sponsorship by powerful patrons in the U.S. I was doing a project that had little to do with the work of my professors, none of whom had done research in France or Europe, and it was something that I had come up with on my own. This was surprising to the French, who were familiar with a patron-client system of professor/student relations. Most of the graduate students I met in France were involved in projects related to the work of their professors.
French academia, still centralized in Paris despite attempts at decentralization, is a much smaller universe than that of the vast American system. There is little room for remaining "outside" of various polemics there. I've learned, for instance, that some people whom I like and admire in France hated Bourdieu and that Bourdieu followers tend to be very fierce in their defense of him and want to promote their view of his work.
This is not to say that American academia doesn't have similar forces operating, but there are multiple points of value and hierarchy here. Whereas Bourdieu could say that Philosophy dominated French academia during the mid-20th century, it is harder to pinpoint one single dominant intellectual framework here.
I do, however, feel that Bourdieu's critique of academia as part of a larger project of the study of power (which he made very explicit in The State Nobility) is applicable beyond France. His work on academia provided us with a method of inquiry to look at the symbolic capital associated with academic advancement and, although the specific register of this will be different in different national contexts, the process may be similar. Just as Bourdieu did in France, for example, one could study how it is that elite universities here "select" students and professors.
Q: We have a memoir of Sartre's childhood in The Words. Is there anything comparable for Bourdieu?
A: Bourdieu produced self-referential writings that began to appear in the late 1990s, with "Impersonal Confessions" in Pascalian Meditations (1997), a section called "Sketch for a Self-Analysis" in his final lectures to the Collège de France, Science of Science and Reflexivity (2001), and then the stand-alone volume Esquisse pour une Auto-Analyse, published posthumously in 2004. [Unlike the other titles listed, this last volume is not yet available in English. -- S.M.]
A statement by Bourdieu that "this is not an autobiography" appears as an epigraph to the 2004 essay. I find his autobiographical writings interesting because they show us a bit about how he wanted to use his own methods of socio-analysis on himself and his own life, with a focus particularly on his formative years -- his childhood, his education, his introduction to academia, and his experiences in Algeria.
Bourdieu was uncomfortable with what he saw as narcissism in much autobiography, and also was theoretically uncomfortable with life stories that stressed the individual as hero without sufficient social analysis. He had earlier written an essay on the "biographical illusion" that elaborated on his biographical approach, but without self-reference. These essays are not, then, autobiographical in the conventional sense of a linear narrative of a life. Bourdieu felt that a truly scientific sociology depended on reflexivity on the part of the researcher, and by this he meant being able to analyze one's own position in the social field and one's own habitus.
At the same time, Bourdieu's auto-analysis was a defensive move meant to preempt his critics. Bourdieu included a section on self-interpretation in his book on Heidegger, in which he referred to it as "the riposte of the author to those interpretations and interpreters who at once objectify and legitimize the author, by telling him what he is and thereby authorizing him to be what they say he is..." (101). As Bourdieu became increasingly a figure in the public eye and increasingly a figure of analysis and criticism, he wanted to explain himself and thus turned to self-interpretation and auto-analysis.
Q: In a lot of ways, Bourdieu seems like a corrosive thinker: someone who strips away illusions, rationalizations, the self-serving beliefs that institutions foster in their members. But can you identify a kernel of something positive or hopeful in his work -- especially in regard to education? I'd like to think there is one....
A: Bourdieu had little to say about how schools and universities operate that is positive, and he was very critical of them. The hopeful kernel here is that in understanding how they operate, how they inflict symbolic violence and perpetuate the illusions that enable systems of domination, we can improve educational institutions.
Bourdieu felt strongly that by de-mystifying the discourses and aura of authority surrounding education (especially its elite forms), we can learn something useful. The trick is how to turn this knowledge into power, and Bourdieu did not have any magical solutions for this. That is work still to be done.
The other result was a string of advertisements for online services offering to hook you up with married people in your area. For now, anyway, those ads have disappeared -- perhaps as the result of some tweak in Google's famous algorithms. In any case, they came as a surprise. But my wife (who has forgotten more about search engines than I will ever know) rolled her eyes and said, "I knew it was going to happen when you named the column that."
Ex post facto, it does seem obvious. After all "intellectual" doesn't count for much, product-placement-wise. In the American vernacular, it is a word usually accompanied by such modifiers as "pseudo" and "so-called" (just as the sea in Homer is always described as "wine-dark"). No doubt the Google algorithm, if tweaked a bit more, will one day lead you right to the personals ads for the New York Review of Books. For now, at least, the offers for a carnal carnival cruise are gone.
Meanwhile, Inside Higher Ed has now launched a page with a running list of Intellectual Affairs columns from February to the present. It has more than three dozen items, so far -- an assortment of essays, interviews, causeries, feuilletons, and uncategorizable thumbsuckers ... all in one central location, suitable for bookmarking.
It's also worth mentioning that Inside Higher Ed itself now offers RSS and XML feeds. (The editors are too busy or diffident to announce this, but some public notice of it is overdue.) To sign up, go to the home page and look for the buttons at the bottom.
This might also be a good time to invite readers to submit tips for Intellectual Affairs -- your thoughts on subjects to cover, books to examine, arguments to follow, people to interview. This column will strive, in coming months, to be equal parts Denis Diderot and Walter Winchell. Your brilliant insights, unconfirmed hunches, and unsubstantiated hearsay are more than welcome. (Of course, that means I'll have to go confirm and substantiate them, but such is the nature of the gig.) Direct your mail here.
Word has it that IA is going to be tapped by Emily Gordon, of the Emdashes blog, for the current "book meme" -- a circulating questionnaire that invites participants to list what they've bought and read lately.
As you may recall, the field of memetics came into a certain short-lived prominence some years ago -- one of those cases of an extended metaphor morphing into something like a school of thought. It rested on an analogy between ideas and genetic material. Concepts, ideologies, and trends were self-replicating "memes" that propagated themselves by spreading through cultural host populations, like mononucleosis at a rave.
Memetics itself hasn't had all that much staying power; it seems, by and large, to have gone the way of the Y2K survival kit. But the term "meme" has, paradoxically enough, proven much hardier -- particularly in the blogosphere. (One theory is that it appeals to bloggers because it has "me" in it, twice.)
So anyway, my responses to the meme, forthwith.
How many books have you owned?
This I cannot answer with any confidence. At present, I have somewhere between three and four thousand. Over the years, I have made regular efforts to clear room by selling or giving large numbers of books away. A few months ago, for example, I parted with about a thousand of them.
Such a purge feels good, once the initial hesitation is overcome. There is even a kind of giddiness, as the herd begins to thin. But afterwards, I always have pangs of regret. When you need it, a missing title is like a phantom limb. There's a maddening and persistent itch you can no longer scratch.
The remaining volumes are arranged alphabetically by author's name -- with certain exceptions. For example, there is about half a shelf containing collections of papers on postcolonialism, post-Fordism, postmodernism, and poststructuralism. These are planted right next to some titles by the neoconservative writer Norman Podhoretz. (I like to imagine that these books make each other really uncomfortable.)
What is the last book you bought?
That would be a set of eight rather hefty pamphlets called The Key to Love and Sex (1928) by Joseph McCabe, whose 40-volume series The Key to Culture (1927) is proving somewhat more difficult to collect. I am also awaiting the arrival of McCabe's autobiography Eighty Years a Rebel, first published in 1947 -- and, like most of his work, long since out of print.
McCabe is now all but completely forgotten, but in his prime he was a force of nature. Born into a working-class family in Manchester, England, McCabe entered a Franciscan monastery at the age of 15 and became a professor of philosophy for the order. After years of private dialectical wrangling, he concluded that God did not exist. That meant starting over at the age of 27. Whatever anguish his years as a monk cost him, the experience left McCabe with a command of languages and powers of concentration that almost defy imagining.
Apart from translating about 30 volumes of literary, scientific, historical, and philosophical work, McCabe wrote a staggering array of books and pamphlets. Many of his books were works of popularization, but several were specialized works of scholarship in their own right. He was also a tireless lecturer and debater -- as aggressive a spokesman for secular rationality as one can imagine.
H.G. Wells called McCabe "the trained athlete of disbelief." That was, if anything, an understatement. For an admiring (indeed, almost hagiographical) account of his life and work, see Bill Cooke's recent biographical study A Rebel to His Last Breath: Joseph McCabe and Rationalism (Prometheus Books).
Name five books that mean a lot to you.
The list would come out differently with a little more thought. But off the top of my head, and in chronological order of their discovery, I'd name:
The Modern Library edition of Franz Kafka's short stories. How this ended up in a rural high school in the Bible Belt is still something of a mystery. I started reading Kafka in the best possible way -- that is, with no idea who he was, or what reputation he might have. (Of course, he had no reputation at all in East Texas, come to think of it.) This made the shock of revelation that much more keen.
Susan Sontag, Against Interpretation. Same high school, different year. (Escape plans forming.) Apart from the cool grace of her own style, Sontag's early essays provided a reading list of figures such as Barthes, Benjamin, Lukacs, and Foucault. It was a much more lively engagement with their concerns than anything I've encountered at, say, MLA over the years.
Seymour Krim, Views of a Nearsighted Cannoneer. Part old-fashioned New York intellectual, part Beat hipster, Seymour Krim wrote this batch of critical and personal essays in the 1950s, long before the term "creative nonfiction" made its inexplicable appearance on the creative-writing scene. (Nothing he published afterwards was even half as good.) I'm almost reluctant to mention this volume in public. Rereading Views has been an occasional private ritual of mine for almost 25 years.
Kenneth Burke, Permanence and Change. There is scarcely a word that begins to describe this book from the 1930s. Burke tried to fuse Marx, Nietzsche, Freud, Veblen, George Herbert Mead, and who knows what else into a theory that would help him understand what was going on (1) in the world at large and (2) between his own ears. (Not long after the Great Depression started, his marriage disintegrated. Burke had a lot to theorize about, in other words.) This is in the category of books that I keep around in two copies: one filled with annotations, the other kept unmarked for reading without distraction.
C. L. R. James, Spheres of Existence. This anthology of political and cultural writings (the last of three volumes of James's selected works) was my first introduction to a revolutionary activist, historian, and thinker whose legacy only looks more important with time. (For a biographical overview, go here: www.mclemee.com/id84.html.) Since encountering Spheres, I've spent years on research into James's life and work, and edited a couple of volumes of his writings, neither of which was half as good as Spheres. It really ought to be back in print.
What are you reading now?
Various things by and about Joseph McCabe, that freethinking workaholic. Not long ago, I bought a pamphlet by McCabe called The Meaning of Existentialism (1946). At lunch last week with some mutual friends, I showed it to Henry Farrell of Crooked Timber, who seemed amused by the work's long, explanatory subtitle: "The New Philosophy, Founded by Sartre, That Has Made Quick Progress Among the Volatile Young Men of Paris' Latin Quarter."
After hearing my thumbnail biographical lecture, Henry started reading the booklet. After a couple of pages, he looked up and said, with evident surprise, "This is pretty good! It isn't hackwork."
Indeed not. As an overview of Sartre's thought, it was probably as good as anything in English at the time. And McCabe was nearly 80 years old when he wrote it. (He was down to publishing only a few hundred thousand words a year.)
That's why I have adopted a new motto to get through each day. It's short, simple, and easy to remember: "What would Joseph McCabe do?"
The nature of this sort of meme is that I should, at this point, invite four or five people to answer the same questions. Please note the comments field below, and consider yourself invited.
One sign of the great flexibility of American English -- if also of its high tolerance for ugliness -- is the casual way users will turn a noun into a verb. It happens all the time. And lately, it tends to be a matter of branding. You "xerox" an article and "tivo" a movie. Just for the record, neither Xerox nor TiVo is very happy about such unauthorized usage of its name. Such idioms are, in effect, a dilution of the trademark.
Which creates an odd little double bind for anyone with the culture-jamming instinct to Stick It To The Man. Should you absolutely refuse to give free advertising to either Xerox or TiVo by using their names as verbs, you have actually thereby fallen into line with corporate policy. Then again, if you defy their efforts to police ordinary language, that means repeating a company name as if it were something natural and inevitable. See, that's how they get ya.
On a less antiglobalizational note, I've been trying to come up with an alternative to using "meme" as a verb. For one thing, it is too close to "mime," with all the queasiness that word evokes.
As discussed here on Tuesday, meme started out as a noun implying a theory. It called to mind a more or less biological model of how cultural phenomena (ideas, fads, ideologies, etc.) spread and reproduce themselves over time. Recently the term has settled into common usage -- in a different, if related, sense. It now applies to certain kinds of questionnaires or discussion topics that circulate within (and sometimes between) blogospheric communities.
There does not seem to be an accepted word to name the creation and initial dissemination of a meme. So it could be that "meme" must also serve, for better or worse, as a transitive verb.
In any case, my options are limited.... Verbal elegance be damned: Let's meme.
The ground rules won't be complicated. The list of questions is short, but ought to yield some interesting responses. With luck, the brevity will speed up circulation.
In keeping with meme protocol, I'll "tap" a few bloggers to respond. Presumably they will do likewise. However, the invitation is not restricted to that handful of people: This meme is open to anyone who wants to participate.
So here are the questions:
(1) Imagine it's 2015. You are visiting the library at a major research university. You go over to a computer terminal (or whatever it is they use in 2015) that gives you immediate access to any book or journal article on any topic you want. What do you look up? In other words, what do you hope somebody will have written in the meantime?
(2) What is the strangest thing you've ever heard or seen at a conference? No names, please. Refer to "Professor X" or "Ms. Y" if you must. Double credit if you were directly affected. Triple if you then said or did something equally weird.
(3) Name a writer, scholar, or otherwise worthy person you admire so much that meeting him or her would probably reduce you to awestruck silence.
(4) What are two or three blogs or other Web sites you often read that don't seem to be on many people's radar? Feel free to discard anything you don't care to answer.
To get things started, I'm going to tap a few individuals -- people I've had only fairly brief contact with in the past. As indicated, however, anyone else who wants to respond is welcome to do so. The initial list:
An afterthought on the first question -- the one about getting a chance to look things up in a library of the future: Keep in mind the cautionary example of Enoch Soames, the minor late-Victorian poet whose story Max Beerbohm tells. He sold his soul to the devil for a chance to spend an afternoon in the British Library, 100 years in the future, reading what historians and critics would eventually say about his work.
Soames ends up in hell a little early: The card catalog shows that posterity has ignored him even more thoroughly than his contemporaries did.
Proof, anyway, that ego surfing is really bad for you, even in the future. A word to the wise.
I only saw her out of the corner of my eye as I rushed into the book exhibit at the conference, but I was sure I knew her. Her face registered as out of context, somehow, but familiar. A second later, I realized it was one of my students, a recent English-major graduate of the liberal arts college at which I teach. I stopped, turned around and called to her.
She was pleased to see me. She’s a marketing assistant for a major academic publishing house, it turns out. I could tell she was proud of her job, pushing English composition and literature texts to English professors like me. We arranged to meet for dinner the next day, two professionals on a business trip.
I stopped by the marketing assistant’s exhibit while she was out at lunch, and her colleagues were eager to find out how she had been in my classes. "She must have been a great student, huh?" one of her colleagues prompted me. Hmm. She had been solid, reliable, a good writer, and she always had something interesting to say in class, but the marketing assistant had not been one of our stars. Still, none of our stars of recent years had jobs like hers, working with literature.
Clearly her co-workers loved her. They spoke very fondly of her, and, indeed, she seemed to be very good at her job. What I hadn’t noticed in the classroom was the key quality that was working for the marketing assistant in the world after college: not her knowledge of literature but her skills with people. This I discovered very quickly the next evening at dinner.
I had already had a date for dinner that night with a friend of mine, a fiction writer, so I asked the marketing assistant if I could bring him along. "Sure," she said. "I can expense it. I’m just taking two English professors out." A new verb for me: to expense. I liked it.
She quickly took charge of the expedition, finding good restaurants and putting her name on the waiting list of one while we searched for another (Why had I never thought of that? I guess it’s not really cheating).
The marketing assistant had always been ready with an answer in class, but we’d never actually talked much about anything other than Victorian literature. Turns out she’s pretty funny, and very professional. She told great stories, often at the expense of some poor academic schmuck who stopped by her booth, intent on pitching his or her latest project. I felt sorry for the folks she described, but not because she mocked them -- she didn’t; she described them quite affectionately, as if she knew they couldn’t help themselves. The fiction writer and I shook our heads with her when she described the guy whose project was so impossibly narrow that no academic press would ever publish it. We chuckled along, though less heartily, when she wondered aloud at the fashion sense of the professoriate.
"When you look around the gate at the airport, you can always tell who’s going to the same conference you are," the marketing assistant said. Of course, we could, too, and the fiction writer and I had already had that obligatory conversation, this being his first professional conference. But it was different hearing it from the perspective of the marketing assistant. After all, as a friend of mine said ruefully, gazing around the lobby of one of the convention hotels a few years ago, “These are my people.”
When the marketing assistant got to the social skills of professors, we felt ourselves on relatively safe ground. The fiction writer has a fabulous, wry sense of humor and is good to have at parties, and I have always prided myself on being able to talk to anybody. We are not nerdy bookworms -- we both went to our proms. I snickered at her description of awkward social interactions she’d observed between academics. “It’s amazing you guys found people to get hooked up with,” she declared good-naturedly -- in her eyes we were no different from the guy we had just seen mumbling to himself as he wandered through the book exhibit. Maybe we weren’t. Maybe these really were our people.
They weren’t her people; that’s for sure. The marketing assistant had perspective on our folk that we clearly didn’t have. And that made it really fun to talk to her. I enjoyed seeing her in her professional persona. I was proud of her, glad one of our grads seemed to be heading for a successful career in publishing. But seeing her made me realize that I may not be the best assessor of my students’ skills.
Although the marketing assistant is great at her job, I would not have been able to predict that. When I look at my students, I realize, I have always concentrated on particular skills that are not necessarily the ones that will serve them best after college. Who writes the best? Whose research is most thorough? Whose reading of the novel is the most subtle? Not the most marketable skills, though they will get you into graduate school.
The marketing assistant is the same young woman she was when she was in my classroom. But much of her incisive observation, her wit, her distanced assessment and clever summing-up had passed me by when she was in college. What a letter of recommendation I could write for her now. Of course, she doesn’t need one anymore. She’s already moved on.
Paula Krebs is professor of English at Wheaton College, in Massachusetts.