Right after 9/11, the obituaries started to appear: Irony, the reports said, was dead. Either that or in really bad condition.
It had been a very 1990s thing, this irony. Never before in human history had so many people so often used that two-handed gesture to inscribe quotation marks in the air. Or pronounced the word really with an inflection conveying the faux enthusiasm that doubled as transparent contempt (as in: "I really like that new Britney Spears single"). The manner had been forged in earlier times -- by pioneers at the Harvard Lampoon, for example. But it really caught on during the cold peace that followed the Cold War. Suddenly, irony became available to everyone, on the cheap. It was the wit of the witless, the familiar smirk beneath the perpetually raised eyebrow.
And then it died. Hard realities broke through the callow veneer of detachment. Everybody became very earnest. And then America entered its present golden age of high seriousness.
Oh, no, wait.... That last bit never actually happened. The rest of the story is familiar enough, though. So much so, in fact, that I am reluctant to note my own recent suspicion that, after all, it's more or less true. Irony really is dead.
It's not just that irony is a much richer notion than sarcasm. Broadly defined, it means the coexistence of two radically counterposed (even mutually contradictory) meanings within the same utterance. The simplest case would be saying, "What a beautiful day!" in the middle of a hurricane. But the subtler kinds spin out into infinity....
There is the irony of Plato's dialogues, where men who are very sure of their own competence try to explain things to Socrates (who says that he knows nothing, yet quickly, through simple questions, ties their arguments into the Athenian equivalent of pretzels). There is dramatic irony, in which action on stage means one thing for the characters and something very different for the audience. And let's not even get started on where the German philosophers went with it -- beyond noting that it turned into something like the essence of art, consciousness, and human existence.
I'm not saying that there is no connection at all between the Philosophical Fragments of Friedrich Schlegel and the camp value of listening to The Carpenters' Greatest Hits. Actually, they go together pretty well, if you're in the right mood. (As Schlegel put it: "For a man who has achieved a certain height and universality of cultivation, his inner being is an ongoing chain of the most enormous revolutions." So you might start out feeling all ironic about Karen Carpenter, then end up overwhelmed by her voice.)
But that just makes it all the more sad to realize that the rumors are true. Irony is now extinct, or at least in a coma. I got the evidence last week and have been bummed ever since.
The proverbial lightbulb over the head went on while reading American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer by Kai Bird and Martin J. Sherwin, published last month by Alfred A. Knopf. It is a massive book, the product of about a quarter century of research into the physicist's ascent to power and his martyrdom under the original wave of McCarthyism.
It is an absorbing book. The authors are both distinguished, and the story they tell is almost unnerving in its contemporary resonances. My reviewer's copy now has the usual marks in the margin to highlight various passages that made a strong impression.
But flipping back through it now, I find the record of another kind of readerly response. At some point, the authors begin applying the word "ironic," in its various forms, to situations occurring in Oppenheimer's life. And astonishingly enough, it seems that the authors never manage to use it in a meaningful way.
For example, in the spring of 1934, Oppenheimer earmarks three percent of his salary to help German physicists who are fleeing the Nazis. "Ironically," write Bird and Sherwin, "one of the refugees who may have been assisted by this fund was [Oppenheimer's] former professor in Göttingen, Dr. James Franck." (Here, it appears that they think "irony" means either "coincidentally" or "oddly enough.")
During the Depression, Oppenheimer's wife had been a Communist who, "ironically, survived on government relief checks of $12.50." (Ayn Rand living on welfare -- now that would be ironic. But an unemployed left-winger?)
Other examples could be offered. In no case that I recall do Bird and Sherwin use the word in anything like an appropriate way. Which is all the more striking because Oppenheimer's story is thick with ironies. For example, during the McCarthy years, his effort to rebuff a Soviet agent's attempt to recruit him as a spy in the early 1940s turns into the "proof" that he was disloyal. It is a reversal worthy of Sophocles -- a situation that is profoundly ironic.
Not that the authors ever use the word in that (for once, appropriate) way. Instead, we stay trapped in that Alanis Morissette song from the mid-1990s:
It's like rain on your wedding day
It's a free ride when you've already paid
It's the good advice that you just didn't take...
And isn't it ironic ... don't you think?
To which the answer is, of course, "No." Such things are not ironic in any sense. (Inconvenient, yes. Ironic, no.)
Now, it could be that I'm overreacting. Maybe the fact that two intelligent and capable writers -- in a major book, on an important topic, published by one of the country's top presses -- end up sounding like Alanis Morissette is not as worrisome as it seems. Perhaps irony is not dead after all?
Either that, or we need to define the word in a new way. "Ironic, adj., of or pertaining to a situation involving no irony whatsoever."
People who met Aldous Huxley would sometimes notice that, on any given day, the turns of his conversation would follow a brilliant, unpredictable, yet by no means random course. The novelist might start out by mentioning something about Plato. Then the discussion would drift to other matters -- to Poe, the papacy, and the history of Persia, followed by musings on photosynthesis. And then, perhaps, back to Plato.
So Huxley's friends would think: "Well, it's pretty obvious which volume of the Encyclopedia Britannica he was reading this morning."
Now, it's a fair guess that whoever recounted that story (to the author of whichever biography I read it in) meant to tell it at Huxley's expense. It's not just that it makes him look like an intellectual magpie, collecting shiny facts and stray threads of history. Nor even that his erudition turns out to be pre-sorted and alphabetical.
Rather, I suspect the image of an adult habitually meandering through the pages of an encyclopedia carries a degree of stigma. There is a hint of regression about it -- if not all the way back to childhood, at least to preadolescent nerdishness.
If anything, the taboo would be even sterner for a fully licensed and bonded academic professional.
Encyclopedia entries are among the lowest forms of secondary literature. Very rare exceptions can be made for cases such as Sigmund Freud's entry on "Psychoanalysis" in the 13th edition of the Britannica, or Kenneth Burke's account of his own theory of dramatism in The International Encyclopedia of the Social Sciences. You get a certain amount of credit for writing for reference books -- and more for editing them. And heaven knows that the academic presses love to turn them out. See, for example, The Encyclopedia of Religion in the South (Mercer University Press), The Encyclopedia of New Jersey (Rutgers University Press) and The International Encyclopedia of Dance (Oxford University Press), not to mention The Encyclopedia of Postmodernism (Routledge).
It might be okay to "look something up" in an encyclopedia or some other reference volume. But read them? For pleasure? The implication that you spend much time doing so would be close to an insult -- a kind of academic lèse-majesté.
At one level, the disdain is justified. Many such works are sloppily written, superficial, and/or hopelessly unreliable. The editors of some of them display all the conscientiousness regarding plagiarism one would expect of a failing sophomore. (They grasp the concept, but do not think about it so much as to become an inconvenience.)
But my hunch is that social pressure plays a larger role in it. Real scholars read monographs! The nature of an encyclopedia is that it is, at least in principle, a work of popularization. Probably less so for The Encyclopedia of Algebraic Topology, assuming there is one. But still, there is an aura of anti-specialization and plebeian accessibility that seems implicit in the very idea. And there is something almost Jacobin about organizing things in alphabetical order.
Well then, it's time. Let me confess it: I love reading encyclopedias and the like, at least in certain moods. My collection is not huge, but it gets a fair bit of use.
Aside from still-useful if not cutting-edge works such as the four-volume Encyclopedia of Philosophy (Macmillan, 1967) and Eric Partridge's indispensable Short Etymological Dictionary of Modern English Origins (Macmillan, 1958), I keep at hand any number of volumes from Routledge and Blackwell offering potted summaries of 20th-century thinkers. (Probably by this time next year, we'll have the 21st-century versions.)
Not long ago, for a ridiculously small price, I got the four paperbound volumes of the original edition of the Scribners Dictionary of the History of Ideas, first published in 1973 -- the table of contents of which is at times so bizarre as to seem like a practical joke. There is no entry on aesthetics, but one called "Music as Demonic Art" and another called "Music as a Divine Art." An entry called "Freedom of Speech in Antiquity" probably ought to be followed by something that brings things up to more recent times -- but no such luck.
The whole thing is now available online, with its goofy mixture of the monographic ("Newton's Opticks and Eighteenth Century Imagination") and the clueless (no entries on Aristotle or Kant, empiricism or rationalism). But somehow the weirdness is more enjoyable between covers.
And then, of course, there is the mother of them all: the Encyclopedia, or Rational Dictionary of the Sciences, Arts, and Crafts that Denis Diderot and friends published in the 1750s and '60s. Aside from a couple of volumes of selections, I've grabbed every book by or about Diderot in English that I've ever come across.
Diderot himself, appropriately enough, wrote the entry for "Encyclopedia" for the Encyclopedia.
The aim of such a work, he explained, is "to collect all the knowledge scattered over the face of the earth, to present its general structure to the men with whom we live, and to transmit this to those who will come after us, so that the work of past centuries may be useful to the following centuries, that our children, by becoming more educated, may at the same time become more virtuous and happier, and that we may not die without having deserved well of the human race."
Yeah! Now that's something to shoot for. It even makes reading encyclopedias seem less like a secret vice than a profound obligation.
And if, perchance, any of you share the habit -- and have favorite reference books that you keep at hand for diversion, edification, or moral uplift -- please pass the titles along below....
Some months back, one of the cable networks debuted a movie -- evidently the pilot for a potential show -- that inspired brief excitement in some quarters, though it seems not to have caught on. Its central character was someone whose grasp of esoteric knowledge allowed him or her (I'm not sure which, never having seen it) to command the awesome mysterious forces of the universe. Its title was The Librarian.
The program was, it seems, a reworking of a similar figure in Buffy the Vampire Slayer. That's in keeping with the fundamental law of the entertainment industry once defined by Ernie Kovacs, the great American surrealist TV pioneer: "Find something that works, then beat it to death."
At another level, though, the whole concept derived from a tradition that is pre-television, indeed, almost pre-literate. The idea that a command of books provides access to secret forces, the equation of the scholar with the magus, was already well established before Faust and Prospero worked their spells. The linkage has also left its trace at the level of the signifier. Both glamor, originally meaning a kind of witchy sex appeal, and grimoire, the sorcerer's reference book, derive from the word grammar -- one of the foundational disciplines of medieval learning, hence a source of power.
Today, it's much rarer to find the whole knowledge/power nexus treated in such explicitly occultic terms, at least outside pop culture. As for librarians, they are usually regarded as professionals working in the service sector of the information economy, rather than as full-fledged participants in contemporary intellectual life. That is, arguably, an injustice. But the division of labor and the logic of hierarchical distinctions have changed a lot since the day when Gottfried Leibniz (philosopher, statesman, inventor of calculus and the computer, and overall polymathic genius) held down his day job running a library.
The most persistent aspect of the old configuration is probably the link between glamor and grammar -- the lingering aura of bookish eroticism. At least that's what the phenomenon of librarian porn would suggest. The topic deserves more scholarly attention, though an important start has been made by Daniel W. Lester, the network information coordinator for Boise State University in Idaho. His bibliography of pertinent livres lus avec une seule main ("books read with one hand") is not exhaustive, but the annotations are judicious. About one such tale of lust in the stacks, he writes: "Most of the library and librarian descriptions are reasonable, except for the number of books on a book cart."
But the role librarians play at the present time brings them closer to the most pressing issues in American cultural life than any cheesy TV show (or letter to Penthouse, for that matter) could possibly convey.
Their work constitutes the real intersection of knowledge and power -- not as concepts to be analyzed, but at the level of almost nonstop practical negotiation. It is the cultural profession most involved, from day to day, with questions concerning public budgets, information technology, the cost of new publications, and intellectual freedom. (On the latter, check out the American Library Association's page on the Patriot Act.)
Given all that, I've been curious to find out about discussions by academic librarians regarding current developments in their profession, in the university, and in the world outside. A collection of essays called The Successful Academic Librarian is due out this fall from Information Today, Inc. Its emphasis seems to fall on guidance in facing career demands. But how can an outsider keep up with what academic librarians are thinking about other issues?
Well, the first place to start is The Kept-Up Academic Librarian, the blog of Steven Bell, who is director of the Gutman Library at Philadelphia University. Bell provides a running digest of academic news, but for the most part avoids the kind of reflective and/or splenetic mini-essays one associates with blogdom.
My own effort to track down something more ruminative turned up a few interesting blogs lus avec une seule main run by librarians, such as this one. But this, while stimulating, was not quite on topic. So in due course I contacted Steven Bell, on the assumption that he was as kept-up as an academic librarian could be. Could he please name a few interesting blogs by academic librarians?
His answer came as a surprise: "When you ask specifically about blogs maintained by academic librarians," Bell wrote earlier this week, "the list would be short or non-existent."
He qualified the comment by noting the numerous gray areas. "There may be some academic librarians out there with an interesting blog, but in some cases I think the blogger is doing it anonymously and you don't really even know if the person is an academic librarian. For example, take a look at Blog Without a Library. I can't tell who this blogger is though I think he or she might be an academic librarian. On the other hand Jill Stover's Library Marketing blog is fairly new and pretty good, and she is an academic librarian -- but the blog really isn't specific to academic libraries.... Bill Drew of one of the SUNY libraries has something he calls BabyBoomer Librarian but it isn't necessarily about academic librarianship -- sometimes yes, but more often not."
Bell listed a few other blogs, including Humanities Librarian from the College of New Jersey. But very few of his suggestions were quite what I had in mind -- that is, public spaces devoted to thinking out loud about topics such as the much-vaunted "crisis in academic publishing." It was a puzzling silence.
"I can't say any individual has developed a blog that has emerged as the 'voice of academic librarianship,' " noted Bell in response to my query. "Why? If I had to advance a theory I'd say that as academic librarians we are still geared towards traditional, journal publishing as the way to express ourselves. I know that if I have something on my mind that I'd like to write about to share my thoughts and opinions, I'm more likely to write something for formal publication (e.g., see this piece.) Perhaps that is why we don't have a 'juicy' academic librarian out there who is taking on the issues of the day with vocal opinions."
And he added something that makes a lot of sense: "To have a really great blog you have to be able to consistently speak to the issues of the day and have great (or even good) insights into them -- and it just doesn't seem like any academic librarian out there is capable of doing that. I think there are some folks in our profession who might be capable of doing it. But if so they haven't figured out yet that they ought to be blogging, or maybe they just don't have the time or interest."
Now, that diagnosis may perhaps contain the elements of a solution. The answer might be the creation of a group blog for academic librarians -- some prominent in their field, others less well-known, and perhaps even a couple of them anonymous. No one participant would be under pressure to generate fresh insights every day or two. By pooling resources, such a group could strike terror in the hearts of budget-cutting administrators, price-gouging journal publishers, and even the occasional professor prone to associating academic stardom with aristocratic privilege.
Full disclosure: I am married to a librarian, albeit a non-academic one, who knew about the World Wide Web (and the proper grammar for using various search engines) long before most people did. She has proven to me, time and again, that librarians do indeed possess amazing powers. They also tend to have a lot to say about the bureaucracies that employ them -- and the patrons who patronize them.
An outspoken, incisive, and timely stream of commentary on the problems and possibilities facing academic libraries would enliven and enrich the public discourse. If anything, it's long overdue.
Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays. One of his previous columns was on the pleasures of reading encyclopedias.
Pierre Bourdieu had a way of getting under one's skin. I don't mean his overtly polemical works -- the writings against globalization and neoliberalism that appeared toward the end of his life, for example. He was at his most incisive and needling in works that were much less easy for the general public to read. Bourdieu's sociological research kept mapping out the way power, prestige, and exclusion work in the spheres of politics, economics, and culture.
He was especially sharp (some thought brutal) in analyzing the French academic world. At the same time, he did very well in that system; very well indeed. He was critical of the way some scholars used expertise in one field to leverage themselves into positions of influence having no connection with their training or particular field of competence. It could make him sound like a scold. At the same time, it often felt like Bourdieu might be criticizing his own temptation to become an oracle.
In the course of my own untutored reading of Bourdieu over the years, there came a moment when the complexity of his arguments and the aggressiveness of his insights suddenly felt like manifestations of a personality that was angry on the surface, and terribly disappointed somewhere underneath. His tone registered an acute (even an excruciating) ambivalence toward intellectual life in general and the educational system in particular.
Stray references in his work revealed glimpses of Bourdieu as a "scholarship boy" from a family that was both rural and lower-middle class. You learned that he had trained to be a philosopher in the best school in the country. Yet there was also the element of refusal in even his most theoretical work -- an almost indignant rejection of the role of Master Thinker (played to perfection in his youth by Jean-Paul Sartre) in the name of empirical sociological research.
There is now a fairly enormous secondary literature on Bourdieu in English. Of the half-dozen or so books on him that I've read in the past few years, one has made an especially strong impression, Deborah Reed-Danahay's recent study Locating Bourdieu (Indiana University Press, 2005). Without reducing his work to memoir, she nonetheless fleshes out the autobiographical overtones of Bourdieu's major concepts and research projects. (My only complaint about the book is that it wasn't published 10 years ago: Although it is a monograph on his work rather than an introductory survey, it would also be a very good place for the new reader of Bourdieu to start.)
Reed-Danahay is a professor of anthropology at the University of Texas at Arlington. She recently answered a series of questions by e-mail.

Q: Bourdieu published sociological analyses of the Algerian peasantry, the French academic system, the work of Martin Heidegger, and patterns of attendance at art galleries -- to give only a very incomplete list. Yet his work seems much more focused and coherent than a catalog of topics would suggest. Can you sum up the gist of his work, or rather how it all holds together?
A: Yes, I agree that, at first glance, Bourdieu's work seems to cover a seemingly disparate series of studies. When I read Bourdieu's work on education in France after first being exposed to his Algerian peasant studies in my graduate work in anthropology, I wondered if this was the same person. But when the entire corpus is taken together, and when one carefully reads Bourdieu's many texts that returned to themes brought up earlier in his work, one can see several underlying themes and recurring intellectual questions.
One way to get a handle on his work is to realize that Bourdieu was interested in explaining social stratification, and the hierarchy of social values, in contemporary capitalist societies. He wanted to study systems of domination in a way that held some room for social agency but without a notion of complete individual freedom. Bourdieu often evoked Althusser as an example of a theorist who had too mechanical a view of internalized domination, while Sartre represented the opposite extreme of a philosopher who posited free will.
Bourdieu believed that we are all constrained by our internalized dispositions (our habitus), deriving from the milieu in which we are socialized, which influence our world view, values, expectations for the future, and tastes. These attributes are part of the symbolic or cultural capital of a social group.
In a stratified society, a higher value is associated with the symbolic capital of members of the dominant sectors versus the less dominant and "controlled" sectors of society. Thus, people who go to museums and like abstract art, for instance, are expressing a form of symbolic capital that is more highly valued than that of someone who either rarely goes to museums or who doesn't like abstract art. The person feels that this is "just" a matter of taste, but this can have important consequences for children at school who have not been exposed to various forms of symbolic capital by their families.
Bourdieu studied both social processes (such as the French educational system, Algerian postcolonial economic dislocations, or the rural French marriage system), and individual figures and their social trajectories -- including Heidegger, Flaubert, and an Algerian worker. Bourdieu was trying to show how the choices these people made (and he often wrote of choices that were not really choices) were expressions of the articulation of habitus and the social field in which it is operating.
Q: Something about his career always seemed paradoxical. Sartre was always his worst-case reference, for example. But by the time of his death in 2002, Bourdieu was regarded as the person who had filled Sartre's shoes. Has your work given you a sense of how to resolve this seeming contradiction?
A: There is a lot of silence in Bourdieu's work on the ways in which he acquired power and prestige within the French academic system or about how he became the most famous public intellectual in France at the time of his death. He was more self-reflexive about earlier periods in his life. I have trouble defending Bourdieu in this contradiction about his stance toward the public intellectual, even though I applaud his political engagements.
I think that Bourdieu felt he had more authority to speak to some of these issues than did other academics, given his social origins and empirical research in Algeria and France among the underclass. Bourdieu repeatedly positioned the sociologist (and one can only assume he meant himself here, too) as having a privileged perspective on the "reality" of systems of domination.
Bourdieu was very critical of Sartre for speaking out about the war in Algeria and for championing a sort of revolutionary spirit among Algerians. Bourdieu accused him of trying to be a "total intellectual" who could speak on any topic and who did not understand the situation in Algeria as profoundly as did the sociologist-ethnologist Bourdieu. When Bourdieu himself became more visible in his own political views (particularly in attacks against globalization and neo-liberalism), he does seem to have acted like the "journalist"-academics he lampooned in Homo Academicus. Nevertheless, when he was criticizing (in his essay On Television) what he saw as the necessity for "fast thinking" on television talk shows in France, where talking heads must quickly have something to say about anything, Bourdieu did (in his defense) refrain from pontificating about any and everything.
There is still a huge controversy raging in France about Bourdieu's political engagements. His detractors vilify him for his attacks against other intellectuals and journalists while he became a public intellectual himself. His defenders have published a book of his political writings (Interventions, 1961-2001) seeking to show his long-standing commitments, and continue to guard his reputation beyond the grave.
I cannot help but think that Bourdieu's public combative persona, and his (in his own terms) refusals and ruptures, helped rather than thwarted his academic career. The degree to which this was calculated or (as he claimed) was the result of the "peasant" habitus he acquired growing up in southwestern France, is unknown.
Q: So much of his analysis of academic life is focused on the French university system that there is always a question of how well it could apply elsewhere. I'm curious about your thoughts on this. What's it been like to move between his concepts and models and your own experience as an American academic?
A: I see two ways to answer your question. Certainly, in the specifics, French academia is very different. I have experienced that directly. My own American cultural values of independence (which may, I am aware, be a total illusion) conflict with those of many French academics.
When I first arrived in France to do my dissertation fieldwork, I came with a grant that opened some doors to French academia, but I had little direct sponsorship by powerful patrons in the U.S. I was doing a project that had little to do with the work of my professors, none of whom had done research in France or Europe, and it was something that I had come up with on my own. This was surprising to the French, who were familiar with a patron-client system of professor/student relations. Most of the graduate students I met in France were involved in projects related to the work of their professors.
French academia, still centralized in Paris despite attempts at decentralization, is a much smaller universe than that of the vast American system. There is little room for remaining "outside" of various polemics there. I've learned, for instance, that some people whom I like and admire in France hated Bourdieu and that Bourdieu followers tend to be very fierce in their defense of him and want to promote their view of his work.
This is not to say that American academia doesn't have similar forces operating, but there are multiple points of value and hierarchy here. Whereas Bourdieu could say that Philosophy dominated French academia during the mid-20th century, it is harder to pinpoint one single dominant intellectual framework here.
I do, however, feel that Bourdieu's critique of academia as part of a larger project of the study of power (which he made very explicit in The State Nobility) is applicable beyond France. His work on academia provided us with a method of inquiry to look at the symbolic capital associated with academic advancement and, although the specific register of this will be different in different national contexts, the process may be similar. Just as Bourdieu did in France, for example, one could study how it is that elite universities here "select" students and professors.
Q: We have a memoir of Sartre's childhood in The Words. Is there anything comparable for Bourdieu?
A: Bourdieu produced self-referential writings that began to appear in the late 1990s, with "Impersonal Confessions" in Pascalian Meditations (1997), a section called "Sketch for a Self-Analysis" in his final lectures to the Collège de France, Science of Science and Reflexivity (2001), and then the stand-alone volume Esquisse pour une Auto-Analyse, published posthumously in 2004. [Unlike the other titles listed, this last volume is not yet available in English. -- S.M.]
A statement by Bourdieu that "this is not an autobiography" appears as an epigraph to the 2004 essay. I find his autobiographical writings interesting because they show us a bit about how he wanted to use his own methods of socio-analysis on himself and his own life, with a focus particularly on his formative years -- his childhood, his education, his introduction to academia, and his experiences in Algeria.
Bourdieu was uncomfortable with what he saw as narcissism in much autobiography, and also was theoretically uncomfortable with life stories that stressed the individual as hero without sufficient social analysis. He had earlier written an essay on the "biographical illusion" that elaborated on his biographical approach, but without self-reference. These essays are not, then, autobiographical in the conventional sense of a linear narrative of a life. Bourdieu felt that a truly scientific sociology depended on reflexivity on the part of the researcher, and by this he meant being able to analyze one's own position in the social field and one's own habitus.
At the same time, however, Bourdieu's auto-analysis was a defensive move meant to preempt his critics. Bourdieu included a section on self-interpretation in his book on Heidegger, in which he referred to it as "the riposte of the author to those interpretations and interpreters who at once objectify and legitimize the author, by telling him what he is and thereby authorizing him to be what they say he is..." (101). As Bourdieu became increasingly a figure in the public eye and increasingly a figure of analysis and criticism, he wanted to explain himself and thus turned to self-interpretation and auto-analysis.
Q: In a lot of ways, Bourdieu seems like a corrosive thinker: someone who strips away illusions, rationalizations, the self-serving beliefs that institutions foster in their members. But can you identify a kernel of something positive or hopeful in his work -- especially in regard to education? I'd like to think there is one....
A: Bourdieu had little to say about how schools and universities operate that is positive, and he was very critical of them. The hopeful kernel here is that in understanding how they operate, how they inflict symbolic violence and perpetuate the illusions that enable systems of domination, we can improve educational institutions.
Bourdieu felt strongly that by de-mystifying the discourses and aura of authority surrounding education (especially its elite forms), we can learn something useful. The trick is how to turn this knowledge into power, and Bourdieu did not have any magical solutions for this. That is work still to be done.
The other result was a string of advertisements for online services offering to hook you up with married people in your area. For now, anyway, those ads have disappeared -- perhaps as the result of some tweak in Google's famous algorithms. In any case, they came as a surprise. But my wife (who has forgotten more about search engines than I will ever know) rolled her eyes and said, "I knew it was going to happen when you named the column that."
Ex post facto, it does seem obvious. After all, "intellectual" doesn't count for much, product-placement-wise. In the American vernacular, it is a word usually accompanied by such modifiers as "pseudo" and "so-called" (just as the sea in Homer is always described as "wine-dark"). No doubt the Google algorithm, if tweaked a bit more, will one day lead you right to the personals ads for the New York Review of Books. For now, at least, the offers for a carnal carnival cruise are gone.
Meanwhile, Inside Higher Ed has now launched a page with a running list of Intellectual Affairs columns from February to the present. It has more than three dozen items, so far -- an assortment of essays, interviews, causeries, feuilletons, and uncategorizable thumbsuckers ... all in one central location, suitable for bookmarking.
It's also worth mentioning that Inside Higher Ed itself now offers RSS and XML feeds. (The editors are too busy or diffident to announce this, but some public notice of it is overdue.) To sign up, go to the home page and look for the buttons at the bottom.
This might also be a good time to invite readers to submit tips for Intellectual Affairs -- your thoughts on subjects to cover, books to examine, arguments to follow, people to interview. This column will strive, in coming months, to be equal parts Denis Diderot and Walter Winchell. Your brilliant insights, unconfirmed hunches, and unsubstantiated hearsay are more than welcome. (Of course, that means I'll have to go confirm and substantiate them, but such is the nature of the gig.) Direct your mail here.
Word has it that IA is going to be tapped by Emily Gordon, of the Emdashes blog, for the current "book meme" -- a circulating questionnaire that invites participants to list what they've bought and read lately.
As you may recall, the field of memetics came into a certain short-lived prominence some years ago -- one of those cases of an extended metaphor morphing into something like a school of thought. It rested on an analogy between ideas and genetic material. Concepts, ideologies, and trends were self-replicating "memes" that propagated themselves by spreading through cultural host populations, like mononucleosis at a rave.
Memetics itself hasn't had all that much staying power; it seems, by and large, to have gone the way of the Y2K survival kit. But the term "meme" has, paradoxically enough, proven much hardier -- particularly in the blogosphere. (One theory is that it appeals to bloggers because it has "me" in it, twice.)
So anyway, my responses to the meme, forthwith.
How many books have you owned?
This I cannot answer with any confidence. At present, I have somewhere between three and four thousand. Over the years, I have made regular efforts to clear room by selling or giving large numbers of books away. A few months ago, for example, I parted with about a thousand of them.
Such a purge feels good, once the initial hesitation is overcome. There is even a kind of giddiness, as the herd begins to thin. But afterwards, I always have pangs of regret. When you need it, a missing title is like a phantom limb. There's a maddening and persistent itch you can no longer scratch.
The remaining volumes are arranged alphabetically by author's name -- with certain exceptions. For example, there is about half a shelf containing collections of papers on postcolonialism, post-Fordism, postmodernism, and poststructuralism. These are planted right next to some titles by the neoconservative writer Norman Podhoretz. (I like to imagine that these books make each other really uncomfortable.)
What is the last book you bought?
That would be a set of eight rather hefty pamphlets called The Key to Love and Sex (1928) by Joseph McCabe, whose 40-volume series The Key to Culture (1927) is proving somewhat more difficult to collect. I am also awaiting the arrival of McCabe's autobiography Eighty Years a Rebel, first published in 1947 -- and, like most of his work, long since out of print.
McCabe is now all but completely forgotten, but in his prime he was a force of nature. Born into a working-class family in Manchester, England, McCabe entered a Franciscan monastery at the age of 15 and became a professor of philosophy for the order. After years of private dialectical wrangling, he concluded that God did not exist. That meant starting over at the age of 27. Whatever anguish his years as a monk cost him, the experience left McCabe with a command of languages and powers of concentration that almost defy imagining.
Apart from translating about 30 volumes of literary, scientific, historical, and philosophical work, McCabe wrote a staggering array of books and pamphlets. Many of his books were works of popularization, but several were specialized works of scholarship in their own right. He was also a tireless lecturer and debater -- as aggressive a spokesman for secular rationality as one can imagine.
H.G. Wells called McCabe "the trained athlete of disbelief." That was, if anything, an understatement. For an admiring (indeed, almost hagiographical) account of his life and work, see Bill Cooke's recent biographical study A Rebel to His Last Breath: Joseph McCabe and Rationalism (Prometheus Books).
Name five books that mean a lot to you.
The list would come out differently with a little more thought. But off the top of my head, and in chronological order of their discovery, I'd name:
The Modern Library edition of Franz Kafka's short stories. How this ended up in a rural high school in the Bible Belt is still something of a mystery. I started reading Kafka in the best possible way -- that is, with no idea who he was, or what reputation he might have. (Of course, he had no reputation at all in East Texas, come to think of it.) This made the shock of revelation that much more keen.
Susan Sontag, Against Interpretation. Same high school, different year. (Escape plans forming.) Apart from the cool grace of her own style, Sontag's early essays provided a reading list of figures such as Barthes, Benjamin, Lukacs, and Foucault. It was a much more lively engagement with their concerns than anything I've encountered at, say, MLA over the years.
Seymour Krim, Views of a Nearsighted Cannoneer. Part old-fashioned New York intellectual, part Beat hipster, Seymour Krim wrote this batch of critical and personal essays in the 1950s, long before the term "creative nonfiction" made its inexplicable appearance on the creative-writing scene. (Nothing he published afterwards was even half as good.) I'm almost reluctant to mention this volume in public. Rereading Views has been an occasional private ritual of mine for almost 25 years.
Kenneth Burke, Permanence and Change. There is scarcely a word that begins to describe this book from the 1930s. Burke tried to fuse Marx, Nietzsche, Freud, Veblen, George Herbert Mead, and who knows what else into a theory that would help him understand what was going on (1) in the world at large and (2) between his own ears. (Not long after the Great Depression started, his marriage disintegrated. Burke had a lot to theorize about, in other words.) This is in the category of books that I keep around in two copies: one filled with annotations, the other kept unmarked for reading without distraction.
C. L. R. James, Spheres of Existence. This anthology of political and cultural writings (the last of three volumes of James's selected works) was my first introduction to a revolutionary activist, historian, and thinker whose legacy only looks more important with time. (For a biographical overview, go to www.mclemee.com/id84.html.) Since encountering Spheres, I've spent years on research into James's life and work, and edited a couple of volumes of his writings, neither of which was half as good as Spheres. It really ought to be back in print.
What are you reading now?
Various things by and about Joseph McCabe, that freethinking workaholic. Not long ago, I bought a pamphlet by McCabe called The Meaning of Existentialism (1946). At lunch last week with some mutual friends, I showed it to Henry Farrell of Crooked Timber, who seemed amused by the work's long, explanatory subtitle: "The New Philosophy, Founded by Sartre, That Has Made Quick Progress Among the Volatile Young Men of Paris' Latin Quarter."
After hearing my thumbnail biographical lecture, Henry started reading the booklet. After a couple of pages, he looked up and said, with evident surprise, "This is pretty good! It isn't hackwork."
Indeed not. As an overview of Sartre's thought, it was probably as good as anything in English at the time. And McCabe was nearly 80 years old when he wrote it. (He was down to publishing only a few hundred thousand words a year.)
That's why I have adopted a new motto to get through each day. It's short, simple, and easy to remember: "What would Joseph McCabe do?"
The nature of this sort of meme is that I should, at this point, invite four or five people to answer the same questions. Please note the comments field below, and consider yourself invited.
One sign of the great flexibility of American English -- if also of its high tolerance for ugliness -- is the casual way users will turn a noun into a verb. It happens all the time. And lately, it tends to be a matter of branding. You "xerox" an article and "tivo" a movie. Just for the record, neither Xerox nor TiVo is very happy about such unauthorized usage of its name. Such idioms are, in effect, a dilution of the trademark.
Which creates an odd little double bind for anyone with the culture-jamming instinct to Stick It To The Man. Should you absolutely refuse to give free advertising to either Xerox or TiVo by using their names as verbs, you have actually thereby fallen into line with corporate policy. Then again, if you defy their efforts to police ordinary language, that means repeating a company name as if it were something natural and inevitable. See, that's how they get ya.
On a less antiglobalizational note, I've been trying to come up with an alternative to using "meme" as a verb. For one thing, it is too close to "mime," with all the queasiness that word evokes.
As discussed here on Tuesday, meme started out as a noun implying a theory. It called to mind a more or less biological model of how cultural phenomena (ideas, fads, ideologies, etc.) spread and reproduce themselves over time. Recently the term has settled into common usage -- in a different, if related, sense. It now applies to certain kinds of questionnaires or discussion topics that circulate within (and sometimes between) blogospheric communities.
There does not seem to be an accepted word to name the creation and initial dissemination of a meme. So it could be that "meme" must also serve, for better or worse, as a transitive verb.
In any case, my options are limited.... Verbal elegance be damned: Let's meme.
The ground rules won't be complicated. The list of questions is short, but ought to yield some interesting responses. With luck, the brevity will speed up circulation.
In keeping with meme protocol, I'll "tap" a few bloggers to respond. Presumably they will do likewise. However, the invitation is not restricted to that handful of people: This meme is open to anyone who wants to participate.
So here are the questions:
(1) Imagine it's 2015. You are visiting the library at a major research university. You go over to a computer terminal (or whatever it is they use in 2015) that gives you immediate access to any book or journal article on any topic you want. What do you look up? In other words, what do you hope somebody will have written in the meantime?
(2) What is the strangest thing you've ever heard or seen at a conference? No names, please. Refer to "Professor X" or "Ms. Y" if you must. Double credit if you were directly affected. Triple if you then said or did something equally weird.
(3) Name a writer, scholar, or otherwise worthy person you admire so much that meeting him or her would probably reduce you to awestruck silence.
(4) What are two or three blogs or other Web sites you often read that don't seem to be on many people's radar? Feel free to discard anything you don't care to answer.
To get things started, I'm going to tap a few individuals -- people I've had only fairly brief contact with in the past. As indicated, however, anyone else who wants to respond is welcome to do so. The initial list:
An afterthought on the first question -- the one about getting a chance to look things up in a library of the future: Keep in mind the cautionary example of Enoch Soames, the minor late-Victorian poet whose story Max Beerbohm tells. He sold his soul to the devil for a chance to spend an afternoon in the reading room of the British Museum, 100 years in the future, reading what historians and critics would eventually say about his work.
Soames ends up in hell a little early: The card catalog shows that posterity has ignored him even more thoroughly than his contemporaries did.
Proof, anyway, that ego surfing is really bad for you, even in the future. A word to the wise.
I only saw her out of the corner of my eye as I rushed into the book exhibit at the conference, but I was sure I knew her. Her face registered as out of context, somehow, but familiar. A second later, I realized it was one of my students, a recent English-major graduate of the liberal arts college at which I teach. I stopped, turned around and called to her.
She was pleased to see me. She’s a marketing assistant for a major academic publishing house, it turns out. I could tell she was proud of her job, pushing English composition and literature texts to English professors like me. We arranged to meet for dinner the next day, two professionals on a business trip.
I stopped by the marketing assistant’s exhibit while she was out at lunch, and her colleagues were anxious to find out what she had been like in my classes. "She must have been a great student, huh?" one of her colleagues prompted me. Hmm. She had been solid, reliable, a good writer, and she always had something interesting to say in class, but the marketing assistant had not been one of our stars. Still, none of our stars of recent years had jobs like hers, working with literature.
Clearly her co-workers loved her. They spoke very fondly of her, and, indeed, she seemed to be very good at her job. What I hadn’t noticed in the classroom was the key quality that was working for the marketing assistant in the world after college: not her knowledge of literature but her skills with people. This I discovered very quickly the next evening at dinner.
I had already had a date for dinner that night with a friend of mine, a fiction writer, so I asked the marketing assistant if I could bring him along. "Sure," she said. "I can expense it. I’m just taking two English professors out." A new verb for me: to expense. I liked it.
She quickly took charge of the expedition, finding good restaurants and putting her name on the waiting list of one while we searched for another. (Why had I never thought of that? I guess it’s not really cheating.)
The marketing assistant had always been ready with an answer in class, but we’d never actually talked much about anything other than Victorian literature. Turns out she’s pretty funny, and very professional. She told great stories, often at the expense of some poor academic schmuck who stopped by her booth, intent on pitching his or her latest project. I felt sorry for the folks she described, but not because she mocked them -- she didn’t; she described them quite affectionately, as if she knew they couldn’t help themselves. The fiction writer and I shook our heads with her when she described the guy whose project was so impossibly narrow that no academic press would ever publish it. We chuckled along, though less heartily, when she wondered aloud at the fashion sense of the professoriate.
"When you look around the gate at the airport, you can always tell who’s going to the same conference you are," the marketing assistant said. Of course, we could, too, and the fiction writer and I had already had that obligatory conversation, this being his first professional conference. But it was different hearing it from the perspective of the marketing assistant. After all, as a friend of mine said ruefully, gazing around the lobby of one of the convention hotels a few years ago, “These are my people.”
When the marketing assistant got to the social skills of professors, we felt ourselves on relatively safe ground. The fiction writer has a fabulous, wry sense of humor and is good to have at parties, and I have always prided myself on being able to talk to anybody. We are not nerdy bookworms -- we both went to our proms. I snickered at her description of awkward social interactions she’d observed between academics. “It’s amazing you guys found people to get hooked up with,” she declared good-naturedly -- in her eyes we were no different from the guy we had just seen mumbling to himself as he wandered through the book exhibit. Maybe we weren’t. Maybe these really were our people.
They weren’t her people; that’s for sure. The marketing assistant had perspective on our folk that we clearly didn’t have. And that made it really fun to talk to her. I enjoyed seeing her in her professional persona. I was proud of her, glad one of our grads seemed to be heading for a successful career in publishing. But seeing her made me realize that I may not be the best assessor of my students’ skills.
Although the marketing assistant is great at her job, I would not have been able to predict that. When I look at my students, I realize, I have always concentrated on particular skills that are not necessarily the ones that will serve them best after college. Who writes the best? Whose research is most thorough? Whose reading of the novel is the most subtle? Not the most marketable skills, though they will get you into graduate school.
The marketing assistant is the same young woman she was when she was in my classroom. But much of her incisive observation, her wit, her distanced assessment and clever summing-up had passed me by when she was in college. What a letter of recommendation I could write for her now. Of course, she doesn’t need it now. She’s already moved on.
Paula Krebs is professor of English at Wheaton College, in Massachusetts.
On Tuesday -- as the republic awaited the formal launch of the latest Supreme Court nomination death-match -- Stanley Fish appeared in The New York Times with a short article titled "Intentional Neglect." Its thesis is sharp, bold, and deceptively straightforward.
As we enter the inescapable squall of debate over who shall take the place of Sandra Day O'Connor, announces Fish, we need to be clear on some basic things. Interpreting the Constitution is a matter of determining its authors' intent. Talk of "a living constitution" that must remain open to the changing times -- that, in short, is not interpretation, but a roundabout means of rewriting the Constitution.
The response in some quarters has involved gestures of shock -- and from one or two conservatives, anyway, of gratified astonishment. How sensible the man is! What a voice for sweet reason! Is this Stanley Fish not the same man who turned the English department at Duke into a training camp for left-wing theoretical guerillas? Has he perhaps had a change of heart?
Fish is widely recognized, even outside academe, as a celebrity and a power broker. He is the one person at the annual convention of the Modern Language Association who does not wear a name tag. And he has a well-established profile as the champion of the anti-foundationalism that non-academic civilians understand to be the, well, foundation of contemporary academic radicalism.
So when he goes on "The O'Reilly Factor" -- or weighs in with an op-ed in the Times -- many people naturally assume that Fish is speaking as some kind of leftist. Hence the surprise at his latest article, which at least some readers might take as an application to join the Federalist Society.
All of which underscores the difference between being well-known and being well-understood.
There is nothing in Tuesday's op-ed that Fish hasn't argued many times over the years. Many, many times, over many, many years. (Whatever debate may exist over his other virtues, the man is a stickler for consistency.) But he is so famous that his ideas have long since been dwarfed by his reputation.
His current stress on the framers' intent as the necessary focus of interpreting the Constitution sounds paradoxical, coming from a literary theorist who came into prominence, in the 1970s, as the most dogged champion of reader-response criticism. Actually, there is a pretty direct line of march from one position to the next.
One modest claim in favor of the reader-response school might be that its very name was a case of truth in advertising. (I speak of it in the past tense because it's been some while since the movement was at its peak. No doubt there are a few partisans still fighting for the cause, like those stray Japanese soldiers from World War Two who used to turn up on islands in the Pacific.) Reader-response analysis involved looking at how the audience of a literary work interacted with it -- how, in a sense, the meaning of a text was produced at the moment the reader was consuming it.
That sounds like a recipe for, well, just making stuff up. Mix one part epistemological relativism and one part narcissism, and you get the sophomore's hermeneutic: "That's what the book means to me." Add a dash of paranoia, and you get: "I think Shakespeare was a Freemason, and my reading is as good as any other."
But you can't judge a method by its most inane or implausible applications. In the case of Fish's version of reader-response analysis, there was a sort of hermeneutic shock-absorber built right in. He called it "the interpretive community." An individual reader might come up with some bizarre personal meaning for a work. For the most part, however, reading is conditioned by one's membership in groups, and those groups tend to create something like a consensus about what counts as the range of sound understanding. There are rules for what counts as good evidence, or a well-made argument.
Normally those rules aren't written down someplace. They exist at the level of tacit knowledge; you either absorb them and read accordingly, or you aren't really part of that particular community. And the rules can change over time. A work's meaning isn't fixed for all time, like a face sculpted in marble. Nor does it change at random, like a kaleidoscope image. It's more like the various productions of a play -- varying over time depending on who's directing, who's acting, and how big the stage is.
Fish's later writings on law and on current issues are, in effect, an expansion of the idea of the interpretive community to the world beyond the printed page. We participate in institutions and in civic life in the same sense that we read and understand a work of literature -- as people who always find ourselves embedded in a structure of rules, assumptions, traditions, etc., that implicitly govern what counts as acceptable.
From Fish's perspective, it is the mistake of a certain kind of fundamentalism (religious or secular) to think that we can get to the level of bedrock truths that aren't so conditioned. Or to think that, by reasoning, we can ascend to lofty heights of abstraction, far above all the diverse and squabbling micro-communities. You never get outside of some kind of interpretive community, following rules that are socially constructed.
But that doesn't mean they are imaginary -- that anything goes.
Fish often uses the game of baseball as his example of something that is both socially constructed and real. Does a baseball or bat exist in nature? Does "three strikes and you're out" follow from any law known to the sciences? The answer, in each case, is "no." Are baseballs and bats real? Does the three-strikes rule have predictable effects on the course of the game? Likewise, the answer is "yes." So the game of baseball is both socially constructed and real. It is the product of human activity, but not subject to anybody's whim. (The umpire's eyesight, yes. But that doesn't gainsay Fish's basic point.)
Now, talk of social construction always sounds like it might have a radical agenda. To anyone who thinks in terms of natural law, it reeks of Jacobinism. After all, if something is socially constructed, that means that it might be re-constructed, right? And that means it probably should be, at some point.... The next thing you know, there are guillotines.
But actually, if you look at them closely, Fish's ideas seem a bit closer to the counter-Enlightenment doctrines that emerged following the French Revolution. The binding force of community, the subordination of reason to the implicit code of tradition, the sense that our freedom is limited (or at least conditioned) by rules that can't be redrawn all at once ... this sounds a little bit like something from Edmund Burke, or at least from Russell Kirk's The Conservative Mind.
Not to go overboard with this. When Fish gives voice to political opinions (in favor of affirmative action, say, or in defense of speech codes), he tends to sound like a garden-variety liberal. But he has been very skeptical of the academic left, on the grounds that radical professors tend to blur the distinction between scholarship and political activity. As he argued in Professional Correctness, published 10 years ago by Harvard University Press, "queering Shakespeare" isn't political in the same sense as mobilizing to increase AIDS funding; rather, it's a matter of making certain moves in the interpretive community that is interested in Elizabethan literature.
In Fish's own words: "There are no regular routes by which the accomplishments of academics in general and literary academics in particular can be transformed into the currency of politics." And the effort to bring his ideas to bear on legal theory, over the years, has not really disproved that point.
In effect, Fish's writings have been a way of minimizing the possible interaction between law and literature. He has argued -- with exhausting, even wearying consistency -- that the conduct of legal affairs is ultimately a matter of the legal interpretive community following its own codes, traditions, and methods.
A case in point is Fish's seemingly straightforward claim that "interpreting the Constitution" means "trying to figure out what the framers had in mind." That sounds like a directive -- as if Fish is saying that we'd just need to find the right quotation from The Federalist Papers, perhaps, to understand how to apply the Constitution to legislation regarding stem-cell research. And there is, then, a strong tendency to assume that such an interpretation would then tilt toward the conservative side.
But not so fast. As Fish noted in a discussion of the Bork nomination, it is a mistake to cede "original intent" arguments to the right, just because some conservative jurists frame their arguments in those terms.
"It is perfectly possible," wrote Fish, "to be in favor of abortion rights and also to label oneself as an originalist, as someone who hews to the intention of the framers. It would just be a matter of characterizing those intentions so that the right to abortion would seem obviously to follow from them.
"One might, for example, argue (as many have) that even though the Fourteenth Amendment nowhere mentions abortion rights, a correct understanding of its authors' more general intention requires that such rights be protected." Likewise, one could argue against abortion rights on grounds that aren't anchored in claims about original intent.
"In short, there is no necessary relationship between declaring oneself an originalist and coming out on one side or the other of a particular issue."
Putting it another way, the effort to define "original intent" is both a basic part of the work of the legal interpretive community and a product of rules specific to that community. Some sharp-eyed person may well put my head on a platter for saying this, but what the hell: It sure looks as if Stanley Fish has reinvented legal positivism by way of a kind of roll-with-the-punches pragmatism.
Speaking of punches ... they should start flying any minute now. What does Fish have to say about the debates that are about to ensue? How should the issues of the nomination fight be understood by those of us who are mere citizens of the Republic, rather than members of the legal interpretive community? His advice, in short, is to recognize that it's not a question of whether or not the nominee is an originalist, but rather, of what kind.
"So," as he put it on Tuesday, "if you want to know how someone is likely to act on the bench, you will have to set all the labels aside and pay attention to the nominee's reasoning in response to the posing of hypothetical situations.... Does he or she construe intention narrowly and limit it to possibilities the framers could have foreseen, or is intention considered more broadly and extended to the positions the framers would likely have taken if they knew then what we know now? ... And then, if after having made that calculation you decide you are for this person, you can hope that the performance you see today predicts the performances of years to come. But don't bet on it."
There may be a certain amount of insight in Fish's thoughts. Still, it seems like the kind of wisdom that doesn't really do anybody much good.
In January 2002, on his way back from an academic conference, a young journalist named David W. Miller was killed by an intoxicated driver, along with the two people who were giving him a ride home from the airport. As often happens, the drunk was unhurt. Now he's in prison -- where, with any luck, he will serve every single day of his sentence. There are old and very reasonable arguments for why justice cannot, by definition, be a matter of revenge. But I am happy to ignore them, in this case -- for David was my colleague, and someone I respected enormously, and he was just about to take off a couple of months of paternity leave following the birth of his second child. It does not seem possible that the man who killed him could suffer enough.
Now, it would be sentimental overstatement for me to claim a deep friendship. But there was more to our collegiality than the usual blend of mutual tolerance and bland amicability required to make a workplace tolerable. That we could talk without yelling at one another seemed, at the time, like a tiny miracle of civilization. David had worked for the Heritage Foundation's Policy Review and was not exactly shamefaced about being a conservative -- while in my cubicle there was a portrait of Lenin.
In fact, he looks down on me now, here in my study at home. I have sworn to take his picture down if, and only if, Henry Kissinger ends up on trial for crimes against humanity. (Frankly, I'm tired of looking at old V.I., but am still awaiting that necessary bit of evidence that bourgeois democracy is capable of truth and reconciliation.)
David and I had the occasional, let's say, spirited conversation. Neither of us ever persuaded the other of much. With hindsight, however, it's clear that knowing him was incredibly instructive -- and not just because he kept up with scholarship in the social sciences that were far from my own stomping grounds.
He was, as the saying goes, a "movement conservative," in touch with the ideas and arguments being cooked up in the right-wing think tanks. But he was as intellectually honest as anyone could be. Around the time we first met, he had just published an article on the famous "broken window syndrome" -- that basic doctrine of conservative social policy -- showing there was scarcely any solid research to back it up. And when he did argue for any given element of the right's agenda, it was hard to escape the sense that he did so from the firm conviction that it would bring the greatest good to the greatest number of people.
In short, talking with David meant facing a repeated obligation to think the unthinkable: that someone could be a conservative without suffering from either cognitive deficit or profound moral stupidity.
Of course, any person who spends very long on the left must come face to face, eventually, with the hard truth that a certain percentage of one's comrades are malevolent, cretinous, thoughtless, or palpably insane. This is troubling, but you get used to it. What proves much more disconcerting is the realization that someone from the other side possesses real virtues -- and that they hold their views, not in spite of their better qualities, but in consequence of them.
All of this came back to mind upon reading a recent profile of Roger Scruton, the conservative British philosopher. In most respects, it is a typical newspaper piece on a thinker. That is, it avoids any effort to discuss his work (or even to describe it) and focuses instead on his personality, which on a generous estimate may be described as curmudgeonly.
There was one passage in particular that hit home. It's when Scruton says, "One of the great distinctions between the left and the right in the intellectual world is that left-wing people find it very hard to get on with right-wing people, because they believe that they are evil. Whereas I have no problem getting on with left-wing people, because I simply believe that they are mistaken. After a while, if I can persuade them that I'm not evil, I find it a very useful thing. I know that my views on many things are open to correction. But if you can't discuss with your opponents, how can you correct your views?"
Scruton is on to something. Of course, the point is very seriously blunted by the way he pretends that Manichaeism is a peculiarly leftist failing. In his heart of hearts, he must know better. Certainly the American right is very keen on the language of apocalyptic confrontation with absolute evil. And Scruton himself is not above a certain amount of nastiness, once the polemical fires are stoked.
That's just the way of the passions, though -- the tendency in our nature that must be controlled by "the inner check," to borrow a very old-fashioned conservative notion discussed by the political scientist Wesley McDonald, who teaches at Elizabethtown College. In his book Russell Kirk and the Age of Ideology, published last year by the University of Missouri Press, he explains that the inner check is that factor in the soul that can subdue the more vicious parts of one's nature -- in the interest of the common good, and of the higher human potentialities.
See also, "superego." But there is perhaps a value to the more frankly moralistic expression "inner check." The superego is what makes you neurotic. By contrast, the inner check is what makes it possible to say, as Scruton does, "I know that my views on many things are open to correction. But if you can't discuss with your opponents, how can you correct your views?"
For an instructive display of the inner check in action (and a reminder of how much work it has cut out for it) you might check out a recent exchange concerning Unholy Alliance by David Horowitz.
According to its author, this is the book that provides the ballast of heavy thinking to back up "Discover the Network." And a good thing it does, for otherwise "Discover" might be regarded as a laughable exercise in guilt-by-association that makes the John Birch Society's None Dare Call It Conspiracy look like sober political analysis.
So it was interesting -- encouraging, even -- to see that Timothy Burke had written a long commentary on the book. The impression one gets from reading Burke's essays, over time, is that his ideas are measured without being equivocal. He tends to be scrupulous about defining where his arguments are coming from and where they are going. That precision is not the same as rigidity, however. He would probably be identified by most conservatives as a man of the left. But more than anything else, his writings call to mind a comment by Raymond Aron, who for decades was considered the anti-Sartre of French political and intellectual life. "The last word is never said, and one must not judge one's adversaries as if one's own cause were identified with absolute truth."
During a previous exchange, Horowitz had challenged Burke to grapple with Unholy Alliance, which demonstrates (says Horowitz) the linkage between radical Islamism and the American left. And Burke took up the gauntlet.
Burke begins by noting that "there is an intellectual history waiting to be written that plausibly connects the New Left with some of the forms of romantic anti-Western sentiment among some American (and European) activists and intellectuals that flourished between 1980 and the present."
He adds that such a book would do well to examine "a wider, more diffuse 20th Century history of connections between anti-Western ideas, texts and political commitments within Europe and the United States that would not be isolated in any simple way to 'the left' (indeed, would cross over at points to authors and thinkers typically regarded as conservative)."
Let's be clear on this: from the start, Burke more than half concedes a point that Horowitz takes as urgent: that there are indeed continuities between some parts of the Third World-ist left and modes of thought and politics that are, in the strictest sense, reactionary. But Burke thinks that the matter has to be faced with a certain degree of rigor and scholarship. Otherwise, why bother?
Burke argues that Horowitz has not offered even the most rudimentary approximation of the kind of analysis that he has promised. And yet Burke also makes an extremely (to my mind, astonishingly) generous estimate of Horowitz's potential to write something intelligent and serious.
In answer, Horowitz has issued a petulant, abusive, and interminable response that one suspects will turn into a chapter in his next autobiography.
At this point, it is hard not to think of the "inner check" -- the doctrine that there is (or should be) a small voice of constraint within the soul. "Man must put a control upon his will and his appetite," as Russell Kirk put it in The Conservative Mind (1953), "for conservatives know man to be governed more by emotion than by reason."
The inner check is not a part of the self -- but, rather, that internal force subduing the self, which would otherwise howl and rave, and demand that the world adore its every claim to glory. Reading Burke and Horowitz side by side, it's not hard to figure out which one really embodies that principle.
Now, over the past couple of years, I've tried hard to honor the memory of David Miller, who, in the year before his death at the ridiculously young age of 35, taught me so much by his example -- by his decency, his modesty, and his wry indulgence of what he must have seen as muddled leftist attitudes. For one thing, it's meant striving to understand things, from time to time, as he might; to consider the strongest, most coherent forms of conservative argument.
To that end, my reading diet now includes a certain amount of right-wing intellectual output -- journals like The Modern Age and The Claremont Review of Books, for example, and books by Russell Kirk, Michael Oakeshott, and Willmoore Kendall. It's not necessary to enjoy this stuff, or to agree with it. But it does seem important as part of the process of thinking outside one's familiar ruts.
But now it's time to go another step. There is only one way to keep from reinforcing the worst impressions of the conservative movement. Henceforth, I will never read another word by David Horowitz.
One part of Milovan Djilas's Conversations with Stalin lingers in the memory well after the rest of the book fades. The author himself calls it "a scene such as might be found only in Shakespeare's plays." Actually, it has its parallels to Rabelais, as well; for, as at many another gathering of the Soviet elite amidst the privations of World War II that Djilas recounts, there is an enormous feast, and a marathon drinking session.
This particular miniature carnival occurs in the final months of the war. Stalin is hosting a reception at the Kremlin for the Yugoslavian delegation. But before the partying begins, he must disburden himself; for Stalin has heard that Djilas (who would later become vice president under Marshal Tito) has criticized the behavior of some units of the Red Army as it has made its way across Europe.
"Does Djilas, who is himself a writer, not know what human suffering and the human heart are?" cries Stalin. "Can't he understand it if a soldier who has crossed thousands of kilometers through blood and fire and death has fun with a woman or takes a trifle?"
By "having fun," he was referring to well over two million rapes, by Soviet soldiers, of women of all ages and backgrounds. The very indiscriminateness of the sexual violence gives the lie to the idea that it was revenge for the suffering inflicted by the Germans. Inmates liberated from Nazi concentration camps were raped as well.
As for Djilas, it must have seemed, for a moment, as if Stalin's outburst were the kiss of death. Luckily for him, the dictator's mood changed. "He proposed frequent toasts," recalls the author, "flattered one person, joked with another, teased a third, kissed my wife because she was a Serb, and again shed tears over the hardships of the Red Army and Yugoslav ingratitude."
Perhaps in response to the criticism, Stalin issued a command that soldiers behave themselves. The Soviet officers read the proclamation to their troops with a smirk. Everyone knew it meant nothing. Boys will be boys.
The anonymous memoir A Woman in Berlin, now appearing in a new English translation from Metropolitan Books, is an extraordinary chronicle of life in the streets as the Thousand Year Reich turned into rubble and the advancing "Ivans" had their fun. The author was a German editor and journalist who died in 2001. Her book, based on a diary kept over two months during the spring of 1945, first appeared in English in 1954. It was only published in German in 1959, where it seems to have been regarded as an intolerable faux pas, a violation of the unstated rule that the events never be mentioned again.
The book's rediscovery now comes in the wake of Antony Beevor's massive documentation of the rape campaign in The Fall of Berlin 1945, published three years ago by Viking Press. To judge by the reservations of some military historians, Beevor's account may not be the last word on how Soviet forces advanced into Germany. (A reviewer for Parameters, the journal of the U.S. Army War College, praised it as a work of popular history, but lodged some complaints about certain gaps in the book's account of troop maneuvers.) Yet the book did take an unflinching look at the extent of the sexual terror.
Beevor supplies an introduction to the new edition of A Woman in Berlin, situating the document in historical context. He notes, for example, that the statistics about rape for Berlin "are probably the most reliable in all of Germany," falling somewhere between 95,000 and 130,000 victims "according to the two leading hospitals."
He also points out that there is no particular evidence that rape was treated as a deliberate strategy of war -- as human-rights activists have recently charged the Sudanese military with doing in Darfur. "No document from the Soviet archives indicates anything of the sort in 1945," writes Beevor. But he suggests that the scale of the attacks may have been a by-product of the Red Army's internal culture, even so: "Many soldiers had been so humiliated by their own officers and commissars during the four years of war that they felt driven to expiate bitterness, and German women presented the easiest target. Polish women and female slave laborers in Germany also suffered."
Reading the memoir itself, you find all such interpretive questions being put on hold. It is not just a document. The author, an urbane and articulate woman in her early 30s, writes about the fall of Berlin and her own repeated violation with an astounding coolness -- a bitter, matter-of-fact lucidity, the extreme candor of which is almost disconcerting, given the lack of even a hint of self-pity.
"No doubt about it," she writes after being raped several times in a row. "I have to find a single wolf to keep away the pack. An officer, as high-ranking as possible, a commandant, a general, whatever I can manage. After all, what are my brains for, my little knowledge of the enemy's language?... My mind is firmly made up. I'll think of something when the time comes. I grin to myself in secret, feel as if I'm performing on the stage. I couldn't care less about the lot of them! I've never been so removed from myself, so alienated. All my feelings seem dead, except for the drive to live."
I've just reviewed the latest edition of A Woman in Berlin for Newsday, and will spare you a recycling of that effort (now available here). Since then, a look at other reviews has revealed some debate over the authenticity of the book. The comments of J.G. Ballard (no stranger to questions of sexuality in extreme conditions) are indicative.
"It is hard to believe, as the author claims, that it was jotted down with a pencil stub on old scraps of paper while she crouched on her bed between bouts of rape," wrote Ballard in The New Statesman a few weeks ago. "The tone is so dispassionate, scenes described in so literary a way, with poignant references to the strangeness of silence and the plaintive cry of a distant bird. We live at a time that places an almost sentimental value on the unsparing truth, however artfully deployed. But the diary seems convincingly real, whether assembled later from the testimonies of a number of women or recorded at first hand by the author."
Given that concern, it is worth looking up the original edition of A Woman in Berlin, now more than 50 years old. It came with an introduction by C.W. Ceram, whose book Gods, Graves, and Scholars, first published in 1951, remains one of the best introductions to the history of archeology. Ceram recalls meeting the author of A Woman in Berlin not long after the war.
"From some hints that she dropped," he wrote, "I learned of this diary's existence. When, after another six months passed, I was permitted to read it, I found described in detail what I already knew from the accounts of others."
That means Ceram saw the book in 1947, at the latest. "It took me more than five years, however, to persuade the author that her diary was unique, that it simply had to be published."
She had, he writes, "jotted down in old ledgers and on loose pages what happened to her.... These pages lie before me as I write. Their vividness as expressed in the furtiveness of the short penciled notes; the excitement they emanate whenever the pencil refuses to describe the facts; the combination of shorthand, longhand, and secret code ... all of this will probably be lost in the depersonalizing effect of the printed word."
Ceram's introduction is interesting for its testimony about the book's provenance. But that remark about "the depersonalizing effect of the printed word" will seem odd to anyone who has read A Woman in Berlin.
In many ways, of course, the book is an account of brutality. (War is a force that turns people into things, as Simone Weil once put it; and killing them is just one of the ways.) But the anonymous author also created a record of what is involved in resisting depersonalization. At times, she is able to see the occupiers, too, as human beings. You cannot put the book down without wondering about the rest of her life.