Philosophy

Divided Mind

George Scialabba is an essayist and critic working at Harvard University who has just published a volume of selected pieces under the title Divided Mind, issued by a small press in Boston called Arrowsmith. The publisher does not have a Web site. You cannot, as yet, get Divided Mind through Amazon, though it is said to be available in a few Cambridge bookstores. This may be the future of underground publishing: Small editions, zero publicity, and you have to know the secret password to get a copy. (I'll give contact information for the press at the end of this column, for anyone willing to put a check in the mail the old-fashioned way.)

In any case, it is about time someone brought out a collection of Scialabba's work. That it's only happening now (15 years after the National Book Critics Circle gave him its first award for excellence in reviewing) is a sign that things are not quite right in the world of belles lettres. He writes in what William Hazlitt -- the patron saint of generalist essayists -- called "the familiar style," and he is sometimes disarmingly explicit about the difficulties, even the pain, he experiences in trying to resolve cultural contradictions. That is no way to create the aura of mystery and mastery so crucial for awesome intellectual authority.

Scialabba has his admirers, even so, and one of the pleasant surprises of Divided Mind is the set of comments on the back. "I am one of the many readers who stay on the lookout for George Scialabba's byline," writes Richard Rorty. "He cuts to the core of the ethical and political dilemmas he discusses." The novelist Norman Rush lauds Scialabba's prose itself for "bring[ing] the review-essay to a high state of development, incorporating elements of memoir and skillfully deploying the wide range of literary and historical references he commands." And there is a blurb from Christopher Hitchens praising his "eloquence and modesty" -- though perhaps that is just a gesture of relief that Scialabba has not reprinted his candid reassessment of Hitch, post-9/11.

One passage early in the collection gives a roll call of exemplary figures practicing a certain kind of writing. It includes Randolph Bourne, Bertrand Russell, George Orwell, and Maurice Merleau-Ponty, among others. "Their primary training and frame of reference," Scialabba writes, "were the humanities, usually literature or philosophy, and they habitually, even if only implicitly, employed values and ideals derived from the humanities to criticize contemporary politics.... Their 'specialty' lay not in unearthing generally unavailable facts, but in penetrating especially deeply into the shared culture, in grasping and articulating its contemporary moral/political relevance with special originality and force."

The interesting thing about this passage -- aside from its apt self-portrait of the author -- is the uncertain meaning of that slashmark in the phrase "contemporary moral/political relevance." Does it serve as the equivalent of an equals sign? I doubt that. But it suggests that the relationship is both close and problematic.

We sometimes say that a dog "worries" a bone, meaning he chews it with persistent attention; and in that sense, Divided Mind is a worried book, gnawing with a passion on the "moral/political" problems that go with holding an egalitarian outlook. Scialabba is a man of the left. If you can imagine a blend of Richard Rorty's skeptical pragmatism and Noam Chomsky's geopolitical worldview -- and it's a bit of a stretch to reconcile them, though somehow he does this -- then you have a reasonable sense of Scialabba's own politics. In short, it is the belief that life would be better, both in the United States and elsewhere, with more economic equality, a stronger sense of the common good, and an end to the narcissistic entitlement fostered by the American military-industrial complex.

A certain amount of gloominess goes with holding these principles without believing that History is on the long march to their fulfillment. But there is another complicating element in Divided Mind. It is summed up in a passage from the Spanish philosopher José Ortega y Gasset's The Revolt of the Masses (1930) -- though you might find the same thought formulated by a dozen other conservative thinkers.

"The most radical division it is possible to make of humanity," Ortega y Gasset declares, "is that which splits it into two classes of creatures: those who make great demands on themselves, piling up difficulties and duties; and those who demand nothing special of themselves, but for whom to live is to be every moment what they already are, without imposing on themselves any effort toward perfection; mere buoys that float on the waves."

Something in Ortega y Gasset's statement must have struck a chord with Scialabba. He quotes it in two essays. "Is this a valid distinction?" he asks. "Yes, I believe it is...." But the idea bothers him; it stimulates none of the usual self-congratulatory pleasures of snobbery. The division of humanity into two categories -- the noble and "the masses" -- lends itself to anti-democratic sentiments, if not the most violently reactionary sort of politics.

At the very least, it undermines the will to make egalitarian changes. Yet it is also very hard to gainsay the truth of it. How, then, to resolve the tension? Divided Mind is a series of efforts -- provisional, personal, and ultimately unfinished -- to work out an answer.

At this point it bears mentioning that Scialabba's reflections do not follow the protocols of any particular academic discipline. He took his undergraduate degree at Harvard (Class of 1969) and has read his way through a canon or two; but his thinking is not, as the saying now goes, "professionalized." He is a writer who works at Harvard -- but not in the way that statement would normally suggest.

"After spells as a substitute teacher and Welfare Department social worker," he told me recently in an e-mail exchange, "I was, for 25 years, the manager or superintendent of a mid-sized academic office building, which housed Harvard's Center for International Affairs and several regional (East Asian, Russian, Latin American, Middle Eastern, etc) research centers. I gave directions to visitors, scheduled the seminar rooms, got offices painted, carpets installed, shelves built, windows washed, keys made, bills paid. I flirted with graduate students and staff assistants, schmoozed with junior faculty, and saw, heard, overheard, and occasionally got to know a lot of famous and near-famous academics."

As day jobs go, it was conducive to writing. "I had a typewriter and a copy machine," he says, "a good library nearby, and didn't come home every night tired or fretting about office politics." When the "homely mid-sized edifice" was replaced with "a vast, two-building complex housing the political science and history departments as well," the daily grind changed as well: "I'm now part of a large staff, and most of my days are spent staring at a flickering screen."

More pertinent to understanding what drives him as a writer, I think, are certain facts about his background that the reader glimpses in various brief references throughout his essays. The son of working-class Italian-American parents, he was once a member of the ascetic and conservative Roman Catholic group Opus Dei. In adolescence, he thought he might have a religious vocation. The intelligence of his critical writings is now unmistakably secular and modernist. He shows no sign of nostalgia for the faith now lost to him. But the extreme dislocation implied in leaving one life for another gives an additional resonance to the title of his collection of essays.

"For several hundred years," he told me, "a small minority of Italian/French/Spanish adolescent peasant or working-class boys -- usually the sternly repressed or (like me) libido-deficient ones -- have been devout, well-behaved, studious. Depending on their abilities and on what sort of priest they're most in contact with, they join a diocese or a religious order. Among the latter, the bright ones become Jesuits; the more modestly gifted or mystically inclined become Franciscans. I grew up among Franciscans and at first planned to become one, but I just couldn't resist going to college -- intellectual concupiscence, I guess."

Instead, he was drawn into Opus Dei -- a group trying, as he puts it, "to make a new kind of religious vocation possible, combining the traditional virtues and spiritual exercises with a professional or business career."

He recalls being "tremendously enthusiastic for the first couple of years, trying very hard, though fruitlessly, to recruit my fellow Catholic undergraduates at Harvard in the late 1960s. It was a strain, being a divine secret agent and trying at the same time to survive academically before the blessed advent of grade inflation. But the reward -- an eternity of happiness in heaven!"

The group permitted him to read secular authors, the better to understand and condemn their heresies.

"Then," he says, "Satan went to work on me. As I studied European history and thought, my conviction gradually grew that the Church had, for the most part, been on the wrong side. Catholic philosophy was wrong; Catholic politics were authoritarian....On one occasion, just after I had read Dostoevsky's parable of the Grand Inquisitor, I was rebuked for my intellectual waywardness by a priestly superior with, I fancied, a striking physical resemblance to the terrifying prelate in Ivan's fable. The hair stood up on the back of my neck."

The departure was painful. The new world he discovered on the other side of his crossing "wasn't in the slightest degree an original discovery," he says. "I simply bought the now-traditional narrative of modernity, hook, line and sinker. I still do, pretty much." But he was not quite ready to plunge without reserve into the counterculture of the time -- sex, drugs, rock and roll.

"I was, to an unusual degree, living in my head rather than my body," he says about the 1970s. "I had emerged from Opus Dei with virtually no friends, a conscious tendency to identify my life course with the trajectory of modernity, and an unconscious need to be a saint, apostle, missionary. And I had inherited from my working-class Italian family no middle-class expectations, ambitions, social skills, ego structures."

Instead, he says, "I read a lot and seethed with indignation at all forms of irrational authority or even conventional respectability. So I didn't take any constructive steps, like becoming a revolutionary or a radical academic.... In those days, it wasn't quite so weird not to be ascending some career ladder."

So he settled into a job that left him with time to think and write. And to deal with the possibility of eternal damnation -- something that can occasionally bedevil one part of the mind, even while the secular and modernist half retains its disbelief.

Somewhere in my study is a hefty folder containing, if not George Scialabba's complete oeuvre, then at least the bulk of it. After several years of reading and admiring his essays, I can testify that Divided Mind is a well-edited selection covering many of his abiding concerns. It ought to be of interest to anyone who cares about the "fourth genre," as the essay is sometimes called. (The other three -- poetry, drama, and fiction -- get all the glory.)

As noted, the publisher seems to be avoiding crass commercialism (not to mention convenience to the reader) by keeping Divided Mind out of the usual online bookselling venues. You can order it from the address below for $13, however. That price includes shipping and handling.

Arrowsmith
11 Chestnut Street
Medford, MA  02155

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Remember Baudrillard

A few days ago, I tried the thought experiment of pretending never to have read anything by Jean Baudrillard – instead trying to form an impression based only on media coverage following his death last week. And there was a lot more of it than I might have expected. The gist being that, to begin with, he was a major postmodernist thinker. Everyone agrees about that much, usually without attempting to define the term, which is probably for the best. It also seems that he invented virtual reality, or at least predicted it. He may have had something to do with YouTube as well, though his role in that regard is more ambiguous. But the really important thing is that he inspired the "Matrix" movie franchise.

A segment on National Public Radio included a short clip from the soundtrack in which Laurence Fishburne’s character Morpheus intones the Baudrillard catchphrase, “Welcome to the desert of the real.” The cover of Simulacra and Simulation -- in some ways his quintessential theoretical text, first published in a complete English translation by the University of Michigan Press in 1994 -- is shown in the first film. Furthermore, the Wachowski brothers, who wrote and directed the trilogy, made the book required reading for all the actors, including Keanu Reeves. (It is tempting to make a joke at this point, but we will all be better people for it if I don’t.)

There was more to Baudrillard than his role as the Marshall McLuhan of cyberculture. And yet I can’t really blame harried reporters for emphasizing the most blockbuster-ish dimensions of his influence. "The Matrix" was entertainment, not an educational filmstrip, and Baudrillard himself said that its take on his work “stemmed mostly from misunderstandings.” But its computer-generated imagery and narrative convolutions actually did a pretty decent job of conveying the feel, if not the argument, of Baudrillard’s work.

As he put it in an essay included in The Illusion of the End (Stanford University Press, 1994): “The acceleration of modernity, of technology, events and media, of all exchanges – economic, political, sexual – has propelled us to ‘escape velocity,’ with the result that we have flown free of the referential sphere of the real and of history.” You used to need digitalized special effects to project that notion. But I get the feeling of being “flown free of the referential sphere of the real and of history” a lot nowadays, especially while watching certain cable news programs.

Some of the coverage of Baudrillard’s death was baffled but vaguely respectful. Other commentary has been more hostile – though not always that much more deeply informed. A case in point would be an article by Canadian pundit Robert Fulford that appeared in The National Post on Saturday. A lazy diatribe, it feels like something kept in a drawer for the occasion of any French thinker’s death – with a few spots left blank, for details to be filled in per Google.

A tip-off to the generic nature of the piece is the line: “Strange as it seems, in the 1970s much of the Western world was ready to embrace him.” Here, Fulford can count on the prefab implication of a reference to that decade as a time of New Left-over radicalism and countercultural indulgence. In fact Baudrillard was little known outside France until the 1980s, and even then he had a very small audience until late in the decade. The strong mood coming from most of Baudrillard’s work is that of bitter disappointment that oppositional social movements of earlier years had been neutralized – absorbed into academic bureaucracy and consumer society, with no reason to think that they would revive.

And if we are going to play the game of periodization-by-decade, well, it is perhaps worth mentioning that “much of the Western world was ready to embrace him" only after several years of watching Ronald Reagan -- a man whose anecdotes routinely confused his roles in motion pictures with actual experiences from his own life -- in a position of great power. The distinction between reality and simulation had been worn away quite a bit, by that point. Some of Baudrillard’s crazier flights of rhetoric were starting to sound more and more like apt descriptions of the actual.

Even then, it was by no means a matter of his work persuading university professors “that novels and poems had become irrelevant as subject matter for teaching and research,” as the macro setting for culture-war boilerplate on Fulford’s computer puts it.

Enthusiasm for Baudrillard’s work initially came from artists, writers, and sundry ne’er-do-wells in the cultural underground. The post-apocalyptic tone of his sentences, the science-fictionish quality of his concepts, resonated in ways that at least some people found creatively stimulating, whether or not they grasped his theories. (True confession: While still in my teens, I started writing a novel that opened with an epigraph from one of his books, simply because it sounded cool.)

Baudrillard’s work played no role whatever in the debates over “the canon” to which Fulford alludes. But he was, in a different sense, the most literary of theorists. He translated Bertolt Brecht, among other German authors, into French. Some of his earliest writings were critical articles on the fiction of William Styron and Italo Calvino. In 1978, he published a volume of poems. And a large portion of his output clearly belongs to the literary tradition of the aphorism and the “fragment” (not an unfinished work, but a very dense and compact form of essay). These are things you notice if you actually read Baudrillard, rather than striking po-faced postures of concern about how literature should be “subject matter for teaching and research.”

Besides, it is simply untrue to say that Baudrillard’s reception among American academics was one of uncritical adulation. If there was a protracted lag between the appearance of his first books in the 1960s and the dawn of interest in his work among scholars here in the 1980s, that was not simply a matter of the delay in translation. For one thing, it was hard to know what to make of Baudrillard, and a lot of the initial reception was quite skeptical.

In the mid-1960s, he became a professor of sociology at the University of Paris at Nanterre, but the relationship of his work to the canon of social theory (let alone empirical research) is quite oblique. It’s also difficult to fit him into the history of philosophy as a discipline. Some of his work sounds like Marxist cultural theory, such as the material recently translated in Utopia Deferred: Writings for ‘Utopie’ 1967-1978 -- a collection distributed by MIT Press, a publisher known, not so coincidentally, for its books on avant-garde art. Still, there is plenty in Baudrillard’s work to irritate any Marxist (he grew profoundly cynical about the idea of social change, let alone socialism). And he delighted in baiting feminists with statements equating femininity with appearance, falsehood, and seduction.

Baudrillard was, in short, a provocateur. After a while that was all he was – or so it seemed to me, anyway. The rage of indignant editorialists notwithstanding, a lot of the response to Baudrillardisme amounted to treating him as a stimulating but dubious thinker: not so much a theorist as a prose-poet. A balanced and well-informed critical assessment of his work comes from Douglas Kellner, a professor of philosophy at UCLA, who wrote Jean Baudrillard: From Marxism to Postmodernism and Beyond (Stanford University Press, 1989), the first critical book on him in English. Kellner has provided me with the manuscript of a forthcoming essay on Baudrillard, which I quote here with permission.

“So far,” he writes, “no Baudrillardian school has emerged. His influence has been largely at the margins of a diverse number of disciplines ranging from social theory to philosophy to art history, thus it is difficult to gauge his impact on the mainstream of philosophy, or any specific academic discipline.”

At this point I’d interject that his questionable position within the disciplinary matrix (so to speak) tends to reinforce Baudrillard’s status as a minor literary figure, rather than an academic superstar. Kellner goes on to note that Baudrillard “ultimately goes beyond conventional philosophy and theory altogether into a new sphere and mode of writing that provides occasionally biting insights into contemporary social phenomena and provocative critiques of contemporary and classical thought. Yet he now appears in retrospect as a completely idiosyncratic thinker who went his own way....”

Not that Baudrillard exactly suffered for going his own way, however. A self-portrait of the postmodern intellectual as global jet-setter emerges in the five volumes of his notebook jottings published under the title “Cool Memories.” You get the sense that he spent a lot of time catching planes to far-flung speaking engagements – not to mention seeing various unnamed women out the door, once they had been given a practicum in the theories worked out in his book De la Séduction.

Many of the writings that appeared during the last two decades of his life simply recycled ideas from his early work. But celebrity is a full-time job.

One offer he did turn down was the chance to do a cameo in one of the Matrix sequels. (Instead, it was Cornel West who did his star turn onscreen as a gnomic philosophical figure.) Still, the appearance of "Simulacra and Simulation" in the first film greatly increased the book’s distribution, if not comprehension of its themes.

According to Mike Kehoe, the sales manager for the University of Michigan Press, which published the English translation, sales doubled in the year following “The Matrix.” The book had often been assigned in university courses. But those sales, too, jumped following the release of the film.

Rather than indulging my own half-baked quasi-Baudrillardian speculations about how his theories of media metaphysics were reabsorbed by the culture industry, I decided to bring the week’s musings to a close by finding out more about how the book itself ended up on screen.

“It wasn’t the usual sort of product placement,” LeAnn Fields, a senior executive editor for the press, told me by phone. “That is, we didn’t pay them. It was the other way around. The movie makers contacted us for permission. But they reserved the right to redesign the cover for it when it appeared onscreen.”

The familiar Michigan edition is a paperback with burgundy letters on a mostly white cover. “But in the film,” said Fields, “it became a dark green hardcover book. We were quite surprised by that, but I guess it’s understandable since it serves as a prop and a plot device, as much as anything.” (If memory serves, some kind of cyber-gizmo is concealed in it by Keanu Reeves.)

I asked Fields if the press had considered bringing out a special version of the book, simulating its simulation in a deluxe hardback edition. “No,” she said with a laugh, “I don’t think we ever considered that. Maybe we should have, though.”

Recommended Reading: Mark Poster's edition of Baudrillard's Selected Writings, originally published by Stanford University Press in 1988, is now available as a PDF document. The single best short overview of Baudrillard's work is Douglas Kellner's entry on him for the Stanford Encyclopedia of Philosophy. There is an International Journal of Baudrillard Studies that publishes both commentary on his work and translations of some of his shorter recent writings.

Scott McLemee

Requiem for a Heavyweight

Word that Richard Rorty was on his deathbed – that he had pancreatic cancer, the same disease that killed Jacques Derrida almost three years ago – reached me last month via someone who more or less made me swear not to say anything about it in public. The promise was easy enough to keep. But the news made reading various recent books by and about Rorty an awfully complicated enterprise. The interviews in Take Care of Freedom and Truth Will Take Care of Itself (Stanford University Press, 2006) and the fourth volume of Rorty’s collected papers, Philosophy as Cultural Politics (Cambridge University Press, 2007) are so bracingly quick-witted that it was very hard to think of them as his final books.

But the experience was not as lugubrious as it may sound. I found myself laughing aloud, and more than once, at Rorty’s consistent indifference to certain pieties and protocols. He was prone to outrageous statements delivered with a deadpan matter-of-factness that could be quite breathtaking. The man had chutzpah.

It’s a “desirable situation,” he told an interviewer, “not to have to worry about whether you are writing philosophy or literature. But, in American academic culture, that’s not possible, because you have to worry about what department you are in.”

The last volume of his collected papers contains a piece called “Grandeur, Profundity, and Finitude.” It opens with a statement sweeping enough to merit that title: “Philosophy occupies an important place in culture only when things seem to be falling apart – when long-held and widely cherished beliefs are threatened. At such periods, intellectuals reinterpret the past in terms of an imagined future. They offer suggestions about what can be preserved and what must be discarded.”

Then, a few lines later, a paradoxical note of rude modesty interrupts all the grandeur and profundity. “In the course of the 20th century," writes Rorty, "there were no crises that called forth new philosophical ideas.”

It's not that the century was peaceful or crisis-free, by any means. But philosophers had less to do with responding to troubles than they once did. And that, for Rorty, is a good thing, or at least not a bad one – a sign that we are becoming less intoxicated by philosophy itself, more able to face crises at the level (social, economic, political, etc.) at which they actually present themselves. We may yet be able to accept, he writes, “that each generation will solve old problems only by creating new ones, that our descendants will look back on much that we have done with incredulous contempt, and that progress towards greater justice and freedom is neither inevitable nor impossible.”

Nothing in such statements is new, of course. They are the old familiar Rorty themes. The final books aren’t groundbreaking. But neither was there anything routine or merely contrarian about the way Rorty continued to challenge the boundaries within the humanities, or the frontier between theoretical discussion and public conversation. It is hard to imagine anyone taking his place.

An unexpected and unintentional sign of his influence recently came my way in the form of an old essay from The Journal of American History. It was there that David A. Hollinger, now chair of the department of history at the University of California at Berkeley, published a long essay called “The Problem of Pragmatism in American History.”

It appeared in 1980. And as of that year, Hollinger declared, it was obvious that “‘pragmatism’ is a concept most American historians have proved that they can get along without. Some non-historians may continue to believe that pragmatism is a distinctive contribution of America to modern civilization and somehow emblematic of America, but few scholarly energies are devoted to the exploration or even the assertion of this belief.”

Almost as an afterthought, Hollinger did mention that Richard Rorty had recently addressed the work of John Dewey from a “vividly contemporary” angle. But this seemed to be a marginal exception to the rule. “If pragmatism has a future,” concluded Hollinger in 1980, “it will probably look very different from the past, and the two may not even share a name.”

Seldom has a comment about the contemporary state of the humanities ever been overtaken by events so quickly and so thoroughly. Rorty’s Philosophy and the Mirror of Nature (Princeton University Press, 1979) had just been published, and he was finishing the last of the essays to appear in Consequences of Pragmatism (University of Minnesota Press, 1982).

Not that the revival was purely Rorty's doing; some version of it might have unfolded even without his efforts. In such matters, the pendulum does tend to swing.

But Rorty's suggestion that John Dewey, Martin Heidegger, and Ludwig Wittgenstein were the three major philosophers of the century, and should be discussed together -- this was counterintuitive, to put it mildly. It created excitement that blazed across disciplinary boundaries, and even carried pragmatism out of the provinces and into international conversation. I'm not sure how long Hollinger's point that pragmatism was disappearing from textbooks on American intellectual history held true. But scholarship on the original pragmatists was growing within a few years, and anyone trying to catch up with the historiography now will soon find his or her eyeballs sorely tested.

In 1998, Morris Dickstein, a senior fellow at the City University of New York Graduate Center, edited a collection of papers called The Revival of Pragmatism: New Essays on Social Thought, Law, and Culture (Duke University Press) -- one of the contributors to it being, no surprise, Richard Rorty. “I’m really grieved,” Dickstein told me on Monday. “Rorty evolved from a philosopher into a mensch.... His respect for his critics, without yielding much ground to them, went well with his complete lack of pretension as a person.”

In an e-mail note, he offered an overview of Rorty that was sympathetic though not uncritical.

“To my mind," Dickstein wrote, "he was the only intellectual who gave postmodern relativism a plausible cast, and he was certainly the only one who combined it with Dissent-style social democratic politics. He admired Derrida and Davidson, Irving Howe and Harold Bloom, and told philosophers to start reading literary criticism. His turn from analytic philosophy to his own brand of pragmatism was a seminal moment in modern cultural discourse, especially because his neopragmatism was rooted in the 'linguistic turn' of analytic philosophy. His role in the Dewey revival was tremendously influential even though Dewey scholars universally felt that it was his own construction. His influence on younger intellectuals like Louis Menand and David Bromwich was very great and, to his credit, he earned the undying enmity of hard leftists who made him a bugaboo."

The philosopher "had a blind side when it came to religion," continued Dickstein, "and he tended to think of science as yet another religion, with its faith in empirical objectivity. But it's impossible to write about issues of truth or objectivity today without somehow bouncing off his work, as Simon Blackburn and Bernard Williams both did in their very good books on the subject. I liked him personally: he was generous with his time and always civil with opponents.”

A recent essay challenges the idea that Rorty “had a blind side when it came to religion.” Writing in Dissent, Casey Nelson Blake, a professor of history and American studies at Columbia University, notes that Rorty “in recent years stepped back from his early atheist pronouncements, describing his current position as ‘anti-clerical,’ and he has begun to explore, with increasing sympathy and insight, the social Christianity that his grandfather Walter Rauschenbusch championed a century ago.”

Blake quotes a comment by Rorty from The Future of Religion, an exchange with the Catholic philosopher Gianni Vattimo that Columbia University Press published in 2005. (It comes out in paperback this summer.)

“My sense of the holy,” wrote Rorty, “insofar as I have one, is bound up with the hope that someday, any millennium now, my remote descendants will live in a global civilization in which love is pretty much the only law. In such a society, communication would be domination-free, class and caste would be unknown, hierarchy would be a matter of temporary pragmatic convenience, and power would be entirely at the disposal of the free agreement of a literate and well-educated electorate.”

I'm not sure whether that counts as a religious vision, by most standards. But it certainly qualifies as something that requires a lot of faith.

Two items of great interest came to my attention too late to include in this column. One is the final interview with Rorty, conducted by Danny Postel just before the philosopher's death. The other is a tribute to Rorty by Jürgen Habermas.

Scott McLemee

If Not Religion, What?

In a variety of arenas, from politics to high schools, from colleges to the military, Americans argue as though the proper face-to-face discussion in our society ought to be between religion and science. This is a misunderstanding of the taxonomy of thought. Religion and science are in different families on different tracks: science deals with is vs. isn’t and religion, to the extent that it relates to daily life, deals with should vs. shouldn’t.

These are fundamentally different trains. They may hoot at each other in passing, and many people attempt to switch them onto the same track (mainly in order to damage science), but this is an act of the desperate, not the thoughtful.

It is true that a portion of religious hooting has to do with is vs. isn’t questions, in the arena of creationism and its ancillary arguments. However, this set of arguments, important as it might be for some religious people, is not important to a great many (especially outside certain Protestant variants), while the moral goals and effects of religious belief are a far more common and widespread concern among many faiths. I was raised in Quaker meeting, where we had a saying: Be too busy following the good example of Jesus to argue about his metaphysical nature.

Until recently, most scientists didn’t bother trying to fight with religion; for the most part they ignored it or practiced their own faiths. However, in recent years Carl Sagan, Richard Dawkins, Daniel Dennett and Sam Harris have decided to enter the ring and fight religion face to face. The results have been mixed. I have read books by all of these authors on this subject, as well as the interesting 2007 blog exchange between Harris and Andrew Sullivan, one of the best writers active today and a practicing Catholic, and it is clear that a great deal of energy is being expended firing heavy ordnance into black holes with no likelihood of much effect.

The problem that the scientific horsemen face is that theirs is the language of is/isn’t. Their opponents (mostly Christians but by implication observant Jews and Muslims as well) don’t use the word “is” to mean the same thing. To a religious person, God is and that’s where the discussion begins. To a nonreligious scientist, God may or may not be, and that is where the discussion begins.

The two sides, postulating only two for the moment, are each on spiral staircases, but the stairs wind around each other and never connect: this is the DNA of unmeeting thoughts. Only shouting across the gap happens, and the filters of meaning are not aligned. That is why I don’t put much faith, you’ll pardon the expression, in this flying wedge of scientific lancers to change very many minds.

Dennett’s approach is quite different from the others at a basic level; he views religious people as lab rats and wants to study why they squeak the way they do. That way of looking at the issue seems insulting at first but is more honest and practical in that it doesn’t really try to change minds that are not likely to change.

But these arguments are the wrong ones at a very basic level, especially for our schools and the colleges that train our teachers. The contrapuntal force to religion, that force which is in the same family, if a different genus, speaks the same language in different patterns regarding the same issues. It is not science, it is philosophy. That is what our teachers need to understand, and this distinction is the one in which education colleges should train them.

Those of us who acknowledge the factual world of science as genuine and reject the idea of basing moral and “should” questions in the teachings of religion are left seeking an alternate source for sound guidance. Our own judgment based in experience is a strong basic source. The most likely source, the ‘respectable’ source with sound academic underpinnings that can refine, inform and burnish our judgment, is philosophy in its more formal sense.

The word “philosophy” conjures in many minds the image of dense, dismal texts written by oil lamp with made-up words in foreign languages, and far beyond mortal ken. In fact, many writers on philosophy are quite capable of writing like human beings; some of their books are noted below.

When we introduce more religious studies into our K-12 schools, as we must if people are ever to understand each other’s lives, the family of learning into which they must go also contains philosophy. It is this conversation, between the varieties of religious outlooks and their moral conclusions, and the same questions discussed by major philosophers, that needs to happen.

Philosophy is not all a dense, opaque slurry of incomprehensible language. Some excellent basic books are available that any reasonably willing reader can comprehend and enjoy. Simon Blackburn’s Think, Robert Solomon and Kathleen Higgins’ A Passion for Wisdom and Erik Wielenberg’s Value and Virtue in a Godless Universe are some recent examples.

An older text providing a readable commentary on related issues is John Jay Chapman’s Religion and Letters, still in print in his Collected Works but hard to find in the original, single volume. Chapman wrote of changes in our school system:

“It is familiarity with greatness that we need—an early and first-hand acquaintance with the thinkers of the world, whether their mode of thought was music or marble or canvas or language. Their meaning is not easy to come at, but in so far as it reaches us it will transform us. A strange thing has occurred in America. I am not sure that it has ever occurred before. The teachers wish to make learning easy. They desire to prepare and peptonize and sweeten the food. Their little books are soft biscuits for weak teeth, easy reading on great subjects, but these books are filled with a pervading error: they contain a subtle perversion of education. Learning is not easy, but hard: culture is severe.”

This, published in 1910, is remarkably relevant to education at all levels today. The idea that philosophy is too hard for high school students, which I doubt, simply means that we need to expect more of students all through K-12. Many of them would thank us.

Paul Kurtz’s Affirmations and my brother John Contreras’s Gathering Joy are interesting “guidebooks” that in effect apply philosophical themes in an informal way to people’s real lives. There are also somewhat more academic books that integrate what amount to philosophical views into daily life such as Michael Lynch’s True to Life: Why Truth Matters, physicist Alan Lightman’s A Sense of The Mysterious and the theologian John O’Donohue’s Beauty: The Invisible Embrace.

Some of these are denser than others and not all are suited for public schools, but the ideas they discuss are often the same ideas discussed in the context of religions, and sometimes with similar language. It is this great weave of concepts that our students should be exposed to, the continuum of philosophical thought blended with the best that different religions have to offer.

The shoulds and shouldn’ts that are most important to the future of our society need to be discussed in colleges, schools and homes, and the way to accomplish this is to bring religions and philosophies back to life as the yin and yang of right and wrong. That is the great conversation that we are not having.

Alan Contreras

Alan L. Contreras has been administrator of the Oregon Office of Degree Authorization, a unit of the Oregon Student Assistance Commission, since 1999. His views do not necessarily represent those of the commission. He blogs at http://oregonreview.blogspot.com.

Becoming Richard Rorty

In the late 1940s, as Richard Rorty was finishing his undergraduate studies and considering a future as a professional philosopher, his parents began to worry about him. This is not surprising. Parents worry; and the parents of philosophers, perhaps especially. But just why Rorty's parents worried – well now, that part is surprising.

They were prominent left-wing journalists. His father, James, also had some minor reputation as a poet; and his mother, Winifred, had done important work on the sociology of race relations, besides trying her hand at fiction. In a letter, James Rorty speculated that going straight into graduate work might be something Richard would later regret. His son would do well to take some time “to discover yourself, possibly through a renewed attempt to release your own creative need: through writing, possibly through poetry....”

In short, becoming an academic philosopher sounded too practical.

Not to go overboard and claim that this is the defining moment of the philosopher’s life (Rosebud!). But surely it is the kind of experience that must somehow mark one’s deepest sense of priorities. How does that inner sense of self then shape a thinker’s work?

Neil Gross’s book Richard Rorty: The Making of an American Philosopher, to be published next month by University of Chicago Press, is not exactly a biography of its subject, who died last year. Rather, it is a study of how institutional forces shape an intellectual’s sense of personal identity, and vice versa. (Gross is currently in transit from Harvard University to the University of British Columbia, where as of this summer he will be an associate professor of sociology.)

Influenced by recent work in sociological theory – but with one eye constantly on the archive of personal correspondence, unpublished writings, and departmental memoranda – Gross reconstructs how Rorty’s interests and intellectual commitments developed within the disciplinary matrix of academic philosophy. He takes the story up through the transformative and sui generis work of Rorty’s middle years, Philosophy and the Mirror of Nature (1979) and Consequences of Pragmatism (1982).

This includes a look at Rorty’s complicated and unhappy relationship with his colleagues at Princeton University in the 1970s. “I find it a bit terrifying,” he wrote in a letter at the time, “that we keep turning out Ph.D.'s who quite seriously conceive of philosophy as a discipline in which one does not read anything written before 1970, except for the purposes of passing odd examinations.” Nor did it help that Rorty felt other professors were taking his ex-wife’s side in their divorce. (What’s the difference between departmental gossip and cultural history? In this case, about 30 years.)

Gross has written the most readable of monographs; and the chapter titled “The Theory of Intellectual Self-Concept” should be of interest even to scholars who aren’t especially concerned with Rorty’s long interdisciplinary shadow. I interviewed Gross recently by e-mail, just before he headed off to Canada. The transcript of our discussion follows.

Q: You identify your work on Richard Rorty not as a biography, or even as a work of intellectual history, but rather as an empirical case study in "the new sociology of ideas." What is that? What tools does a sociologist bring to the job that an intellectual historian wouldn't?

A: Sociology is a diverse field, but if I had to offer a generalization, I'd say that most sociologists these days aim to identify the often hidden social mechanisms, or cascading causal processes, that help to explain interesting, important, or counterintuitive outcomes or events in the social world. How and why do some movements for social change succeed in realizing their goals when others fail to get off the ground? Why isn't there more social mobility? What exactly is the connection between neighborhood poverty and crime? Few sociologists think anymore that universal, law-like answers to such questions can be found, but they do think it possible to isolate the role played by more or less general mechanisms.

Sociologists of ideas are interested in identifying the hidden social processes that can help explain the content of intellectuals' ideas and account for patterns in the dissemination of those ideas. My book attempts to make a theoretical contribution to this subfield. I challenge the approaches taken by two of the leading figures in the area -- Pierre Bourdieu and Randall Collins -- and propose a new approach. I think that the best sociological theory, however, has strong empirical grounding, so I decided to develop my theoretical contribution and illustrate its value by deeply immersing myself in an empirical case: the development of the main lines of Richard Rorty's philosophy.

This entailed doing the same kind of work an intellectual historian would do: digging through archives, reading through Rorty's correspondence and unpublished manuscripts (to which he granted me access), and of course trying to get a grasp on the diversity of Rorty's intellectual output for the period in question. This work is reflected in the first half of my book, which reads like an intellectual biography.

But the book isn't intended as a biography, and in the second half I try to show that thinking about Rorty's life and career in terms of the hidden social mechanisms at play offers unique explanatory leverage. I love intellectual history, but many intellectual historians are allergic to any effort at generalization. One of my aims in this book is to show them that they needn't be. The old sociology of knowledge may have been terribly reductive -- ideas are an expression of class interests or reflective of dominant cultural tendencies, etc., etc. -- but the sociology of ideas today offers much more fine-grained theoretical tools.

I only cover Rorty's life up until 1982 because by then most of the main lines of his philosophy had already been developed. After that point, he becomes for the sociologist of ideas a different kind of empirical case: an intellectual superstar and bête noire of American philosophy. It would be fascinating to write about the social processes involved with this, but that was too much for one book.

Q: This might seem like a chicken-or-egg question.... Did an interest in Rorty lead you toward this sociological approach, or vice versa?

A: When I was a graduate student in the 1990s I read quite a bit of Rorty's work, and found it both interesting and frustrating. But my interest in the sociology of ideas developed independently. For me, Rorty is just a case, and I remain completely agnostic in the book about the value of his philosophy.

Q: But isn't there something already a little bit pragmatism-minded about analyzing a philosopher's work in sociological terms?

A: It's certainly the case that there are affinities between pragmatism and the sociology of knowledge. But I'm not trying to advance any kind of philosophical theory of knowledge, pragmatist or otherwise. I believe, like every other sociologist of ideas, that intellectuals are social actors and that their thought is systematically shaped by their social experiences. Whether that has any philosophical implications is best left to philosophers to figure out.

I do think that the classical pragmatist philosophers Charles Peirce, William James, John Dewey, and George Herbert Mead had it right in their account of human social action, as Hans Joas has persuasively argued. Some of their insights do make their way into my analysis.

Q: A common account of Rorty's career has him starting out as an analytic philosopher who then undertakes a kind of "turn to pragmatism" in the 1970s, thereby reviving interest in a whole current of American philosophy that had become a preserve of specialists. Your telling is different. What is the biggest misconception embedded in that more familiar thumbnail version?

A: Rorty didn't start out as an analytic philosopher. His master's thesis at Chicago was on Whitehead's metaphysics, and while his dissertation at Yale on potentiality was appreciative in part of analytic contributions, one of its major aims was to show how much value there might be in dialogue between analytic and non-analytic approaches. As Bruce Kuklick has shown, dialogue between various philosophical traditions, and pluralism, were watchwords of the Yale department, and Rorty was quite taken with these metaphilosophical ideals.

Rorty only became seriously committed to the analytic enterprise after graduate school while teaching at Wellesley, his first job. This conversion was directly related to his interest in moving up in the academic hierarchy to an assistant professorship in a top-ranked graduate program. At nearly all such programs at the time, analytic philosophy had come to rule the roost. This was very much the case at Princeton, which hired him away from Wellesley, and his commitment to analytic philosophy solidified even more during the years when he sought tenure there.

But the conventional account is flawed in another way as well. It turns out that Rorty read a lot of pragmatism at Yale -- Peirce in particular -- and one of the things that characterized his earliest analytic contributions was a consistent interest in pointing out convergences and overlaps between pragmatism and certain recent developments in analytic thought. So when he finally started calling himself a pragmatist later in his career, it was in many respects a return to a tradition with which he had been familiar from the start, however much he might have come to interpret it differently than specialists in American philosophy would.

Q: You argue for the value of understanding what you call "the intellectual self-concept." Would you explain that idea? What does it permit us to grasp about Rorty that we might not otherwise?

A: As I've already suggested, my goal in this book was not simply to write a biography of Rorty, but also to make a theoretical contribution to the sociology of ideas. Surprising as it might sound to some, the leading figures in this area today -- to my mind Pierre Bourdieu and Randall Collins -- have tended to depict intellectuals as strategic actors who develop their ideas and make career plans and choices with an eye toward accumulating intellectual status and prestige. That kind of depiction naturally raises the ire of those who see intellectual pursuits as more lofty endeavors -- it was not for nothing that Bourdieu described his study, Homo Academicus, as a "book for burning."

I argue that intellectuals do in fact behave strategically much of the time, but that another important factor influencing their lines of activity is the specific "intellectual self-concept" to which they come to cleave. By this I mean the highly specific narratives of intellectual selfhood that knowledge producers may carry around with them -- narratives that characterize them as intellectuals of such and such a type.

In Rorty's case, one of the intellectual self-concepts that came to be terribly important to him was that of a "leftist American patriot." I argue that intellectual self-concepts, thus understood, are important in at least two respects: they may influence the kinds of strategic choices thinkers make (for example, shaping the nature of professional ambition), and they may also directly influence lines of intellectual activity. The growing salience to Rorty of his self-understood identity as a leftist American patriot, for example, was one of the factors that led him back toward pragmatism in the late 1970s and beyond -- or so I claim.

I develop in the book an account of how the intellectual self-concepts of thinkers form and change over the life course. Rorty took on the leftist American patriot self-concept pretty directly from his parents, and it became reactivated in the 1970s in response to political and cultural developments and also their deaths. My argument is that the sociology of ideas would do well to incorporate the notion of intellectual self-concept into its theoretical toolkit.

But I must say that my ambitions extend beyond this. Bourdieu and Collins are not just sociologists of ideas, but general sociological theorists who happened to have applied their models to intellectual life. Implicit in my respectful criticisms of them is a call to supplement and revise their general models as well, and to fold notions of identity and subjectivity back into sociological theory -- conceptualized in the specific way I lay out, which eclectically draws on Anglo-American social psychology, theories of narrative identity, the ego psychology of Erikson, and other sources.

Q: The philosopher's father, James Rorty, is reasonably well-known to cultural historians as one of the left-wing anti-Communist public intellectuals of the mid-20th century. Your account of his life is interesting, but I found a lot of it rather familiar. By contrast, the chapter on Richard Rorty's mother was a revelation. Winifred Rorty was clearly a remarkable person, and the question of her influence on her son seems very rich. What was it like to rediscover someone whose career might otherwise be completely forgotten?

A: It's well known that Rorty's mother, Winifred, was the daughter of social gospel theologian Walter Rauschenbusch. What's less well known is that she was a research assistant to the sociologist Robert Park at the University of Chicago. Winifred never entered academe -- she didn't formally enroll as a graduate student at Chicago, and in any event the opportunities for women on the academic labor market at the time were severely limited. Instead, after she left Chicago she worked, like her husband James, as a freelance writer and journalist. Her specialties were race riots and fashion. Very late in her life she wrote a biography of Park.

I ended up devoting one chapter each to Winifred and James because their influence on their son was profound, but also because theirs were fascinating stories that hadn't really been told before. Certainly there is no shortage of scholarship on the New York intellectuals -- a group of which they were loosely a part -- but both led remarkable and distinctive intellectual and writerly lives.

In the case of Rorty's mother I didn't set out to write about someone whose career might otherwise be forgotten, but I can say that it was a great pleasure to immerse myself in her papers and writings. Too often intellectual historians and sociologists of ideas alike focus their attention on the most prominent and "successful" thinkers, but feminist historians, among others, have helpfully reminded us that the stories of those whose careers have been stymied or blocked by discrimination or other factors can be every bit as rich and worth recovering.

Q: Suppose someone were persuaded to pursue research into Rorty's life and work after 1982, working from within the approach you call the "new sociology of ideas." What questions and problems concerning that period would you most want to see studied? What manner of archival resources or other documentary material would be most important for understanding the later phase of Rorty's career?

A: There are lots of questions about this period in Rorty's life that are worth pursuing, but I think one of the most important would be to figure out why Rorty struck a chord with so many people, was vehemently hated by others, and what role exactly his scholarship played in the more general revival of interest in classical American pragmatism that has taken place over the past twenty years or so. My book focuses primarily on the development of ideas, whereas this would be a question of diffusion and reception. I don't think it's possible to give an answer to the question without doing a lot of careful empirical research.

One would want to know about the state of the various intellectual fields in which Rorty's work was received; about the self-concepts and strategic concerns of those who responded to him positively or negatively; about the role of intellectual brokers who helped to champion Rorty and translate his ideas into particular disciplinary idioms; about the availability of resources for pragmatist scholarship; about the role played by scholarly organizations, such as the Society for the Advancement of American Philosophy, in doing the kind of organizational work necessary to lay the groundwork for an intellectual revival; and so on. Here again one might use Rorty as a window into a more general social phenomenon: the emergence of what Scott Frickel and I have called "scientific/intellectual movements," in this case a movement aimed at reviving an intellectual tradition that had long been seen as moribund.

Q: Rorty gave you access to his papers. The notes to your book cite e-mail exchanges you had with him. Any personal impressions that stick with you, beyond what you've had to say in the monographic format?

A: Although Dick and I never formed a friendship, he wrote to me not long after his diagnosis to tell me about it, and to suggest that if I had any unanswered factual questions about his life, I might want to consider asking them of him sooner rather than later.

Some might see this as reflecting a concern to manage his reputation, but he read drafts of the book and -- without commenting on the plausibility of my thesis -- never asked me to change a thing. I think what it shows instead is that he was an incredibly generous, kind, and decent man, even in his final hours; he didn't want to leave a young scholar in the lurch.

Whatever one thinks of Rorty's philosophy, those are qualities all intellectuals could stand to emulate, and live by even in the midst of intense disagreement.

Author: Scott McLemee

The Playboy Philosopher

When introduced to American audiences from the podium or by TV interviewers, Bernard-Henri Lévy is always called a philosopher -- a label that says less about the substance of his work than about the efficiency of modern public-relations techniques. Like Sartre, he is a graduate of the École Normale Supérieure. Unlike Sartre, he was formidably good-looking in his prime, and is aging gracefully. His haircuts are as thoughtful as his books are stylish. And in the spirit of Andy Warhol and Paris Hilton, Lévy has always grasped -- more profoundly, or at least more profitably, than any mere philosopher could -- an important truth: the media must constantly be fed.

Ten years ago, Pierre Bourdieu coined a term for certain French intellectuals whose writings counted for less than their TV appearances. He called them “les fast-thinkers.” Everyone knew who the sociologist had in mind as the prototype of this phenomenon. Long before the American public got used to hearing references to J-Lo and K-Fed, the French press had dubbed him BHL. His books, movies, TV appearances, political interventions, and romances have been a staple of the French media for more than three decades. But only in the past five years has he become as much a fixture in the U.S. media as in the French.

His latest opuscule -- called in translation Left in Dark Times -- has just appeared from Random House. Writing about it elsewhere, I failed to note something peculiar about this development. How is it that a volume of afterthoughts on last year’s French presidential election should appear -- in such short order, no less -- from a major commercial publisher in the United States?

It seems counterintuitive, and a matter for concern. Clearly it is time to reinvest in America’s fast-thinking infrastructure. Dependence on foreign sources of ideological methane is just too risky. Besides, as a couple of my far-flung correspondents have recently pointed out, the recent embrace of BHL by the American media is raising questions about just how gullible we really are.

Lauren Elkin, a Ph.D. candidate in English at CUNY Graduate Center and the Université de Paris VII, says that the very occasional links to BHL items on her blog tend to bring out the worst in her readers. One mention can be reliably predicted to yield 10 gripes.

“In Paris, it's just the done thing to bash BHL,” she tells me. “Recently I featured an awesome graphic that went along with a BHL piece on Sarah Palin in New York magazine -- an image of Palin getting bopped on the head with a baguette -- and I included a link to the NY mag article, because hey, I re-used their graphic, I owed them a link. The comments that followed amounted to taking the baguette and turning it on BHL!” (Well, at least it wasn’t a cream pie.)

Usually the expressions of exasperation are “all in good fun,” says Elkin. But one item at her blog -- linking to a BHL piece on Simone de Beauvoir -- provoked an exceptionally pompous display of aggravation from a French journalist.

“You and your fellow Americans,” he wrote, “should realize that BHL is not a philosopher but a clown and a buffoon. You want real French philosophy, read Derrida, Foucault, Badiou, Baudrillard, if you are a right winger, read Aron, but please forget about this pompous arrogant shmuck BHL and his unending and shameless self-promotion. As a Frenchman, I am ashamed of BHL.”

The notion that silly Americans are somehow responsible for Lévy’s prominence is a bit rich. By my estimate, his career has spanned more than a third of a century -- yet BHL, Inc., has had a fully staffed U.S. office for barely half a decade. (Note to Wikipedians: This is a figure of speech. No actual office exists, so far as I know.) And it is the work of a long, ill-spent day at the library to try to track down any discussion of his work by American intellectuals who take Lévy seriously as a philosopher. Our culture has its faults. This is not one of them.

“What really got me, as you can probably guess,” says Elkin, “was the ‘you Americans’ bit and the implication that as such we could not possibly tell Derrida from Aron, much less evaluate BHL for ourselves.” All the more galling, perhaps, given that Elkin has never concerned herself with BHL’s books. “I've been too busy reading Derrida and Foucault, so pat me on the head,” she told her blog’s interlocutor.

Given her own neglect of the playboy’s philosophy, Elkin says she “really can't comment on whether the bashing is appropriate.” But she suspects the strong feelings Lévy’s work provokes are a cultural phenomenon. “The French disdain for BHL is reflective of an inherent distaste for blatant self-promotion; as for the non-French who read my blog and write in with these comments, hating on BHL is as good a way as any to fit in.”

In an incisive review published a couple of years ago, Doug Ireland cited a critical analysis of BHL’s oeuvre, characterizing him as “a philosopher who’s never taught the subject in any university, a journalist who creates a cocktail mingling the true, the possible, and the totally false, a patch-work filmmaker, a writer without a real literary oeuvre....”

Yet Lévy swims in the main currents of European culture, and does not sink. If anything, he belongs on the short list of the world’s best-known intellectuals. How is that possible?

It seemed like a good question to pose to Arthur Goldhammer, a canny observer of French politics and culture who chairs the seminar for visiting scholars at the Center for European Studies at Harvard University. He responded to my inquiry with an e-mail note -- albeit one that amounted to a judicious essay on the mystery of BHL.

“How does he pull it off?” wrote Goldhammer. “First, it must be recognized that he's not a total fraud. Though a wretched scholar, he is neither stupid nor uneducated. His rhetoric, at least in French, has some of the old Normalien brilliance and flair. He had the wit to recognize before anyone else that a classic French role, that of the universal intellectual as moral conscience of the age, had become a media staple, creating a demand that a clever entrepreneur could exploit. He understood that it was no longer necessary first to prove one's mettle in some field of literature, art, or thought. I think that someone once said of Zsa Zsa Gabor that she was ‘famous for being famous.’ Lévy realized that one could be famous for being righteous, and that celebrity itself could establish a prima facie claim to righteousness.”

Righteous or not, BHL is certainly timely. His denunciations of Communism in the late 1970s were hardly original. But they appeared as the radical spirit of May ‘68 was exhausting itself -- and just before the Soviet invasion of Afghanistan and the Chinese party’s own denunciations of late-period Maoism. BHL developed a knack for showing up in war zones and sending out urgent dispatches. Last month he did a toe-touch in Georgia following the Russian invasion -- filing an article that was impassioned, if, it seems, imaginative.

“He chooses his causes shrewdly,” continues Goldhammer. “He may not have been the first to divine the waning of revolutionary radicalism, but he made himself revisionism's publicist. He has a knack for placing himself at the center of any scene and for depicting his presence as if it were what rendered the scene important.... His critics keep him constantly in the limelight and actually amplify his voice, and why should a ‘philosopher’ of universal range stoop to respond to ‘pedants’ who trouble the clarity of his vision with murky points of detail?”

And so he has acquired a sort of power that survives all debunking. If the topic of BHL comes up at “a typical dinner party of Parisian intellectuals,” says Goldhammer, seven of the guests will be sarcastic. “But the eighth, enticed by the allure of making a brilliant defense of a lost cause, a venerable French oratorical tradition, will launch into an elaborate defense beginning, ‘Say what you will about the man, and I wouldn't contradict a word of it, but still you must admit that for the Chechens (or Bosnians or Georgians or boat people or insert your favorite cause here), he has not been without effect.’

“The French love their litotes,” Goldhammer continues (rhetoric lesson here), “and of course no one can say that BHL has been without effect, that he has probably done more good for someone somewhere than most of us, so the revilers are reduced to sheepish silence for fear of appearing heartless.”

The role of the intellectual as famous, full-time spokesman for the Universal is well-established in France. It began with Voltaire and culminated in Sartre, its last great exemplar. (Not that other philosophers have not emerged in the meantime, of course, but none has occupied quite the same position.) From time to time, Lévy has mourned the passing of this grand tradition, while hinting, not too subtly, that it lives on in him. Clearly there is a steady French market for his line in historical reenactments of intellectual engagement.

It seems surprising, though, to find the BHL brand suddenly being imported to these shores after years of neglect -- particularly during a decade when Francophobia has become a national sport.

But like the song says, there’s a thin line between love and hate. Lévy has capitalized on American ambivalence towards France -- the potential of fascination to move from “-phobia” to “-philia” -- by performing a certain role. He is, in effect, the simulacrum of Sartre, minus the anti-imperialism and neo-Marxism.

“Lévy plays on both registers,” explains Goldhammer. “At the height of anti-French feeling in the U.S., in the period just before the Iraq War, he positioned himself as a philo-American. He made himself the avenger of Daniel Pearl. Arrogant he might be, airily infuriating in just the right way to confirm the philistine's loathing of the abstract and abstruse that philosophy is taken to embody, and yet there he was, pouring scorn on "Islamofascism" and touring the country with the New Yorker reader's nonpareil Francophile, Adam Gopnik.... Lévy chose his moment well. He insinuated himself into the American subconscious by playing against type.”

This is savvy. Also, convenient for journalists. BHL has now become “the respectable media's go-to guy whenever a French opinion is needed.” Goldhammer cites a recent article in The New York Times in which Lévy, like the presidents of Pakistan and Chile, was quoted “as an exemplar of what ‘the world’ wants to know from the next American president.” Get in the right Rolodex, it seems, and you are the embodiment of cosmopolitanism itself.

“To those familiar with the sad nullity of Lévy's work,” says Goldhammer, “this is infuriating, but to protest is only to perpetuate the folly. His celebrity is a bubble that must be allowed to burst, but we can be sure that when it does, no crisis will ensue.”

Author: Scott McLemee

'Examined Life'

Wandering around the Lyceum with an entourage, Aristotle would hold forth on his conception of the universe: one in which God is the Unmoved Mover, while all else shuttles between the potential and the actual. Part of what we know about Aristotle’s thought comes via notes from those lectures. (You picture a student scribbling furiously as the philosopher pauses to dislodge a stone from his sandal.)

This picture does not square with the usual notion of intellectual activity, which is a cross between Descartes’s self-portrait (the cogito talking to itself in a warm room) and Rodin’s nude dude. But there is a counter-tradition in philosophy -- one which takes thought to be, in essence, shambolic.

“A sedentary life is the real sin against the Holy Spirit,” says Nietzsche, blaspheming tongue not entirely in cheek. “Only those thoughts that come by walking have any value.” And more recently, Martha Nussbaum has insisted that running is an organic part of the philosopher’s professional ethos: “Lawyers tend to be tennis and squash players -- maybe it's the competitive element -- but philosophers tend to be runners, perhaps because of the loner, contemplative quality."

All of this by way of introduction to "Examined Life," the latest documentary by Astra Taylor, whose Žižek! now turns up on the Sundance Channel from time to time. Taylor’s camera follows nine thinkers of various disciplinary extractions as they walk on the street, ride in the backseat of a car, paddle around the pond in New York’s Central Park, and haul luggage around an international airport. They speak for about 10 minutes each -- sometimes in dialogue with Taylor or one another, sometimes in peripatetic soliloquy.

The trailer for "Examined Life" is now up on YouTube, though viewers should be warned against trying to form an impression of the film from it. "Examined Life" is more than an anthology of short lectures by famous talking heads. Taylor's intelligence as a documentarian extends to both content and form. The film is put together with a subtlety and wit that two minutes of highlights cannot capture. And she has not only scouted interesting or appropriate settings for her subjects (Anthony Appiah discussing cosmopolitanism in an airport, Slavoj Žižek challenging liberal environmentalism in a trash dump) but found common themes and points of implicit conflict among them.

But then Taylor takes another step. What might seem like a gimmick (the “philosopher-in-the-street” interview format, as I called it when blogging about the trailer last week) becomes a way to reflect on questions of context, meaning, and mobility. She does not explicitly mention Aristotle and Nietzsche, but the allusions are there, even so. Confirmation of this comes in her introduction to a book that The New Press will publish this June, based on interviews that Taylor did for the film. There, she cites another inspiration for her approach: Rousseau’s Reveries of a Solitary Walker.

One of the figures onscreen is her sister Sunaura Taylor, an artist and writer -- shown zipping through downtown San Francisco in her wheelchair with the queer theorist Judith Butler. They discuss what it means for a disabled person to “go for a walk” (and to insist on using that language even when it involves a motor). I don’t dare try to paraphrase the exchange. The segment, which comes near the end of "Examined Life," is beautiful, fascinating, and transformative. It changes the context of all that has gone before in the film, and leaves the everyday world looking strange and new.

Photo: Zeitgeist Films -- Sunaura Taylor (left) and Judith Butler, in "Examined Life"

A couple of years ago, Tamara Chaplin, an assistant professor of history at the University of Illinois at Urbana-Champaign, published an absorbing book called Turning on the Mind: French Philosophers on Television (University of Chicago Press). It analyzed more than half a century of efforts to put abstract thought on screen. For the United States, no such monograph is necessary or, indeed, possible. The subject could be covered in a treatise the size of a take-out menu for a Chinese restaurant.

In short, Astra Taylor seems to be inventing her own genre of documentary film -- which means she is making it up as she goes along. After pestering her for an early DVD of "Examined Life," I followed up with a string of questions by e-mail about how she conceived the idea and put together the finished product.

When approaching potential participants, she described the project as “a feature length film consisting of a series of short contemplative 'walks' with world-renowned thinkers from various branches of philosophy." The formal challenge was to avoid an overly didactic approach. Getting thinkers out into public space was only part of this; it was also a matter of mode of address.

“When I first conceived the project,” says Taylor, “it was very clear to me that I wanted to try to make viewers feel like they were being engaged directly, or that they were part of a conversation even if there's only one person speaking on screen. So while half the subjects are doing direct address to the camera, the other half are actually talking to me (or in the case of Judith Butler and Sunaura Taylor, to each other). I didn't want the audience to feel lectured at, but this was difficult since the movie is monologue driven. The way the movie is directed and edited tries to make some space for viewers to insert themselves, both into the discourse and the environment.”

How did she decide who should appear on screen? “I looked for subjects whose work I value,” she responded, “who have made some sort of effort to speak to an audience outside of the academy, who focus on ethical issues, who seemed like they may enjoy the experience. The final requirement was absolutely essential. If the act of filming isn't fun, isn't a pleasure of some kind, the finished project will feel burdensome, stagnant. Slavoj Žižek, being such a movie buff, certainly brought his cinematic enthusiasm to the making of Žižek! and that was truly invaluable. I was pleasantly surprised by the energy, playfulness, and sense of spectacle of everyone who appears in "Examined Life".... It was important to me to achieve a certain diversity, not only in terms of intellectual outlook but also in regards to race, gender, age, ability, et cetera. But at a certain point it was just an intuitive sense that the cast made sense and that they would bounce well off one another.”

Photo: Kwame Anthony Appiah, in "Examined Life"

Everyone approached expressed a willingness to participate, but things did not always work out. The Marxist cultural critic Terry Eagleton was busy, and far away. Charles Taylor broke his arm. (I resist the temptation to ask if he didn’t just sprain it from trying to pick up a stack of his own, ever-longer books.)

Taylor filmed “between 90 minutes and four hours of talking footage for each philosopher," she says, "shot over one or two days.” It then took “about two weeks to get a rough cut of each individual walk,” followed by a couple of months of work to shape the larger film. That meant “sequencing and refining, trying to tease out and highlight recurring themes, and also to figure out some sort of ‘narrative arc’ in a movie that lacks plot or chronology. How to make viewers feel they've been on a journey when there’s really no beginning, middle, or end to the tale?”

The result feels like a cinematic essay, instead of an educational filmstrip. It is the product of a sustained engagement with the figures onscreen, an effort to elucidate what they think and how they argue.

“I always had a bunch of prepared questions or talking points that I thought would guarantee usable material,” Taylor says. “Occasionally we worked out the brief argument we wanted to make in advance, though just as often the interview was free-floating, jumping from topic to topic, the central idea to be discovered in the editing room. Obviously a lot of material didn't make it into the final movie, which is why I decided to do the companion book.”

The project, she writes in the introduction to that volume, “doesn’t wrap everything up or pretend to provide a definitive answer to the difficult issues addressed in it; after all, if our answers were incontrovertible, we wouldn’t need philosophy.... If this effort inspires some people to pause and ponder how they come to hold the beliefs they do, to question the ethical assumptions and preconceptions they take for granted, to reconsider their responsibilities to others, or to see a problem in a new way, I’ll be content.”

(A list of playdates for "Examined Life" is available online.)

Author: Scott McLemee

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis to work in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem that have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly from researchers who fear that scholarship will be placed in the service of war and counterinsurgency in Iraq and Afghanistan, yielding ideologically distorted work.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies," according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative or block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, much reformed, represents a model upon which future university-government interaction might be built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration take? Several immediate steps suggest themselves. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as the scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to more effective policies. Special care must be taken to ensure that scholarly standards are not compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy-relevant, though still rigorous, scholarship. Fourth, presidents of all types of institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or the exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on and contribute to the solution of the massive public conundrums the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is a luxury universities cannot currently afford. The present conjuncture requires enhanced public engagement; the stakes are simply too high.

Author: Gabriel Paquette

Gabriel Paquette is a lecturer in the history department at Harvard University.

Of Love

For most of us, any effort to philosophize about love would be an invitation to embarrassment. And not simply because of any limitations in our conceptual apparatus, or the considerable difficulty in avoiding cliches. Here, speculation soon runs to confession. The autobiographical strata of our thoughts do not stay buried for long. Much wisdom in this arena is, after all, the product of mistakes, if not necessarily of disillusionment.

This is true even if you are happy in love -- maybe especially then. It means the regrets have been sublated; you’ve made something out of them, which is no small feat. I won’t get any more memoiristic here than that ... except to say that luck has a lot to do with it.

At the other extreme from our episodic private fumblings towards meaning in such matters, we have the work of Irving Singer, a professor of philosophy at MIT. In 1966 he published The Nature of Love: From Plato to Luther -- the first volume of what became a trilogy covering thinking on the subject through the 20th century. Singer is thorough. Also unrelenting: since completing the trilogy in 1987, he has devoted more books and papers to the subject. The MIT Press has just published his Philosophy of Love: A Partial Summing-Up -- a short, swift book that serves as a kind of portico to the Irving Singer Library, a series that will reprint his collected works (including the trilogy) in a uniform edition.

My own previous exposure to Singer’s work had been limited to The Pursuit of Love (Johns Hopkins University Press, 1994). But even from this narrow sampling, it was obvious that the durability of his concern reflects, in part, the scope of the subject, which is deep and broad. Once in love, you don’t get back out easily, or maybe ever. This is true of intellection about it, as well as the experience itself.

And while the new book is a scholarly rather than an intimate self-portrait, Singer makes clear that his interest in the history of philosophical efforts to think about love had a personal dimension. “I was motivated,” he writes, “by anxieties, confusions, unresolved ambivalences within myself as a human being and not merely as a thinker.... I felt that I could overcome the dilemmas in my own affective life by a careful, albeit plodding, analysis of what matters to everyone.”

We are spared the details, though that seems just as well. Philosophy of Love is intellectual autobiography in a casual register, but it does not offer false intimacy. Its approach takes for granted that the reader will share Singer’s concerns by virtue of a common share in the human condition.

That may sound presumptuous. Who’s to say there is a common human condition, anyway? There is some basis for this reservation: Singer’s literary and philosophical references are exclusively Western, nearly all of the figures he cites are male, and his conception of sexuality tends to be (as a clumsy word has it) heteronormative. I do not point this out in a spirit of political correctness. I point it out in the spirit of being an adult -- one who has lived long enough (and among enough sorts of people) to have lost the illusion that he can understand very much about how other people experience the world. So isn’t Singer in danger of thinking on the basis of too narrow a set of references?

The short answer here is that yes, he is -- but he knows it, and remains open to argument and emendation. For the thrust of Singer’s reflection is always towards showing that the nature of love is far more complex than any of the available notions for understanding it would make it seem. He is an empiricist of the heart.

“I don’t think,” Singer writes, “that large-scale terms like love, happiness, meaning of life, meaning in life, sex, beauty, and such, are able to have any one definition. These phenomena are so enormous within our human nature -- and the same is true of what we even mean by human nature -- that we cannot justifiably constrict them within a single, fixed, and all-embracing definition.... There will always be realities of feeling and experience that do not fit.”

What this leaves for the philosopher to do, then, is a more or less open-ended process of analyzing the inherited ideas about love (from Plato, Shakespeare, Freud, Sartre, etc.) in a way that is not simply a form of intellectual history -- for the ideas are woven into our experience in ways that are terrifically subtle and tenacious.

A case in point is the doctrine that love is a desire for merger between two people. It is embodied in the myth recounted by Aristophanes in Plato’s “Symposium”; aspects of it can be found in Freudian theory. “There is a kind of romanticism that predicates a basic hunger in everyone for some such fusion,” says Singer. “Without denying the frequency of this aspiration, I see little reason to think that it is characteristic of all forms of romantic attachment, and I’m sure it is not fulfilled in any actual cases of love.”

We are, after all, ineluctably distinct: “The most that can happen is that because you think you’re merging, you end up falsifying ingredients in the reality of your relationship.” Some people figure this out; others never do. Either way, the power of the desire is not lessened by millennia of metaphysical speculation, not to mention half the popular songs ever written. It feeds “the overwhelming and quasi-religious emotionality that men and women may get from love, particularly sexual love” -- but without recognizing another dimension of it, which is much more complex.

This Singer calls “bestowal” -- one of his few terminological innovations, resting on a contrast with “appraisal” that becomes all the richer as he pursues it. Appraisal he defines as “the ability to discover value, in oneself or in other people.” This may or may not be passionate; for that matter, it can be quite self-interested and even utilitarian. The very term, with its mercantile overtones, suggests as much: “At the level of mere appraisal, we are all commodities for each other.”

By contrast, what Singer calls bestowal involves “an engendering of value by means of the relationship we have established, by means of one’s appreciative attitude toward the person, thing, or ideal to which we attend. It’s a kind of projection. It’s a creating of affective value, both in oneself and in the other....”

Appraisal identifies a value. Bestowal surpasses existing values, generating something new, something greater than the sum of its parts. It is a gift -- one that you receive in the act of giving. And being in the nature of creativity itself, bestowal is open-ended, with no fixed rules for its possible forms. This is a vast idea, bigger than romantic love. Happy is the person who finds both together.

Author: Scott McLemee

Kass Backwards

Last week Leon Kass, chairman of the President's Council on Bioethics under President Bush, took to the podium to deliver the Jefferson Lecture of the National Endowment for the Humanities -- an event I did not go to, though it was covered by one of IHE's intrepid reporters.

My reluctance to attend suggests that, without noticing it, I have come to accept Kass’s best-known idea, “the wisdom of repugnance.” There is, alas, all too little evidence I am getting any wiser with age -- but my visceral aversion to hearing a Bush appointee talk about human values is inarguable.

As you may recall, Kass wrote in the late 1990s that the repugnance provoked by biotechnological developments such as cloning is “the emotional expression of deep wisdom, beyond reason’s power fully to articulate it.” In our rising gorge, he insisted, “we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear.... Shallow are the souls that have forgotten how to shudder.”

Judged simply as an argument, this is not, let’s say, apodictically persuasive. Anyone who has ever taken an introductory anthropology course, or read Herodotus -- or gone to a different part of town -- will have learned that different groups feel disgust at different things. The affect seems to be hard-wired into us, but the occasions provoking it are varied.

Kass invoked the "wisdom of repugnance" a few years before he joined an administration that treated the willingness to torture as a great moral virtue -- meanwhile coddling bigots for whom rage at gay marriage was an appropriate response to “the violation of things that we rightfully hold dear.”

Now, as it happens, some of us do indeed feel disgust at one of these practices, and not at the other. We also suspect that Kass’s aphorism about the shallowness of souls that have forgotten how to shudder would make a splendid epigraph for the chapter in American history that has just closed.

In short, disgust is not quite so unambiguous and inarguable an expression of timeless values as its champion on the faculty of the University of Chicago has advertised. Given a choice between “deep wisdom” and “reason’s power fully to articulate,” we might do best to leave the ineffable to Oprah.

There is no serious alternative to remaining within the limits of reason. Which means argument, and indeed the valuing of argument -- however frustrating and inconclusive -- because even determining where the limits of reason lie tends to be very difficult.

Welcome to modernity. It’s like this pretty much all the time.

The account of Kass's speech in IHE -- and the text of it, also available online -- confirmed something that I would have been willing to wager my paycheck on, had there been a compulsive gambler around to take the bet. For I felt certain that Kass would claim, at some point, that the humanities are in bad shape because nobody reads the “great works” because everybody is too busy with the “deconstruction.”

It often seems like the culture wars are, in themselves, a particularly brainless form of mass culture. Some video game, perhaps, in which players keep shooting at the same zombies over and over, because they never change and just keep coming -- which is really good practice in case you ever have to shoot at zombies in real life, but otherwise is not particularly good exercise.

The reality is that you encounter actual deconstructionists nowadays only slightly more often than zombies. People who keep going on about them sound (to vary references a bit) like Grandpa Simpson ranting about the Beatles. Reading The New Criterion, you'd think that Derrida was still giving sold-out concerts at Che Stadium. Sadly, no.

But then it never makes any difference to point out that the center of gravity for argumentation has shifted quite a lot over the past 25 years. What matters is not actually knowing anything about the humanities in particular -- just that you dislike them in general.

The logic runs something like: “What I hate about the humanities is deconstructionism, because I have decided that everything I dislike should be called ‘deconstructionism.’ ” Q.E.D.!

Kass complained that people in the humanities fail to discuss the true, the good, and the beautiful; or the relationships between humanity, nature, and the divine; or the danger that comes from assuming that technical progress implies the growth of moral and civic virtue. Clearly this is a man who has not stopped at the new books shelf in a library since the elder George Bush was Vice President.

And so last week’s Jefferson lecture was, perhaps, an encouraging moment, in spite of everything. With it, Leon Kass was saying farewell to Washington for, with any luck, a good long while. Maybe now he can spend some time catching up with the range of work people in the humanities have actually been doing. At the very least he could read some Martha Nussbaum.

Then he might even pause to reflect on his own role as hired philosopher for an administration that revived one of the interrogation techniques of the Khmer Rouge. The wisdom of repugnance begins at home.

Author: Scott McLemee
