Essay on what's missing in discussion of the humanities

Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.

America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.

Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”

Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.

In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.

Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.

By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:

To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.

This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.

In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.

Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78


Essay on Colin McGinn, reviewing, and the perils of paraphrase

Intellectual Affairs

In some fields, the ground tone of reviews is normally subtle and soothing: the sound of logs being gently rolled. Poets are especially prone to giving one another gold stars for effort. Journals devoted to contemporary art have cultivated a dialect in which even the syntax is oblique. The reviewers’ judgments (if that is what they are) resist paraphrase.

Philosophy is another matter. Extremely critical and even brutal book reviews, while hardly the norm, are at least an occupational hazard. The drubbing may be delivered in measured terms and an even tone. But the really memorable assessments nip near the jugular vein, when not ripping it wide open. Consider, for instance, this classic, from a notice appearing in the venerable British journal Mind, in 1921: “[The author’s] method of exegesis consists, in fact, of a combination of the suppressio veri with the suggestio falsi, both, of course, practised in the absolute good faith which comes from propagandist enthusiasm unchecked by any infusion of historical sense.... It is because Mr. Urwick's book is one long dogmatising without knowledge that I feel bound to put it on record that of all bad books on Plato his is the very worst.”

In a subsequent issue, Mr. Urwick called it “a delightfully abusive review,” which in the science of fistics is called “taking it on the chin.”

An enterprising publisher could put together an anthology of such colorful passages of philosophers shellacking one another’s work. The anthologist would need to limit the scope to reviews from professional journals; invective from blogs or general-interest publications would render the volume unwieldy. Another challenge would be limiting how much space it devotes to reviews by Colin McGinn.

As aggressive in the stately pages of The Philosophical Review as when writing for The New York Review of Books, McGinn set a new benchmark for philosophical savaging in 2007 with his comments on Ted Honderich's On Consciousness. But his published opinion of the book (“the full gamut from the mediocre to the ludicrous to the merely bad.... painful to read, poorly thought out, and uninformed”) is by no means as rough as it could have been.  “The review that appears here is not as I originally wrote it,” reads McGinn’s note accompanying the piece in The Philosophical Review. “The editors asked me to 'soften the tone' of the original; I have done so, though against my better judgment.”

The men were once colleagues at University College London, as it somehow proves unsurprising to learn, and they have a long history of personal and intellectual hostilities, during which Honderich gave as good as he got. (Honderich is now professor emeritus of philosophy of mind and logic. McGinn was professor of philosophy at the University of Miami until his recent resignation.)

The review, and its prehistory and aftermath, inspired an interesting and unusual paper in the Journal of Consciousness Studies. It elucidates the technical questions at issue while also bringing the distance between gossip and intellectual history to an all-time low. McGinn is prolific, and I have not kept up with his work in the two years since writing about The Meaning of Disgust in this column. But in the meantime, certain of his writings have generated even more attention and commentary than his philippic against Honderich did.

The prose in question took the form of email and text messages to a research assistant in which he allegedly told her he “had a hand job imagining you giving me a hand job.” (Here is one account of the matter, behind a paywall, though you can also find a PDF of the article for free at this site; and here is another.) I attach the word “allegedly” per protocol, but McGinn has defended his use of the expression “hand job” as part of the philosophical banter around his work on a theory “that ostension and prehension are connected and that the mind is a ‘grasping organ,’ ” as the abstract for a lecture he gave last year puts it. Hence, most forms of human activity are -- ultimately, in a certain sense – “hand jobs.” He has issued a manifesto.

The new issue of Harper’s magazine reprints, under the title “Out on a Limb,” a blog post by McGinn from June 2013 in which he explains: “I have in fact written a whole book about the hand, Prehension, in which its ubiquity is noted and celebrated… I have given a semester-long seminar discussing the hand and locutions related to it. I now tend to use ‘hand job’ in the capacious sense just outlined, sometimes with humorous intent…. Academics like riddles and word games.”

Some more than others, clearly. McGinn then considers the complexity of the speech-act of one professional glassblower asking another, “Will you do a blow job for me while I eat my sandwich?” The argument is that nothing he did should be regarded as sexual harassment of a graduate student, and that the real victim here is McGinn himself: “One has a duty to take all aspects of the speech situation into account and not indulge in rash paraphrases. And one should also not underestimate the sophistication of the speaker.”

Nor overestimate the usefulness of sophistication as a shovel, once one has dug oneself into a hole and needs to get back out. McGinn subsequently thought the better of this little essay and deleted it from his blog, but the Harper’s “Readings” section preserves it for posterity. Life would be much simpler if good judgment weren’t so tardy at times.

The whole matter might reveal its true philosophical depths once Prehension is available. But Amazon doesn’t list it as forthcoming, and lately McGinn’s name seems to come up most often in discussions of sexual harassment, or of the tendency of philosophy as a discipline to resemble the Little Rascals’ treehouse.

But a recent review in Mind (the journal that gave Urwick such delight in 1921) might shift attention away from McGinn’s alleged peccadillos and the hazards of paraphrase. Arguably it raises the bar higher than even his critique of Honderich did. It starts out with relatively understated and rather donnish clucking over the author’s transgression of specialist boundaries. By the end, the gloves are off:

“As was said of the Sokal hoax, there is simply no way to do justice to the cringe-inducing nature of this text without quoting it in its entirety. But, in a nutshell, Basic Structures of Reality is an impressively inept contribution to philosophy of physics, and one exemplifying everything that can possibly go wrong with metaphysics: it is mind-numbingly repetitive, toe-curlingly pretentious, and amateurish in the extreme regarding the incorporation of physical fact. With work this grim, the only interesting questions one can raise concern not the content directly but the conditions that made it possible; and in this connection, one might be tempted to present the book as further evidence of the lack of engagement of metaphysicians with real science — something that has lately been subject to lively discussion (and I myself have slung some of the mud). But I would insist that to use this work to make a general point about the discipline would in fact be entirely unfair...

“For all the epistemic faux-modesty that this book purports to defend, the image that persists while grinding through its pages is of an individual ludicrously fancying themselves as uniquely positioned to solve the big questions for us, from scratch and unassisted, as if none of the rest of us working in the field have had anything worth a damn to contribute. It will however be clear by now that I take the reality to be substantially different. For me, then, the one pertinent question this work raises is why all of this went unrecognized: this book, after all, issues not from one of the many spurious publishing houses currently trolling graduate students, but Oxford University Press — a press whose stated aim is to ‘publish works that further Oxford University’s objective of excellence in research, scholarship, and education’. So why did they publish this?”

The reviewer ventures an explanation: The author of the offending volume “is a ‘big name’; and if that is sufficient for getting work this farcical in print with [Oxford University Press], then shame on our field as a whole.” The book could well provoke a worthwhile discussion, though sadly one focused on concerns rather different from those he himself had in mind.

I came across this takedown within about an hour of reading the blog post reprinted by Harper’s. The author is Colin McGinn – the author of the book in question, that is, Basic Structures of Reality: Essays in Meta-Physics (Oxford, 2011). The reviewer is Kerry McKenzie, a postdoctoral fellow in philosophy at the University of Western Ontario. The piece in Mind is only her second review-essay, but I’d say it’s one for the anthology. (Note: This essay has been updated from an earlier version to correct Kerry McKenzie's current institution.)


Review of Julia Haslett, 'An Encounter with Simone Weil'

Philosophers have lives; saints have legends. No miracles are associated with Simone Weil (1909-43), and a glance at reference books on philosophy finds her listed alongside Jean-Paul Sartre and Simone de Beauvoir, her classmates at the  École normale supérieure. But she looks odd in their company -- in any company, really. Frail but indomitable, she was in the world but not quite of it. 

French intellectuals of her generation wrote essays on Marxism and the Spanish Civil War, while she worked in a factory and went to the front as part of an antifascist militia. She had an extraordinarily acute (some would say morbid) awareness of the depths and the extent of human suffering; it made comfort seem like complicity. Combine that sensitivity with her conviction that our world is the diminished or faulty image of a realm in which truth and justice are real and absolute – a Platonic notion, flecked perhaps with Gnostic elements -- and you have someone with a vocation, rather than a career.

Weil died at the age of 34, under what seem to have been suspiciously beatific circumstances: she succumbed to tuberculosis after months of refusing to eat more than the rations available to her French compatriots under German occupation. Her collected writings, which run to several stout volumes, range from pacifist essays and interventions in trade-union debates to reflections on atheism and mysticism (not entirely antithetical terms, in her experience) and studies of classical Greek literature and philosophy. Most of this work remained unpublished during her lifetime, apart from scattered essays in journals of no wide circulation.

At a couple of points in Julia Haslett’s film  “An Encounter with Simone Weil,” the camera focuses on a few lines of a manuscript, the words in French and Greek. Her handwriting appears small, precise, and highly concentrated – like Weil herself, by all accounts. “She was apparently unacquainted with doubt,” wrote Raymond Aron, a philosopher and political journalist who knew her in the 1920s, “and, although her opinions might change, they were always thoroughly categorical.” His choice of words may allude to one of Weil’s nicknames from their student days: “The Categorical Imperative in Skirts.”

“An Encounter with Simone Weil,” billed on its Facebook promotional page as “a documentary by Julia Haslett,” made the rounds of film festivals in 2011, and in the meantime the director (a visiting associate professor of cinema and comparative literature at the University of Iowa) has screened it at two dozen colleges and universities throughout the United States. It will be available on DVD and various digital platforms sometime in the next few months. Upon request, the director sent me a screener, indicating that she had just finished work on the French language version. The film is already available in Italian, with the German and Japanese translations due out this fall, and a Korean version in the works.

“Encounter” is at least as much a personal essay as a biographical portrait. Haslett’s fascination began when she came across a quotation from one of Weil’s letters: “Attention is the rarest and purest form of generosity.” It was an ideal point of entry, given how often Weil’s aphorisms sound like one of Pascal’s Pensées, and also how much moral and intellectual significance the word “attention” turns out to have in her work.

But it also carried a strong personal connotation. In the voiceover Haslett tells us of her father’s suicide when she was young, and in a video clip her brother discusses his struggle with anxiety and depression. “My father's death taught me that if I don't pay attention, someone might die," she says -- an enormous burden to have to carry.

Although Weil cannot be said to lighten the load, she at least understood the stakes. “The capacity to give one's attention to a sufferer is a very rare and difficult thing,” she says in another passage that Haslett quotes. “It is almost a miracle; it is a miracle.”

In short order Haslett read Francine Du Plessix Gray’s biography of Weil (published by Penguin in 2001) in a single sitting. “Here was this brilliant, deeply ethical young woman speaking truth to power, putting her body on the line for her convictions, and providing such an incisive critique of political and economic power,” she told me in an e-mail exchange. “And yet I'd never read her. That despite studying philosophy, religion, and history at Swarthmore College -- an institution that embodies the Weil-like values of rigorous intellectual inquiry and a deeply held commitment to social justice.”

In recounting Weil’s period as a left-wing militant in the early years of the Great Depression, “Encounter” shows Haslett going over film footage of mass demonstrations in the archives of the French Communist Party – searching, almost desperately, to catch a glimpse of Weil in the crowd, but with no luck. She visits the apartment building in New York where Weil lived for a few months in 1942, and interviews the (very) few remaining people who knew Weil, as well as one of the editors preparing her collected works.

The effort to establish a connection with the philosopher even extends to having Soraya Broukhim (an actress with some resemblance to Weil) read enough of Weil's work to improvise responses in a mock interview. The sequence is odd. By the end, the situation has become unmistakably awkward for both parties. When Broukhim complains that Haslett seems to want answers she can’t give, the effect is strangely revealing. For a moment, it’s not quite clear whether she’s doing so in character, as Weil, or in real life.

To call Haslett’s quest a kind of pilgrimage would be tempting, if not for the most striking thing about the film: its emphasis on Weil as a secular figure and an activist. Most people who become interested in Weil do so through the theological side of her work. Even her appeal for nonbelievers, such as Albert Camus, comes in large measure from an awareness of her “tortured prowling outside the doors of the Catholic Church, like a starving wild animal,” to borrow the poet Kenneth Rexroth’s apt characterization.

Weil’s spiritual writings “are certainly the reason she gets studied in this country,” Haslett acknowledged by e-mail. “For example, many of the annual meetings of the American Weil Society are held at theological seminaries and most participants are religious scholars or at least people of faith.” But for the director, “it was the way she combined such an incisive critique of power and her willingness to sacrifice everything to tell the truth as she knew it (by directly experiencing that about which she wrote) that drew me in so completely.… She became a guide for me through the very dark first decade of this century, when our politicians were dispensing with the truth and our media wasn't holding them accountable.”

The film gives due attention to Weil's religious passion, but suggests that her mystical turn came after deep disillusionment with radical politics. Haslett seems largely uninterested in the very difficult matter of Weil’s relationship to Judaism. (Her parents were so completely assimilated into French secular culture that they neglected to mention this element of her identity until she was about 10 years old). And the director treads lightly around the topic of Weil’s mental health, although she does briefly consider the possibility that the final period of self-denial may have been a kind of suicide by self-starvation.

Haslett makes her resistance to certain aspects of her subject’s life and thought explicit, saying in one voiceover that she felt “betrayed by Weil’s turn toward God.” This does not detract from the value of the film in the least.

On the contrary, ambivalence and discomfort are essential to any meaningful encounter with Weil. To borrow a remark by T.S. Eliot, whose grounds for admiration were as different from Haslett’s as they could be: “I cannot conceive of anybody agreeing with all of her views, or of not disagreeing violently with some of them. But agreement and rejection are secondary: what matters is to make contact with a great soul.”

For more information about the film, or to arrange a screening, see the Line Street Productions website.


APA forms committee to tackle sexual harassment in philosophy departments


Disturbed by reports of gender discrimination in philosophy, the American Philosophical Association announces a new study, including site visits to some departments.

New book argues for favoritism and against 'fairness'


Philosopher discusses new book on why favoritism is better than fairness.

Essay on the idea that non-philosophers should judge philosophers

One of the oldest questions of philosophy is, "Who guards the guardians?" When Plato posed this question -- if not quite this succinctly -- his concern was with how a community can keep its leaders focused on the good of the whole. Plato's answer was that guardians should govern themselves — philosophy would train their souls so that they would choose wisely rather than unjustly. Kings would become philosophers, and philosophers kings.

This is not how we do things today. In representative forms of government the people rule, at least intermittently, through processes such as voting, recalls, and referenda. Particularly within the American experiment everybody guards everyone else — through a system of "checks and balances." But there is at least one major institution that still follows Plato's lead, relying on self-governance and remaining proudly nondemocratic: the academy.

We academics have long argued that we have a special justification for self-rule. We claim that our activities — which consist of the production of knowledge, and its dissemination via presentations, publications, and teaching — are so specialized and so important that ordinary people cannot properly judge our work. Instead, we have devised a way to evaluate ourselves, through a process known as peer review.

Whether it is a matter of articles or books, grant applications, or tenure and promotion, review by one's academic peers has long been the standard. And who are one's peers? The academy gives a disciplinary answer to this question. Biologists are the ones competent to judge work in biology, and only chemists can judge the research of other chemists. Nonexperts — whether within or outside the academy — will only disrupt the process, leading to misguided or even disastrous results. Best to leave such questions to the experts.

But what of philosophy? Across the 20th century and now into the 21st, philosophers have been evaluated in the same way. Even while claiming that philosophy has a special relevance to everyday life, philosophers have mostly written for and been evaluated by their disciplinary peers. Philosophy became more and more professionalized in the 20th century, with nonexperts increasingly unable to comprehend, much less judge, the work of philosophers. A philosopher today is not considered successful unless he or she contributes to professional, peer-reviewed publications in the field.

But should philosophy really act like the other disciplines in this regard? Should philosophy be considered a "discipline" at all? And if not, what are the consequences for the governance of philosophy?

One of the oddities of present-day philosophy is how rarely this question is asked. Go to a philosophy department with a graduate program, and sign up for a course in ancient philosophy: the professor will be expected to know ancient Greek, and to be well-read in the scholarly literature in the area. The irony is that there was no secondary literature for the Greeks — no scholarship at all, in fact, in the sense that we mean it today. Philosophers were thinkers, not scholars. Socrates would never get tenure: what did he write?

This situation was partly a matter of technology; paper was expensive and reproduction of a manuscript laborious. But it is still odd to assume that Plato and Aristotle would have been good scholars if only they’d had access to the Philosopher's Index and an Internet connection. Nor were the Greeks good disciplinarians. Socrates was notorious for speaking with people from all walks of life; and when he came to be evaluated it was by a jury of his peers consisting of 500 Athenians. He may not have liked the verdict, but he did not dispute the jury's right to pass judgment.

Across the long sweep of Western history we find the point repeated: Bacon, Machiavelli, Descartes, Leibniz, Locke, Marx and Nietzsche all wrote for and sought the judgment of peers across society. One wonders what they would think of what counts as philosophy across the 20th century — a highly technical, inward-looking field that values intellectual rigor over other values such as relevance or timeliness.

Questions about who should count as a philosopher's peers are timely today, for our standard notions of academic peer review are now under assault. Publicly funded science is being held more socially accountable. At the National Science Foundation, grant proposals are now judged by both disciplinary and transdisciplinary criteria — what are called, respectively, "intellectual merit" and "broader impacts." Universities are also being held responsible for outcomes, with state funding increasingly being tied to graduation rates and other achievement measures. Philosophers, too, have begun to feel the pinch of accountability, especially in Britain, where the so-called "impact agenda" has advanced more rapidly than in the United States.

We view this situation as more of an opportunity than a problem. Philosophers should get creative and treat the question of who counts as our peers as itself a philosophic question. There are a variety of ethical, epistemological, and political issues surrounding peer review worthy of philosophic reflection. But perhaps the most pressing is the question of whether we should extend the notion of peer beyond disciplinary bounds.

This could occur in a number of different ways. Not only could we draw nonphilosophers or nonacademics into the peer review process. We could also consider a variety of other criteria, such as the number of publications in popular magazines or newspaper articles; number of hits on philosophic blogs; number of quotes in the media; or the number of grants awarded by public agencies to conduct dedisciplined philosophic work.

Now, some will claim that extending the idea of our philosophical peers to include nonphilosophers will expose philosophy to the corruptions of the demos. Is philosophizing to become a sheer popularity contest, where philosophers are promoted based on their Klout score, or the number of Facebook likes their blog posts garner? Aren’t we proposing that the Quineans be replaced by the Bieberians?

Such objections stem, in part at least, from what we could call a Cartesian ethos — the idea that philosophers should strive above all to avoid error. We should withhold our assent to any claim that we do not clearly and distinctly perceive to be true. This Cartesian ethos dominates philosophy today, and nowhere is this clearer than in regard to peer review. Our peers are our fellow philosophers, experts whose rigor stands in for Descartes' clear and distinct ideas.

For a counterethos we could call upon William James's "The Will to Believe." James argues that the pursuit of truth, even under conditions where we cannot be certain of our conclusions, is more important than the strict avoidance of error. Those who object that this will open philosophy up to all sorts of errors that would otherwise have been caught by expert peer review are exhibiting excessive Cartesianism. In fact, those who insist on the value of expertise in philosophy are reversing the Socratic approach. Whereas Socrates always asked others to contribute their opinions in pursuit of truth, Descartes trusted no one not to lead him into error. A Jamesian approach to peer review, on the other hand, would be generous in its definition of who ought to count as a peer, since avoiding error at all costs is not the main goal of philosophy. On a Jamesian approach, we would make use of peers in much the way that Socrates did — in an effort to pursue wisdom.

It is true that when philosophers broaden their peer group, they lose some control over the measures used to define philosophic excellence. This raises another risk — that philosophy will be merely an instrument for an exterior set of ends. The fear here is not that abandoning disciplinary peer review will lead us into error. Instead, it is that the only alternative to value as judged by disciplinary peers is a crass utilitarianism, where philosophic value is judged by how well it advances a paymaster’s outcome. One philosopher may be labeled a success for helping a racist political candidate hone his message, while another may be labeled a failure for not sufficiently fattening a corporation's bottom line. Isn’t a dedisciplined philosophy actually a return to sophistry rather than to Socrates? Won’t it sell its services to whoever is buying, adjusting its message to satisfy another’s agenda and criteria for success? In order to survive until the turn of the 22nd century, must we sell the soul of philosophy at the beginning of the 21st?

We have two replies to such concerns. First, philosophy existed long before the 20th-century model of academic disciplinarity came to define its soul. The struggle between philosophy and sophistry is a perennial one, and one does not necessarily sell out by writing for a larger audience — or remain pure by staying within disciplinary boundaries.

Second, disciplinary and dedisciplinary approaches to philosophy should be seen as complementary rather than antagonistic to one another. Rigor should be seen as pluralistic: the rigor of disciplinary work is different from, but neither better nor worse than, the philosophic rigor required to adjust one’s thinking to real-world exigencies. This is a point that bioethicists have long understood. In his seminal 1973 "Bioethics as a Discipline," Daniel Callahan already saw that doing philosophical thinking with physicians, scientists, and other stakeholders demands "rigor … of a different sort than that normally required for the traditional philosophical or scientific disciplines." Bioethics exists in disciplinary and in nondisciplinary forms — in ways that synergize. It shows that we need not be forced, as a matter of general principle, to choose one set of peers over another.

Practically speaking, examining the question of who should count as a peer means that philosophers will need to revisit some of the core elements of our field. For one, our criteria for tenure and promotion would need to be reviewed. The current strict hierarchy surrounding where we publish — say, in "The Stone" (the New York Times philosophy blog) or in Mind — would need to be re-evaluated. And really, what is the argument for claiming that the latter sort of publication is of higher value? If you reply that the latter is peer-reviewed, excuse us for pointing out that your answer begs the question.

And what about the question of multiple authorship? Should this article count for less because three of us wrote it? How much less? Why? Co-authoring is actually just as challenging as producing single-authored works, as we can attest, so the justification cannot be that it is less work. Should we value independent scholarship over collaboration? Why? This is the Cartesian ethos coming back to haunt philosophy: I think; I exist; I write; I am a scholar. We doubt it.

As universities face growing demands for academic accountability, philosophers ought to take the lead in exploring what accountability means. Otherwise we may be stuck with Dickens’s Mr. Gradgrind. ("Now, what I want is Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life.") But a philosophical account of accountability will also require redefining the boundaries of what counts as philosophy. We ought to engage those making accountability demands from the outside in just the way that Socrates engaged Euthyphro on piety. If there are indeed Bieberians at the gate, we say let them in — as long as they are willing to engage in dialogue, we philosophers should do all right. Unless it is we philosophers who refuse to engage.


Robert Frodeman, J. Britt Holbrook and Adam Briggle

Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas. He was editor in chief of the Oxford Handbook of Interdisciplinarity.

J. Britt Holbrook is research assistant professor of philosophy and assistant director of the Center for the Study of Interdisciplinarity at the University of North Texas. He is editor in chief of Ethics, Science, Technology, and Engineering: An International Resource, forthcoming from Gale-Cengage Learning.

Adam Briggle is assistant professor of philosophy at the University of North Texas. He is author, with Carl Mitcham, of Ethics and Science: An Introduction (Cambridge University Press, 2012).


Essay on student protests in London and one planned for Howard University

Intellectual Affairs

“Our university is not a supermarket!” read one of the fliers I saw posted up around the University College London campus while there to attend a conference this past week. It seems that early November is now the official occasion for militant discontent over austerity and higher education, at least in England. Arriving for the same annual conference a year ago, I’d made my way through streets crowded with students demonstrating against budget cuts and privatization, amidst police who were prepared (so a newspaper said the following morning) to use plastic bullets if the crowd got rowdy, as it had during the huge protests against a proposal to lift the cap on tuitions in November 2010.

Fifty thousand people had turned out for that event -- more than twice as many as even the organizers expected – and a few hundred of them decided to occupy the campaign headquarters of the Conservative Party, which they left considerably worse for wear. Elsewhere, another crowd menaced the Prince of Wales and Duchess of Cornwall in their Rolls Royce, which was paint-bombed and its rear window smashed.

That was 2010. Nothing so A Tale of Two Cities-ish took place during the November 2011 march through central London. As for next week -- who knows? The National Union of Students has called for a march through central London on November 21, scheduled to coincide with the weekly questioning of the prime minister by members of the House of Commons. Complaining that the government has been “slashing undergraduate teaching funding, increasing tuition fees, introducing draconian restrictions on international students, cutting funding for post-graduate students, [and] hiking fees for adult learners looking to gain basic skills,” the NUS also points to another worsening situation: nearly a million people in England between the ages of 16 and 24 are currently unemployed. (The International Labour Organization, a United Nations agency, projects rising joblessness among youth to continue as a global trend over the next five years.) The police will probably have their plastic bullets ready next week, come what may.

As slogans go, “Our university is not a supermarket!” impressed me as one that wouldn’t work as a rallying cry in the United States. While Charles Eliot had many sober and lofty reasons for introducing the electives system at Harvard University in the late 19th century, its near-universal adoption throughout undergraduate education in the U.S. surely has more to do with the principle that it’s a good idea to give the customers what they want. (That was a running complaint in the late Jacques Barzun’s reflections on American education, discussed here last month.) It seems that we like our supermarket universities just fine here.

But that's just too cynical, and these are times when we should be ashamed of cynicism rather than proud of it. While writing this, I've gotten word from a philosophy major at Howard University that he and other students will be occupying Alain Locke Hall on Thursday, November 15, to protest "tuition rates, administrative mistreatment of janitorial staff, and program cuts." These are not the demands of disgruntled consumers, and the protesters are very deliberate about their timing: Thursday is World Philosophy Day.

If their occupation goes on long enough, the students should read a recent volume called What Are Universities For? by Stefan Collini, a professor of intellectual history and English literature at Cambridge University. His Absent Minds: Intellectuals in Britain (Oxford University Press, 2006) is as trenchant and far-flung a work of cultural history as anything I’ve ever read, and some of its qualities also come through in the occasional pieces he has been writing about higher education since the 1980s, many of them gathered in the new book. Published this spring by Penguin, it is available as a paperback in the U.K. and Canada but not in the U.S., though you can order it to read on Kindle.

Much of it is quite specific to British debates over the reform and restructuring of the country’s university system -- and a few of the older pieces (including one called “Bibliometry,” from the late 1980s, on the use of citation statistics as  “performance indicators” for scholars’ work) are now period pieces. But his response to the rise of corporate thinking and management-speak in academe is acerbic in ways that have aged well. “I work in the knowledge and human-resources industry,” one piece begins. “My company specializes in two types of product: we manufacture high-quality, multi-skilled units of human capacity; and we produce commercially relevant, cutting-edge new knowledge in user-friendly packages of printed material….Let me put that another way. I’m a university teacher. I teach students and I write books.”

What is there about education and scholarship that gets lost in this sort of "mission statement"-ese? Collini's book is a sustained engagement with that question, but one passage stands out as a memorable formulation of what distinguishes the university from any other institution:

“A university, it may be said, is a protected space in which various forms of useful preparation for life are undertaken in a setting and manner which encourages the students to understand the contingency of any particular packet of knowledge and its interrelations with other, different forms of knowledge. To do this, the teachers themselves need to be engaged in constantly going beyond the confines of the packets of knowledge that they teach, and there is no way to prescribe in advance what will and will not be fruitful ways to do that. Undergraduate education involves exposing students for a while to the experience of enquiry into something in particular, but enquiry which has no external goal other than improving the understanding of that subject matter. One rough and ready distinction between university education and professional training is that education relativizes and constantly calls into question the information which training simply transmits.... [Learning of that kind] can only be done through engagement with some particular subject matter, not simply by ingesting a set of abstract propositions about the contingency of knowledge, and the more there already exists an elaborated and sophisticated tradition of enquiry in a particular area, the more demanding and rigorous will be the process of acquiring and revising understanding."

Not written with a student demonstration on World Philosophy Day in mind, of course, but it seems fitting.


Review of Jürgen Habermas, "The Crisis of the European Union"

Intellectual Affairs

Most volumes by Jürgen Habermas appearing in English over the past decade have consisted of papers and lectures building on the theory of communicative action and social change in his earlier work, or tightening the bolts on the system. Some are technical works only a Habermasian could love. But a few of the books have juxtaposed philosophical writings with political journalism and the occasional interview with him in his role as public intellectual of global stature.

The latest such roundup, The Crisis of the European Union: A Response -- published in Germany late last year and in translation from Polity this summer – is probably the most exasperated of them as well. Very few contemporary thinkers have laid out such a comprehensive argument for the potential of liberal-democratic societies to reform and revitalize themselves in a way that would benefit their citizens while also realizing the conditions of possibility for human flourishing everywhere else.

The operative term here being, of course, “potential.” When you consider that his recent collections The Divided West (2006) and Europe: The Faltering Project (2009), also both from Polity, are now joined by one with “crisis” in the title, it’s clear that unbridled optimism is not a distorting element in Habermas’s world view. But the sobriety has turned into something closer to frustration in his latest interventions.

The earliest text in the new book first appeared in November 2008 – a time when the initial impact of the financial crisis made many people assume that the retooling of major institutions was so urgent as to be imminent. Habermas was more circumspect about it than, say, folks in the United States who imagined Obama as FDR redivivus. But although he has long been the most moderate sort of mildly left-of-center reformist, the philosopher did permit himself to hope.

Might not the U.S. “as it has done so often in the past,” he said, “pull itself together and, before it is too late, try to bind the competing major powers of today – the global powers of tomorrow – into an international order which no longer needs a superpower?” If so, “the United States would need the friendly support of a loyal yet self-confident ally in order to undertake such a radical change in direction.”

That would require the European Union to learn “to speak with one voice in foreign policy and, indeed, to use its internationally accumulated capital of trust to act in a farsighted manner itself.” A common EU foreign policy would only be possible if it had a more coherent economic policy. “And neither could be conducted any longer through backroom deals,” he wrote, “behind the backs of the populations.”

Habermas suffered no illusions about how likely such changes might be. But he treated late ’08 as a moment when “a somewhat broader perspective may be more needful than that offered by mainstream advice and the petty maneuvering of politics as usual.” (Fatalism, too, is an illusion, and one that paralyzes.)

The appendix to Crisis reprints some newspaper commentaries that Habermas published in 2010 and ’11, as the crisis of the Euro exposed the shakiness of “an economic zone of continental proportions with a huge population but without institutions being established at the European level capable of effectively coordinating the economic policies of the member states.” This gets him riled up. He is particularly sharp on the role of the German Federal Constitutional Court’s “solipsistic and normatively depleted mindset.”

He also complains about “the cheerful moderators of the innumerable talk shows, with their never-changing line-ups of guests,” which kill the viewer’s “hope that reasons could still count in political questions.”

A seemingly more placid tone prevails in his two scholarly texts on the juridification (i.e., codifying and legal enforcement) of democratic and humanitarian values. But there is a much tighter connection between Habermas’s fulminations and his conceptual architecture than it first appears.

Another recent volume, Shivdeep Singh Grewal’s Habermas and European Integration: Social and Cultural Modernity Beyond the Nation-State (Manchester University Press), starts with a review of Habermas’s changing attitudes towards European unification over the past 30 years. Then Grewal -- an independent scholar who has taught at Brunel University and University College London -- reconstructs pertinent aspects of Habermas’s scholarly work over roughly the same period, surveying it in the context of the philosopher’s developing political concerns.

Using the political journalism as a way to frame his thinking about modernity is an unusual approach, but illuminating, and it avoids the familiar tendency in overviews of Habermas’s work to treat his books as if they spawned one another in turn.

To summarize things to a fault: From the U.S. and French revolutions onward, the nation-state was best able to secure its legitimacy through constitutional democracy. However limited in scope or restricted in mandate it was at the start, constitutional democracy opened up the possibility for public challenges to authority grounded on nothing more than tradition or inertia, which could in turn make for greater political inclusiveness. It could even try to protect its more vulnerable citizens and mitigate some kinds of inequality and economic dislocation.

Thus public life would expand and grow more various and complex, since more people would have access to more possibilities for decision-making. And that, in turn, demands a political structure both firm and flexible. Which brings us back to constitutional-democratic governance. A virtuous circle!

Actual constitutional democracies were another matter, but at least it was a normative model, something to shoot for. But the problems faced by nation-states cut across borders; and the more complex they become, the less power over them the separate states have. The point of creating a united Europe, from Habermas’s perspective, was, Grewal writes, “the urgent task of preserving the democratic and welfarist achievements of the nation state ‘beyond its own limits.’ ”

Habermas makes the point somewhere that institutions making decisions about transnational issues are going to exist in any case. Whether they will be accountable is another matter. Establishing a constitutional form of governance that goes beyond the nation-state would involve no end of difficulty in principle, let alone in practice, but it is essential.

It’s also not happening. Not right now, anyway. But as exasperated as Habermas sounds in Crisis, he has not given up. In an email discussion, Grewal pointed me to a recent statement called “Only deeper European unification can save the eurozone” that the philosopher co-authored.

“Habermas acknowledges the 'laborious' and incremental learning process of the German government,” Grewal told me, “whilst bemoaning the lack of sufficiently bold and courageous politicians to take the European project forward.…The alternative to the transnationalization of democracy is, Habermas continues to suggest, a sort of post-democratic 'executive federalism', with shades of the opinion poll-watching, media-manipulating approach of figures such as Berlusconi and Putin.”

He acknowledges that there are people who don’t see this as an either-or option. It’s possible to have both continent-spanning constitutional democracy and a political system in which media manipulation and pandering ensure that decision-making continues behind closed doors. Is it ever....

But even aside from that, why does Habermas count on bold and courageous politicians for the kind of change he wants? Part of his frustration, no doubt, is that he’s counting on the actions of people who don’t exist, or get sidelined quickly if they do. Democracy doesn’t come from on high. I respect the man's intentions and persistence, but wish he would come up with a better strategy.




Review of David R. Koepsell and Robert Arp's "Breaking Bad and Philosophy: Badder Living Through Chemistry"

Intellectual Affairs

In a memorable scene from the first season of "Breaking Bad" (AMC), the protagonist sits down to do some moral bookkeeping of a fairly literal variety. He is a 50-year-old high-school chemistry teacher named Walter White. A recent trip to the doctor to check on a nagging cough has left him with a diagnosis of advanced lung cancer, giving him, at most, a couple of years to live. If you’ve seen the show (and maybe even if you haven’t, since it has received extremely good press and won more awards than I feel like counting) you know that Walter has decided on a hazardous way to provide for his family after his death. He applies his lab skills to the production of crystal methamphetamine.

The stuff he “cooks” (as the term of art goes) is exceptionally pure and powerful. The connoisseurs love it. If he can turn a profit of $737,000 in the time he has left, Walt will leave a nest egg for his wife and children and die in peace. As a middle-class family man, Walt lacks any direct knowledge of the marketing side of the meth business, and would prefer to keep it that way. His connection to the underworld is a former student named Jesse Pinkman, memorable chiefly for his bad grades. But Jesse is a gangsta wannabe, as well as a meth head, and nowhere near as street-savvy as he thinks or the job requires.

And so it comes to pass that Walter finds himself facing an unforeseen problem involving a well-connected figure from the meth supply chain – a fellow who goes by the street name of Krazy-8. It's a long story how he got there, but Krazy-8 ends up shackled by the neck to a pole in Jesse’s basement, and he is understandably, even homicidally, unhappy. Walt must now decide between two options: let Krazy-8 live or kill him.

Being the rational sort, Walt tabulates the arguments on each side. The column headed “Let him live” fills up quickly, if redundantly: “It’s the moral thing to do. Judeo-Christian principles. You are not a murderer. He may listen to reason. Post-traumatic stress. Won’t be able to live with yourself. Murder is wrong!”

Under “Kill him,” the camera reveals just one entry: “He’ll kill your entire family if you let him go.” So much for weighing the alternatives.

In his method -- and ultimately in his actions -- Walt proves to be a consequentialist, as J.C. Donhauser points out in “If Walt’s Breaking Bad, Maybe We Are Too,” one of the essays in Breaking Bad and Philosophy: Badder Living Through Chemistry (Open Court). Most viewers will have surmised as much, even if they don’t have a name for it. But there is more than one metric for judging costs and benefits, and so more than one species of consequentialist. Donhauser -- an assistant instructor of philosophy at the State University of New York at Buffalo and a lecturer at Buffalo State University -- uses examples from other episodes to consider the options. There’s act consequentialism, for one (the realized effects of an act determine whether it is good or bad, even if the consequences are unintended or unforeseeable), which is distinct from rule consequentialism (“actions are better or worse, not in relation to their actual consequences, but in proportion to how far afield they fall from a rule that would be best for most people if everyone followed it”).

As for Walt, he belongs in the ranks of the agent-centered consequentialists, who “judge actions based on their consequences” but “also argue that the most important consequences are for the person carrying out the actions that produce those consequences.”

Each stance has its limitations – quite as much as deontology does. Deontology insists that consequences are irrelevant, since an act can be judged moral if and only if it could be universalized. Murder is immoral, then, because “if everyone did it, there’d be no one around for you to murder then! The same goes for stealing, as there’d be nothing left to steal.” So Jeffrey E. Stephenson put it, with tongue in cheek, in “Walter White’s American Vice.” Ditto for lying, since a society in which everyone lied constantly would be even more irrational than the one we live in.

Walt's list of arguments for letting Krazy-8 live is not deontological by any means -- although “He may listen to reason” rests on a similar conviction that clarity and rationality are not just worthy aspirations but realizable possibilities as well. Despite his nickname and his criminal vocation, Krazy-8 is a well-spoken and seemingly pragmatic individual, with strong family ties of a sort that Walt can respect. And Walt very nearly reaches a decision on that basis.

On the other hand, not every consequence can be put in brackets while you seek the universally right thing to do. And “He’ll kill your entire family if you let him go” is a pretty good example of that. Under the circumstances, even a deontologist would probably find a way to think of murder as obligatory.

Breaking Bad and Philosophy, edited by David R. Koepsell and Robert Arp, is much like any other collection of essays in the Open Court series Popular Culture and Philosophy, of which it is volume 67. By the way, the publisher has registered “Popular Culture and Philosophy” as a trademark. Don't confuse it with The Blackwell Philosophy and Pop Culture Series (37 volumes at last report) or the University of Kentucky’s line called The Philosophy of Popular Culture (23 titles, not counting updated editions).

By now, it seems as if every genre, blockbuster, videogame, superhero, hit program, or teen trend has been covered by at least one book in this niche, or will be in the foreseeable future. I picture them being produced in something akin to Walt’s methamphetamine superlab -- with the important exception that Walt’s product is famously consistent in quality. The popcult philosophy collections that I’ve sampled over the years tend to be pretty uneven, even within the same volume. The one constant is that most of the essays are clearly didactic. The implied reader for these books almost always seems to be an undergraduate, with popular culture as the candy coating on the philosophical vitamins otherwise missing from the educational diet. There is jocularity aplenty. In this volume, for example, a comparison of Breaking Bad and Augustine’s Confessions includes the information that the saint-to-be “had a rep for hooking up with the MILFs of Carthage” -- not unlike Peter Abelard, “a famous playa before his lover’s father and brother… cut off his junk and sent him packin’.”

Well, you do what you must to keep the students' attention. With any luck, these books will be the philosophical equivalent of a gateway drug, leading some readers to try the harder stuff.

But there must be more ways to go about it than by reducing every pop-culture phenomenon to a pretext for introducing well-established topics and thinkers. Another constituency for these books is the fan base for whatever cultural commodity gets yoked to philosophy in their titles. It was as a devotee of the show (one who has seen every episode of the first four seasons at least twice) that I bought Breaking Bad and Philosophy in the first place. And the striking thing about the program is that it's all about how decisions, consequences, and responsibility (or the lack of it) get mixed up in ways that no schema can account for very well. That is undoubtedly part of its appeal.

I’ll end by recommending one essay from the book that will reward the attention of anyone who follows the show closely. Titled “Macbeth on Ice,” it is by Ray Bossert, a visiting assistant professor of English at Franklin and Marshall College. He compares "Breaking Bad" and the Scottish play by reference to Aristotle's Poetics, to surprisingly appropriate effect.

In Aristotle’s analysis, the hero in classical tragedy is responsible for his actions and ultimately their victim. His character is admirable and doomed because of some flaw -- excessive pride, for example. That's the one Macbeth and Walter White share. The hero's motives and decisions are transformed as this flaw grows more prominent. It leads him to "incidents arousing pity and fear" in the audience, says Aristotle. "Such incidents have the very greatest effect on the mind when they occur unexpectedly and at the same time in consequence of one another; they arouse more awe than if they happened accidentally and by chance."

In Walt’s case, as his involvement in the meth business deepens, we see that his insistence that everything he does is out of love for his family is a kind of self-deception. More and more evidence of his rage and resentment accumulates. He feels trapped by his family, and his pride has been wounded too many times in his 50 years. As events unfold, Walt feels increasingly confident and powerful, and his running cost-benefit analysis leaves ever more collateral damage.

We believe in the character, writes Bossert, “because, in our own thoughts, we, too, resent being limited to a single role on life’s stage. We pity Walter White, and fear that we might make similar mistakes because we’re like him.” This seems exactly right. Bossert makes no predictions about how Breaking Bad will end (it is now counting down its last 16 episodes, 8 this summer and 8 in 2013), nor will I. But Walt has enormous potential in the pity and fear department, and the stage is sure to be covered with bodies before the curtain falls -- even more than it already is.

Review of François Noudelmann, "The Philosopher's Touch: Sartre, Nietzsche, and Barthes at the Piano"

Call it philosophical synesthesia: the work of certain thinkers comes with a soundtrack. With Leibniz, it’s something baroque played on a harpsichord -- the monads somehow both crisply distinct and perfectly harmonizing. Despite Nietzsche’s tortured personal relationship with Wagner, the mood music for his work is actually by Richard Strauss. In the case of Jean-Paul Sartre’s writings, or at least some of them, it’s jazz: bebop in particular, and usually Charlie Parker, although it was Dizzy Gillespie who wore what became known as “existentialist” eyeglasses. And medieval scholastic philosophy resonates with Gregorian chant. Having never managed to read Thomas Aquinas without getting a headache, I find that it’s the Monty Python version.

Such linkages are, of course, all in my head -- the product of historical context and chains of association, to say nothing of personal eccentricity. But sometimes the connection between philosophy and music is much closer than that. It exists not just in the mind’s ear but in the thinker’s fingers as well, in ways that François Noudelmann explores with great finesse in The Philosopher’s Touch: Sartre, Nietzsche, and Barthes at the Piano (Columbia University Press).

The disciplinary guard dogs may snarl at Noudelmann for listing Barthes, a literary critic and semiologist, as a philosopher. The Philosopher’s Touch also ignores the principle best summed up by Martin Heidegger (“Horst Wessel Lied”): “Regarding the personality of a philosopher, our only interest is that he was born at a certain time, that he worked, and that he died.” Biography, by this reasoning, is a distraction from serious thought, or, worse, a contaminant.

But then Noudelmann (a professor of philosophy at l’Université Paris VIII who has also taught at Johns Hopkins and New York Universities) has published a number of studies of Sartre, who violated the distinction between philosophy and biography constantly. Following Sartre’s example on that score is a dicey enterprise -- always in danger of reducing ideas to historical circumstances, or of overinterpreting personal trivia.

The Philosopher’s Touch runs that risk three times, taking as its starting point the one habit its protagonists had in common: Each played the piano almost every day of his adult life. Sartre gave it up only as a septuagenarian, when his health and eyesight failed. But even Nietzsche’s descent into madness couldn’t stop him from playing (and, it seems, playing well).

All of them wrote about music, and each published at least one book that was explicitly autobiographical. But they seldom mentioned their own musicianship in public and never made it the focus of a book or an essay. Barthes happily accepted the offer to appear on a radio program where the guest host got to spin his favorite recordings. But the tapes he made at home of his own performances were never for public consumption. He was an unabashed amateur, and recording himself was just a way to get better.

Early on, a conductor rejected one of Nietzsche’s compositions in brutally humiliating terms, asking if he meant it as a joke. But he went on playing and composing anyway, leaving behind about 70 works, including, strange to say, a mass.

As for Sartre, he admitted to daydreams of becoming a jazz pianist. “We might be even more surprised by this secret ambition,” Noudelmann says, “when we realize that Sartre did not play jazz! Perhaps this was due to a certain difficulty of rhythm encountered in jazz, which is so difficult for classical players to grasp. Sight-reading a score does not suffice.” It don’t mean a thing if it ain’t got that swing.

These seemingly minor or incidental details about the thinkers’ private devotion to the keyboard give Noudelmann an entrée to a set of otherwise readily overlooked problems concerning both art -- particularly the high-modernist sort -- and time.

In their critical writings, Sartre and Barthes always seemed especially interested in the more challenging sorts of experimentation (Beckett, serialism, Calder, the nouveau roman, etc.) while Nietzsche was, at first anyway, the philosophical herald of Wagner’s genius as the future of art. But seated at their own keyboards, they made choices seemingly at odds with the sensibility to be found in their published work. Sartre played Chopin. A lot. So did Nietzsche. (Surprising, because Chopin puts into sound what unrequited love feels like, while it seems like Nietzsche and Sartre are made of sterner stuff.) Nietzsche also loved Bizet’s Carmen. His copy of the score “is covered with annotations, testifying to his intense appropriation of the opera to the piano.” Barthes liked Chopin but found him too hard to play, and shifted his loyalties to Schumann -- becoming the sort of devotee who feels he has a uniquely intense connection with an artist. “Although he claims that Schumann’s music is, through some intrinsic quality, made for being played rather than listened to,” writes Noudelmann, “his arguments can be reduced to saying that this music involves the body that plays it.”

Such ardor is at the other extreme from the modernist perspective for which music is the ideal model of “pure art, removed from meaning and feeling,” creating, Noudelmann writes, “a perfect form and a perfect time, which follow only their own laws.... Such supposed purity requires an exclusive relation between the music and a listener who is removed from the conditions of the music’s performance.”

But Barthes’s passion for Schumann (or Sartre’s for Chopin, or Nietzsche’s for Bizet) involves more than relief at escaping severe music for something more Romantic and melodious. The familiarity of certain compositions; the fact that they fall within the limits of the player’s ability, or give it enough of a challenge to be stimulating; the way a passage inspires particular moods or echoes them -- all of this is part of the reality that playing music “is entirely different from listening to it or commenting on it.” That sounds obvious, but it is something even a bad performer sometimes understands better than a good critic.

“Leaving behind the discourse of knowledge and mastery,” Noudelmann writes, “they maintained, without relent and throughout the whole of their existence, a tacit relation to music. Their playing was full of habits they had cultivated since childhood and discoveries they had made in the evolution of their tastes and passions.” More is involved than sound.

The skills required to play music are stored, quite literally, in the body. It’s appropriate that Nietzsche, Sartre, and Barthes all wrote, at some length, about both the body and memory. Noudelmann could have belabored that point at terrific length and high volume, like a La Monte Young performance in which musicians play two or three notes continuously for several days. Instead, he improvises with skill in essays that pique the reader's interest, rather than bludgeoning it. And on that note, I must now go do terrible things to a Gibson electric guitar.

