Philosophy

Review of Jürgen Habermas, "The Crisis of the European Union"

Intellectual Affairs

Most volumes by Jürgen Habermas appearing in English over the past decade have consisted of papers and lectures building on the theory of communicative action and social change in his earlier work, or tightening the bolts on the system. Some are technical works only a Habermasian could love. But a few of the books have juxtaposed philosophical writings with political journalism and the occasional interview with him in his role as public intellectual of global stature.

The latest such roundup, The Crisis of the European Union: A Response -- published in Germany late last year and in translation from Polity this summer – is probably the most exasperated of them as well. Very few contemporary thinkers have laid out such a comprehensive argument for the potential of liberal-democratic societies to reform and revitalize themselves in a way that would benefit their citizens while also realizing the conditions of possibility for human flourishing everywhere else.

The operative term here being, of course, “potential.” When you consider that his recent collections The Divided West (2006) and Europe: The Faltering Project (2009), also both from Polity, are now joined by one with “crisis” in the title, it’s clear that unbridled optimism is not a distorting element in Habermas’s world view. But the sobriety has turned into something closer to frustration in his latest interventions.

The earliest text in the new book first appeared in November 2008 – a time when the initial impact of the financial crisis made many people assume that the retooling of major institutions was so urgent as to be imminent. Habermas was more circumspect about it than, say, folks in the United States who imagined Obama as FDR redivivus. But although he has long been the most moderate sort of mildly left-of-center reformist, the philosopher did permit himself to hope.

Might not the U.S. “as it has done so often in the past,” he said, “pull itself together and, before it is too late, try to bind the competing major powers of today – the global powers of tomorrow – into an international order which no longer needs a superpower?” If so, “the United States would need the friendly support of a loyal yet self-confident ally in order to undertake such a radical change in direction.”

That would require the European Union to learn “to speak with one voice in foreign policy and, indeed, to use its internationally accumulated capital of trust to act in a farsighted manner itself.” A common EU foreign policy would only be possible if it had a more coherent economic policy. “And neither could be conducted any longer through backroom deals,” he wrote, “behind the backs of the populations.”

Habermas suffered no illusions about how likely such changes might be. But he treated late ’08 as a moment when “a somewhat broader perspective may be more needful than that offered by mainstream advice and the petty maneuvering of politics as usual.” (Fatalism, too, is an illusion, and one that paralyzes.)

The appendix to Crisis reprints some newspaper commentaries that Habermas published in 2010 and ’11, as the crisis of the Euro exposed the shakiness of “an economic zone of continental proportions with a huge population but without institutions being established at the European level capable of effectively coordinating the economic policies of the member states.” This gets him riled up. He is particularly sharp on the role of the German Federal Constitutional Court’s “solipsistic and normatively depleted mindset.”

He also complains about “the cheerful moderators of the innumerable talk shows, with their never-changing line-ups of guests,” which kill the viewer’s “hope that reasons could still count in political questions.”

A seemingly more placid tone prevails in his two scholarly texts on the juridification (i.e., codifying and legal enforcement) of democratic and humanitarian values. But the connection between Habermas’s fulminations and his conceptual architecture is much tighter than it first appears.

Another recent volume, Shivdeep Singh Grewal’s Habermas and European Integration: Social and Cultural Modernity Beyond the Nation-State (Manchester University Press), starts with a review of Habermas’s changing attitudes towards European unification over the past 30 years. Then Grewal -- an independent scholar who has taught at Brunel University and University College London -- reconstructs pertinent aspects of Habermas’s scholarly work over roughly the same period, surveying it in the context of the philosopher’s developing political concerns.

Using the political journalism as a way to frame his thinking about modernity is an unusual approach, but illuminating, and it avoids the familiar tendency in overviews of Habermas’s work to treat his books as if they spawned one another in turn.

To summarize things to a fault: From the U.S. and French revolutions onward, the nation-state was best able to secure its legitimacy through constitutional democracy. However limited in scope or restricted in mandate it was at the start, constitutional democracy opened up the possibility for public challenges to authority grounded on nothing more than tradition or inertia, which could in turn make for greater political inclusiveness. It could even try to protect its more vulnerable citizens and mitigate some kinds of inequality and economic dislocation.

Thus public life would expand and grow more various and complex, since more people would have access to more possibilities for decision-making. And that, in turn, demands a political structure both firm and flexible. Which brings us back to constitutional-democratic governance. A virtuous circle!

Actual constitutional democracies were another matter, but at least the ideal provided a normative model, something to shoot for. But the problems faced by nation-states cut across borders; and the more complex they become, the less power over them the separate states have. The point of creating a united Europe, from Habermas’s perspective, was, Grewal writes, “the urgent task of preserving the democratic and welfarist achievements of the nation state ‘beyond its own limits.’ ”

Habermas makes the point somewhere that institutions making decisions about transnational issues are going to exist in any case. Whether they will be accountable is another matter. Establishing a constitutional form of governance that goes beyond the nation-state would involve no end of difficulty in principle, let alone in practice, but it is essential.

It’s also not happening. Not right now, anyway. But as exasperated as Habermas sounds in Crisis, he has not given up. In an email discussion, Grewal pointed me to a recent statement called “Only deeper European unification can save the eurozone” that the philosopher co-authored.

“Habermas acknowledges the 'laborious' and incremental learning process of the German government,” Grewal told me, “whilst bemoaning the lack of sufficiently bold and courageous politicians to take the European project forward.…The alternative to the transnationalization of democracy is, Habermas continues to suggest, a sort of post-democratic 'executive federalism', with shades of the opinion poll-watching, media-manipulating approach of figures such as Berlusconi and Putin.”

He acknowledges that there are people who don’t see this as an either-or option. It’s possible to have both continent-spanning constitutional democracy and a political system in which media manipulation and pandering ensure that decision-making continues behind closed doors. Is it ever....

But even aside from that, why does Habermas count on bold and courageous politicians for the kind of change he wants? Part of his frustration, no doubt, is that he’s counting on the actions of people who don’t exist, or get sidelined quickly if they do. Democracy doesn’t come from on high. I respect the man's intentions and persistence, but wish he would come up with a better strategy.

 

 


Review of David R. Koepsell and Robert Arp, "Breaking Bad and Philosophy: Badder Living Through Chemistry"

Intellectual Affairs

In a memorable scene from the first season of "Breaking Bad" (AMC), the protagonist sits down to do some moral bookkeeping of a fairly literal variety. He is a 50-year-old high-school chemistry teacher named Walter White. A recent trip to the doctor to check on a nagging cough has left him with a diagnosis of advanced lung cancer, giving him, at most, a couple of years to live. If you’ve seen the show (and maybe even if you haven’t, since it has received extremely good press and won more awards than I feel like counting), you know that Walter has decided on a hazardous way to provide for his family after his death. He applies his lab skills to the production of crystal methamphetamine.

The stuff he “cooks” (as the term of art goes) is exceptionally pure and powerful. The connoisseurs love it. If he can turn a profit of $737,000 in the time he has left, Walt will leave a nest egg for his wife and children and die in peace. As a middle-class family man, Walt lacks any direct knowledge of the marketing side of the meth business, and would prefer to keep it that way. His connection to the underworld is a former student named Jesse Pinkman, memorable chiefly for his bad grades. But Jesse is a gangsta wannabe, as well as a meth head, and nowhere near as street-savvy as he thinks or the job requires.

And so it comes to pass that Walter finds himself facing an unforeseen problem involving a well-connected figure from the meth supply chain – a fellow who goes by the street name of Krazy-8. It's a long story how he got there, but Krazy-8 ends up shackled by the neck to a pole in Jesse’s basement, and he is understandably, even homicidally, unhappy. Walt must now decide between two options: let Krazy-8 live or kill him.

Being the rational sort, Walt tabulates the arguments on each side. The column headed “Let him live” fills up quickly, if redundantly: “It’s the moral thing to do. Judeo-Christian principles. You are not a murderer. He may listen to reason. Post-traumatic stress. Won’t be able to live with yourself. Murder is wrong!”

Under “Kill him,” the camera reveals just one entry: “He’ll kill your entire family if you let him go.” So much for weighing the alternatives.

In his method -- and ultimately in his actions -- Walt proves to be a consequentialist, as J.C. Donhauser points out in “If Walt’s Breaking Bad, Maybe We Are Too,” one of the essays in Breaking Bad and Philosophy: Badder Living Through Chemistry (Open Court). Most viewers will have surmised as much, even if they don’t have a name for it. But there is more than one metric for judging costs and benefits, and so more than one species of consequentialist. Donhauser -- an assistant instructor of philosophy at the State University of New York at Buffalo and a lecturer at Buffalo State University -- uses examples from other episodes to consider the options. There’s act consequentialism, for one (the realized effects of an act determine whether it is good or bad, even if the consequences are unintended or unforeseeable), which is distinct from rule consequentialism (“actions are better or worse, not in relation to their actual consequences, but in proportion to how far afield they fall from a rule that would be best for most people if everyone followed it”).

As for Walt, he belongs in the ranks of the agent-centered consequentialists, who “judge actions based on their consequences” but “also argue that the most important consequences are for the person carrying out the actions that produce those consequences.”

Each stance has its limitation – quite as much as deontology does. Deontology insists that consequences are irrelevant, since an act can be judged moral if and only if it could be universalized. Murder is immoral, then, because “if everyone did it, there’d be no one around for you to murder then! The same goes for stealing, as there’d be nothing left to steal.” So Jeffrey E. Stephenson put it, with tongue in cheek, in “Walter White’s American Vice.” Ditto for lying, since a society in which everyone lied constantly would be even more irrational than the one we live in.

Walt's list of arguments for letting Krazy-8 live is not deontological by any means -- although “He may listen to reason” rests on a similar conviction that clarity and rationality are not just worthy aspirations but realizable possibilities as well. Despite his nickname and his criminal vocation, Krazy-8 is a well-spoken and seemingly pragmatic individual, with strong family ties of a sort that Walt can respect. And Walt very nearly reaches a decision on that basis.

On the other hand, not every consequence can be put in brackets while you seek the universally right thing to do. And “He’ll kill your entire family if you let him go” is a pretty good example of that. Under the circumstances, even a deontologist would probably find a way to think of murder as obligatory.

Breaking Bad and Philosophy, edited by David R. Koepsell and Robert Arp, is much like any other collection of essays in the Open Court series Popular Culture and Philosophy, of which it is volume 67. By the way, the publisher has registered “Popular Culture and Philosophy” as a trademark. Don't confuse it with The Blackwell Philosophy and Pop Culture Series (37 volumes at last report) or the University of Kentucky’s line called The Philosophy of Popular Culture (23 titles, not counting updated editions).

By now, it seems as if every genre, blockbuster, videogame, superhero, hit program, or teen trend has been covered by at least one book in this niche, or will be in the foreseeable future. I picture them being produced in something akin to Walt’s methamphetamine superlab – with the important exception that Walt’s product is famously consistent in quality. The popcult philosophy collections that I’ve sampled over the years tend to be pretty uneven, even within the same volume. The one constant is that most of the essays are clearly didactic. The implied reader for these books almost always seems to be an undergraduate, with popular culture as the candy coating on the philosophical vitamins otherwise missing from the educational diet. There is jocularity aplenty. In this volume, for example, a comparison of Breaking Bad and Augustine’s Confessions includes the information that the saint-to-be “had a rep for hooking up with the MILFs of Carthage” -- not unlike Peter Abelard, “a famous playa before his lover’s father and brother… cut off his junk and sent him packin’.”

Well, you do what you must to keep the students' attention. With any luck, these books will be the philosophical equivalent of a gateway drug, leading some readers to try the harder stuff.

But there must be more ways to go about it than by reducing every pop-culture phenomenon to a pretext for introducing well-established topics and thinkers. Another constituency for these books is the fan base for whatever cultural commodity gets yoked to philosophy in their titles. It was as a devotee of the show (one who has seen every episode of the first four seasons at least twice) that I bought Breaking Bad and Philosophy in the first place. And the striking thing about the program is that it's all about how decisions, consequences, and responsibility (or the lack of it) get mixed up in ways that no schema can account for very well. That is undoubtedly part of its appeal.

I’ll end by recommending one essay from the book that will reward the attention of anyone who follows the show closely. Titled “Macbeth on Ice,” it is by Ray Bossert, a visiting assistant professor of English at Franklin and Marshall College. He compares "Breaking Bad" and the Scottish play by reference to Aristotle's Poetics, to surprisingly appropriate effect.

In Aristotle’s analysis, the hero in classical tragedy is responsible for his actions and ultimately their victim. His character is admirable and doomed because of some flaw -- excessive pride, for example. That's the one Macbeth and Walter White share. The hero's motives and decisions are transformed as this flaw grows more prominent. It leads him to "incidents arousing pity and fear" in the audience, says Aristotle. "Such incidents have the very greatest effect on the mind when they occur unexpectedly and at the same time in consequence of one another; they arouse more awe than if they happened accidentally and by chance."

In Walt’s case, as his involvement in the meth business deepens, we see that his insistence that everything he does is out of love for his family is a kind of self-deception. More and more evidence of his rage and resentment accumulates. He feels trapped by his family, and his pride has been wounded too many times in his 50 years. As events unfold, Walt feels increasingly confident and powerful, and his running cost-benefit analysis leaves ever more collateral damage.

We believe in the character, writes Bossert, “because, in our own thoughts, we, too, resent being limited to a single role on life’s stage. We pity Walter White, and fear that we might make similar mistakes because we’re like him.” This seems exactly right. Bossert makes no predictions about how "Breaking Bad" will end (it is now counting down its last 16 episodes, 8 this summer and 8 in 2013), nor will I. But Walt has enormous potential in the pity and fear department, and the stage is sure to be covered with bodies before the curtain falls – even more than it already is.

Review of François Noudelmann, "The Philosopher's Touch: Sartre, Nietzsche, and Barthes at the Piano"

Call it philosophical synesthesia: the work of certain thinkers comes with a soundtrack. With Leibniz, it’s something baroque played on a harpsichord -- the monads somehow both crisply distinct and perfectly harmonizing. Despite Nietzsche’s tortured personal relationship with Wagner, the mood music for his work is actually by Richard Strauss. In the case of Jean-Paul Sartre’s writings, or at least some of them, it’s jazz: bebop in particular, and usually Charlie Parker, although it was Dizzy Gillespie who wore what became known as “existentialist” eyeglasses. And medieval scholastic philosophy resonates with Gregorian chant. Having never managed to read Thomas Aquinas without getting a headache, I find that it’s the Monty Python version.

Such linkages are, of course, all in my head -- the product of historical context and chains of association, to say nothing of personal eccentricity. But sometimes the connection between philosophy and music is much closer than that. It exists not just in the mind’s ear but in the thinker’s fingers as well, in ways that François Noudelmann explores with great finesse in The Philosopher’s Touch: Sartre, Nietzsche, and Barthes at the Piano (Columbia University Press).

The disciplinary guard dogs may snarl at Noudelmann for listing Barthes, a literary critic and semiologist, as a philosopher. The Philosopher’s Touch also ignores the principle best summed up by Martin Heidegger (“Horst Wessel Lied”): “Regarding the personality of a philosopher, our only interest is that he was born at a certain time, that he worked, and that he died.” Biography, by this reasoning, is a distraction from serious thought, or, worse, a contaminant.

But then Noudelmann (a professor of philosophy at l’Université Paris VIII who has also taught at Johns Hopkins and New York Universities) has published a number of studies of Sartre, who violated the distinction between philosophy and biography constantly. Following Sartre’s example on that score is a dicey enterprise -- always in danger of reducing ideas to historical circumstances, or of overinterpreting personal trivia.

The Philosopher’s Touch runs that risk three times, taking as its starting point the one habit its protagonists had in common: Each played the piano almost every day of his adult life. Sartre gave it up only as a septuagenarian, when his health and eyesight failed. But even Nietzsche’s descent into madness couldn’t stop him from playing (and, it seems, playing well).

All of them wrote about music, and each published at least one book that was explicitly autobiographical. But they seldom mentioned their own musicianship in public and never made it the focus of a book or an essay. Barthes happily accepted the offer to appear on a radio program where the guest host got to spin his favorite recordings. But the tapes he made at home of his own performances were never for public consumption. He was an unabashed amateur, and recording himself was just a way to get better.

Early on, a conductor rejected one of Nietzsche’s compositions in brutally humiliating terms, asking if he meant it as a joke. But he went on playing and composing anyway, leaving behind about 70 works, including, strange to say, a mass.

As for Sartre, he admitted to daydreams of becoming a jazz pianist. “We might be even more surprised by this secret ambition,” Noudelmann says, “when we realize that Sartre did not play jazz! Perhaps this was due to a certain difficulty of rhythm encountered in jazz, which is so difficult for classical players to grasp. Sight-reading a score does not suffice.” It don’t mean a thing if it ain’t got that swing.

These seemingly minor or incidental details about the thinkers’ private devotion to the keyboard give Noudelmann an entrée to a set of otherwise readily overlooked problems concerning both art -- particularly the high-modernist sort -- and time.

In their critical writings, Sartre and Barthes always seemed especially interested in the more challenging sorts of experimentation (Beckett, serialism, Calder, the nouveau roman, etc.) while Nietzsche was, at first anyway, the philosophical herald of Wagner’s genius as the future of art. But seated at their own keyboards, they made choices seemingly at odds with the sensibility to be found in their published work. Sartre played Chopin. A lot. So did Nietzsche. (Surprising, because Chopin puts into sound what unrequited love feels like, while it seems like Nietzsche and Sartre are made of sterner stuff.) Nietzsche also loved Bizet’s Carmen. His copy of the score “is covered with annotations, testifying to his intense appropriation of the opera to the piano.” Barthes liked Chopin but found him too hard to play, and shifted his loyalties to Schumann – becoming the sort of devotee who feels he has a uniquely intense connection with an artist. “Although he claims that Schumann’s music is, through some intrinsic quality, made for being played rather than listened to,” writes Noudelmann, “his arguments can be reduced to saying that this music involves the body that plays it.”

Such ardor is at the other extreme from the modernist perspective for which music is the ideal model of “pure art, removed from meaning and feeling,” creating, Noudelmann writes, “a perfect form and a perfect time, which follow only their own laws.... Such supposed purity requires an exclusive relation between the music and a listener who is removed from the conditions of the music’s performance.”

But Barthes’s passion for Schumann (or Sartre’s for Chopin, or Nietzsche’s for Bizet) involves more than relief at escaping severe music for something more Romantic and melodious. The familiarity of certain compositions; the fact that they fall within the limits of the player’s ability, or give it enough of a challenge to be stimulating; the way a passage inspires particular moods or echoes them -- all of this is part of the reality that playing music “is entirely different from listening to it or commenting on it.” That sounds obvious but it is something even a bad performer sometimes understands better than a good critic.

“Leaving behind the discourse of knowledge and mastery,” Noudelmann writes, “they maintained, without relent and throughout the whole of their existence, a tacit relation to music. Their playing was full of habits they had cultivated since childhood and discoveries they had made in the evolution of their tastes and passions.” More is involved than sound.

The skills required to play music are stored, quite literally, in the body. It’s appropriate that Nietzsche, Sartre, and Barthes all wrote, at some length, about both the body and memory. Noudelmann could have belabored that point at terrific length and high volume, like a La Monte Young performance in which musicians play two or three notes continuously for several days. Instead, he improvises with skill in essays that pique the reader's interest, rather than bludgeoning it. And on that note, I must now go do terrible things to a Gibson electric guitar.

Essay on Christine Overall, "Why Have Children?"

Intellectual Affairs

“Not to be born is, beyond all estimation, best,” chants the chorus in Sophocles’s Oedipus at Colonus, “but when a man has seen the light of day, this is next best by far, that with utmost speed he should go back from where he came.” They make a quick inventory of life’s miseries, including pain, envy, war, and old age. Which seems like rubbing it in, considering Oedipus is an ex-king who, in the trilogy’s earlier play, tore his own eyeballs out of their sockets.

In any case, the sentiment is hardly original. Consider another king, Midas, of golden-touch fame. He kidnaps Silenus, teacher and drinking companion of the god Dionysus, and demands that he reveal the most desirable thing in the world. Silenus resists answering for a while but finally blurts it out: "Never to have been born." It's not the voice of clinical depression speaking but a nugget of grim wisdom from antiquity. It's Western civilization's way of saying that your parents did you no great favor.

I don’t see much good in arguing the point, one way or the other. Cosmic pessimism is a sensibility, not a proposition. It's not even that dour, necessarily. Silenus doesn't kill himself; in the myths, he seems to be having a pretty good time. If anything, pessimists might find life easier to bear. They’re less likely to be disappointed.

In her new book Why Have Children? The Ethical Debate (MIT Press), Christine Overall, a professor of philosophy at Queen's University in Ontario, assesses the usual grounds for having kids or deciding against it. She scrutinizes them like an IRS accountant in the middle of a ruthless audit. Few claims survive her red pen. To summarize her findings with somewhat reckless brevity: many of the motivations for having children are at least somewhat ethically dubious, while the decision not to have them tends to be less problematic.

“Deciding whether to procreate is a moral decision,” she writes, “…because it affects so many people -- not only the prospective parent(s), but also the prospective child, other family members, and other members of the community. Although one is certainly entitled to take into account the effects of having a child on oneself, if one decides only on the basis of a gamble about one’s future well-being, then one is refusing to treat procreation as a fully moral matter.” Having a baby to boost self-esteem, or save a marriage (does that ever work?), or simply because it's expected of you, grossly underestimates the seriousness of becoming responsible for someone's existence.

Conversely, even if a person's reasons for opting out of reproduction are specious or self-interested, that doesn’t make the decision itself bad. It has little impact on anybody besides the decision-maker, apart from the occasional unhappy would-be grandparent, perhaps.

She is particularly critical of arguments that there is some obligation to have children -- duty to nation or community, for instance, or obedience to a divine command to “be fruitful and multiply.” Her guiding concern is the moral right to autonomous decision-making about whether or not to reproduce. Otherwise, we have “the compulsory and unwilled use of people’s bodies for procreative purposes, whether they are other individuals’ or the state’s purposes.”

Here the phrase “people’s bodies” is a little more gender-neutral than strictly necessary. If presidential candidates or members of Congress tried to outlaw vasectomies, or made sperm-bank donations obligatory -- well, that would be bad, but it’s not something men tend to worry over. Given the extremely asymmetrical distribution of the burdens involved in procreation, the real issue is whether women can decide not to have children. The precondition for making an ethical decision about having children is that it actually be a choice.

Perhaps I’ve made the author sound like an echo of the chorus in Sophocles. She isn’t -- very much the contrary. Overall has two children, and the final pages of her book are a testament to the distinct satisfactions of raising them and seeing them grow into adults. She recognizes that antinatalism (the philosophical brand-name for arguments that coming into the world is a horrid misfortune) tends to be explicitly misogynistic. “The idea that it is better in every case never to have been [born],” she writes, “implies that women’s reproductive labor in pregnancy, birth, breastfeeding, and even rearing children contributes to the accumulation of net harm on this planet.”

For that matter, “the theory can be interpreted to mean that both contraception and abortion should be mandatory” -- hardly an attitude consistent with autonomous decision-making.

But antinatalism isn’t a real force in the world -- while the expectation that if you can have kids, you should, remains fairly strong. Overall’s book is a welcome antidote.

“Children are not essential to all good lives,” she writes, “nor are having and rearing children prerequisites to becoming a good person. Moreover, there are many childless persons who support, love, care for, and teach other people’s children. Chosen childlessness has as much potential for the good life as chosen parenthood has.”

There is more to this passage in a similar vein. It appears on page 219. I mention it because some readers might want to photocopy it to post on the refrigerator door, when the family comes around.

Making the case for dissolving the American Philosophical Association


Is it time for the American Philosophical Association to be euthanized? A philosopher appointed to a committee to look into the organization’s future asks the provocative question.

Review of Emrys Westacott, "The Virtues of Our Vices"

Intellectual Affairs

In search of a rationale to avoid making any New Year’s resolutions, I was glad to see that Princeton University Press has issued a book called The Virtues of Our Vices: A Modest Defense of Gossip, Rudeness, and Other Bad Habits, by Emrys Westacott, a professor of philosophy at Alfred University.

It is not, alas, a handbook on self-improvement through creative rationalization. Two chapters started out as papers in the International Journal of Applied Philosophy, and Westacott’s project falls under a specialized heading within the humanistic division of labor: “microethics.”

The term was unfamiliar. A look at The Oxford Companion to Philosophy and similar reference works proved fruitless. Searching the Library of Congress catalog turned up no book or journal titles mentioning microethics, nor was it a keyword in any subject heading. Missing from the LoC’s holdings, but locatable online, is Paul Komesaroff’s Experiments in Love and Death: Medicine, Postmodernism, Microethics and the Body (Melbourne University Press, 2008). Komesaroff, a physician and professor of medicine at Monash University in Australia, contrasts microethical decision-making with the more general level of bioethical argument.

The issues that bioethicists discuss (cloning, euthanasia, animal rights, etc.) are matters of public debate, while microethical questions arise in a clinical setting – often enough while doctor and patient are face-to-face, with a piece of bad news between them. “Microethics,” writes Komesaroff, “is in general not the terrain of arresting cases involving heroic decisions or extraordinary circumstances…. Indeed, this may be one reason for the relative lack of attention it has attracted. Rather, it is the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.”

He gives as an example the obligations of a physician when conveying unwelcome results from a biopsy. Here, the microethical question is not whether to be honest. That is a given. But the moment of truth will reverberate for the patient throughout whatever may be left of his or her life. The duty to render a prognosis is complicated by the possibility of creating false hope or absolute despair. The dilemma is both fine-grained and profoundly consequential.

By contrast, the microethical issues that interest Westacott seem like decidedly smaller beans. The subtitle of The Virtues of Our Vices mentions gossip and rudeness. In addition, there are chapters on snobbery and offensive jokes, as well as an investigation of the balancing act involved in respecting the opinions of others. On the one hand, people have a right to their beliefs. On the other hand, it is an inescapable reality that those beliefs are sometimes uninformed, irrational, or downright insane.

None of these issues are a matter of life or death, as such, though I suppose they could be, if you offended the wrong person. But they all fit Komesaroff’s definition of the microethical domain as “the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.” They are problems that flash up in the course of routine social interaction, with ambiguities that can make things even more complicated. Deciding whether a given action or remark was rude or snobbish is not always easy -- even for the person responsible for it.

Questions about right and wrong concerning everyday interaction “take up the bulk of whatever time most of us spend in moral reflection and decision making,” writes Westacott. “[O]ur everyday thinking and conduct regarding commonplace matters are the most important indicators, both to ourselves and to others, of our true moral values and character. Certainly, they count for more than purely hypothetical scenarios in which we imagine how we would handle terrible dilemmas involving lifeboats, terrorists, deathbed promises, or runaway trains.”

Quite true: It’s hard to remember the last time I had to decide if it would be okay to torture a prisoner to extract information about a ticking time bomb. The microethical questions of everyday life tend to be less stark, though not necessarily more simple. The very familiarity of an experience such as rudeness means that we ordinarily do without formal definitions of what counts as rude and what doesn’t. It is the microethicist's task to specify what is otherwise left implicit in such terms. That can take some doing, as shown by Westacott's labor to synthesize a precise definition of snobbery. It takes six efforts. (The final product: "believing without sufficient justification that you are superior to another person in certain respects because you belong to or are associated with some group that you think places you above them in a social hierarchy.")

Plenty of gray areas remain even after the terms have been clarified. It's possible to generate decision trees for judging if a piece of gossip is damaging and unethical, or whether a given violation of social norms will be assessed as rude. And so Westacott does -- seeming to sharpen the distinctions between good and bad conduct at the microethical level. But at the same time, the author reckons the possible benefits of various vices, as well as their costs. Gossip, for example, is typically criticized as evidence of "shallow living," writes Westacott, "something we are continually discovering new ways to achieve." But that is one-sided. "Since one of the benefits gossip can bring is a deeper understanding of human nature and social institutions ... it is more plausible to think that a willingness to talk about people -- which at times will involve gossiping -- may be an integral part of 'the examined life.' This is why we find Socrates, in Platonic dialogues like the Meno and the Gorgias, freely discussing the failings of others in the course of his philosophical inquiries."

Not to push the comparison too hard, but in Westacott's microethical analyses, as with Socratic badinage, it's the process of inquiry, as much as the result, that engages the reader's interest. His tree-chart algorithms probably won't be that useful to anyone having to make a decision. But they reveal some of the implicit choices that we often make very quickly when dealing with other people. The unexamined life may not be worth living, but it is, after all, where we spend most of our time. The Virtues of Our Vices shines a little light in that direction.

Faulty Forecast?

New analysis of "climate" for women in graduate philosophy programs -- conducted without input of actual grad students -- has infuriated many.

Last Bastion of Liberal Education?

Why do narratives of decline have such perennial appeal in the liberal arts, especially in the humanities? Why is it that, year after year, meeting after meeting, we hear laments about the good old days and predictions of ever worse days to come? Why is such talk especially common in elite institutions where, by many indicators, liberal education is doing quite well, thank you very much? I think I know why. The opportunity is just too ripe for the prophets of doom and gloom to pass up.

There is a certain warmth and comfort in being inside the “last bastion of the liberal arts,” as B.A. Scott characterized prestigious colleges and research universities in his collection of essays The Liberal Arts in a Time of Crisis (New York: Praeger, 1990). The weather outside may be frightful, but inside the elite institutions, if not “delightful,” it’s perfectly tolerable, and likely to remain so until retirement time.

Narratives of decline have also been very useful to philanthropy, but in a negative way. As Tyler Cowen recently noted in The New York Times, “many donors … wish to be a part of large and successful organizations -- the ‘winning team’ so to speak.” They are not eager to pour out their funds in order to fill a moat or build a wall protecting some isolated “last bastion.” Narratives of decline provide a powerful reason not to reach for the checkbook. Most of us in the foundation world, like most other people, prefer to back winners rather than losers. Since there are plenty of potential winners out there, in areas of pressing need, foundation dollars have tended to flow away from higher education in general, and from liberal education in particular.

But at the campus level there’s another reason for the appeal of the narrative of decline, a genuinely insidious one. If something goes wrong the narrative of decline of the liberal arts always provides an excuse. If course enrollments decline, well, it’s just part of the trend.  If students don’t like the course, well, the younger generation just doesn’t appreciate such material. If the department loses majors, again, how can it hope to swim upstream when the cultural currents are so strong?  Believe in a narrative of decline and you’re home free; you never have to take responsibility, individual or collective, for anything having to do with liberal education.  

There’s just one problem. The narrative of decline is about one generation out of date and applies now only in very limited circumstances. It’s true that in 1890, degrees in the liberal arts and sciences accounted for about 75 percent of all bachelor’s degrees awarded; today the number is about 39 percent, as Patricia J. Gumport and  John D. Jennings noted in “Toward the Development of Liberal Arts Indicators” (American Academy of Arts and Sciences, 2005). But most of that decline had taken place by 1956, when the liberal arts and sciences had 40 percent of the degrees. 

Since then the numbers have gone up and down, rising to 50 percent by 1970, falling to 33 percent by 1990, and then rising close to the 1956 levels by 2001, the last year for which the data have been analyzed. Anecdotal evidence, and some statistics, suggest that the numbers continue to rise, especially in  Research I universities.  

For example, in the same AAA&S report (“Tracking Changes in the Humanities”) from which these figures have been derived, Donald Summer examines the University of Washington (“Prospects for the Humanities as Public Research Universities Privatize their Finances”) and finds that majors in the humanities have been increasing over the last few years and course demand is strong.

The stability of liberal education over the past half century seems to me an amazing story, far more compelling than a narrative of decline, especially when one recognizes the astonishing changes that have taken place over that time: the vast increase in numbers of students enrolled in colleges and universities,  major demographic changes, the establishment of new institutions, the proliferation of knowledge, the emergence of important new disciplines, often in the applied sciences and engineering, and, especially in recent years, the financial pressures that have pushed many institutions into offering majors designed to prepare students for entry level jobs in parks and recreation, criminal justice, and now homeland security studies. And, underlying many of these changes, transformations of the American economy.    

The Other, Untold Story

How, given all these changes, and many others too, have the traditional disciplines of the arts and sciences done as well as they have? That would be an interesting chapter in the history of American higher education. More pressing, however, is the consideration of one important consequence of narratives of decline of the liberal arts.

This is the “last bastion” mentality, signs of which are constantly in evidence when liberal education is under discussion. If liberal education can survive only within the protective walls of elite institutions, it doesn’t really make sense to worry about other places. Graduate programs, then, will send the message that success means teaching at a well-heeled college or university, without any hint that with some creativity and determination liberal education can flourish in less prestigious places, and that teaching there can be as satisfying as it is demanding.

Here’s one example of what I mean. In 2000, as part of a larger initiative to strengthen undergraduate liberal education,  Grand Valley State University, a growing regional public institution in western Michigan, decided to establish a classics department. Through committed teaching, imaginative curriculum design, and with strong support from the administration, the department has grown to six tenured and tenure track positions with about 50 majors on the books at any given moment. Most of these are first-generation college students from blue-collar backgrounds who had no intention of majoring in classics when they arrived at Grand Valley State, but many have an interest in mythology or in ancient history that has filtered down through popular culture and high school curricula. The department taps into this interest through entry-level service courses, which are taught by regular faculty members, not part timers or graduate students.

That’s a very American story, but the story of liberal education is increasingly a global one as well.  New colleges and universities in the liberal arts are springing up in many countries, especially those of the former Soviet Union.

I don’t mean that the spread of liberal education comes easily, in the United States or elsewhere. It’s swimming upstream. Cultural values, economic anxieties, and all too often institutional practices (staffing levels, salaries, leave policies and research facilities) all exert their downward pressure. It takes determination and devotion to press ahead. And those who do rarely get the recognition or credit they deserve.

But breaking out of the protective bastion of the elite institutions is vital for the continued flourishing of liberal education. One doesn’t have to read a lot of military history to know what happens to last bastions. They get surrounded; they eventually capitulate, often because those inside the walls squabble among themselves rather than devising an effective breakout strategy. We can see that squabbling at work every time humanists treat with contempt the quantitative methods of their scientific colleagues and when scientists contend that the reason we are producing so few scientists is that too many students are majoring in other fields of the liberal arts.  

The last bastion mentality discourages breakout strategies. Even talking to colleagues in business or environmental studies can be seen as collaborating with the enemy rather than as a step toward broadening and enriching the education of students majoring in these fields. The last bastion mentality, like the widespread narratives of decline, injects the insidious language of purity into our thinking about student learning, hinting that any move  beyond the cordon sanitaire is somehow foul or polluting and likely to result in the corruption of high academic standards.   

All right, what if one takes this professed concern for high standards seriously? What standards, exactly, do we really care about and wish to see maintained? If it’s a high level of student engagement and learning, then let’s say so, and be forthright in the claim that liberal education is reaching that standard, or at least can reach that standard if given half a chance. That entails, of course, backing up the claim with some systematic form of assessment.

That provides one way to break out of the last bastion mentality. One reason that liberal education remains so vital  is that when properly presented it contributes so much to personal and cognitive growth. The subject matter of the liberal arts and sciences provides some of the best ways of helping students achieve goals such as analytical thinking, clarity of written and oral expression,  problem solving, and alertness to moral complexity, unexpected consequences and cultural difference. These goals command wide assent outside academia, not least among employers concerned about the quality of their work forces. They are, moreover, readily attainable  through liberal education provided proper attention is paid to “transference.”  “High standards” in liberal education require progress toward these cognitive capacities.

Is it not time, then, for those concerned with the vitality of liberal education to abandon the defensive strategies that derive from the last bastion mentality, and adopt a new and much more forthright stance? Liberal education cares about high standards of student engagement and learning, and it cares about them for all students regardless of their social status or the institution in which they are enrolled.

There is, of course, a corollary. Liberal education can’t just make the claim that it is committed to such standards, still less insist that others demonstrate their effectiveness in reaching them, unless those of us in the various fields of the arts and sciences are willing to put ourselves on the line. In today’s climate  we have to be prepared to back up the claim that we are meeting those standards. Ways to make such assessments are now at hand, still incomplete and imperfect, but good enough to provide an opportunity for the liberal arts and sciences to show what they can do.

That story, I am convinced, is far more compelling than any narrative of decline.


W. Robert Connor is president of the Teagle Foundation and blogs frequently about liberal education.

If Not Religion, What?

In a variety of arenas, from politics to high schools, from colleges to the military, Americans argue as though the proper face-to-face discussion in our society ought to be between religion and science. This is a misunderstanding of the taxonomy of thought. Religion and science are in different families on different tracks: science deals with is vs. isn’t and religion, to the extent that it relates to daily life, deals with should vs. shouldn’t.

These are fundamentally different trains. They may hoot at each other in passing, and many people attempt to switch them onto the same track (mainly in order to damage science), but this is an act of the desperate, not the thoughtful.

It is true that a portion of religious hooting has to do with is vs. isn’t questions, in the arena of creationism and its ancillary arguments. However, this set of arguments, important as it might be for some religious people, is not important to a great many (especially outside certain Protestant variants), while the moral goals and effects of religious belief are a far more common and widespread concern among many faiths. I was raised in Quaker meeting, where we had a saying: Be too busy following the good example of Jesus to argue about his metaphysical nature.

Until recently, most scientists didn’t bother trying to fight with religion; for the most part they ignored it or practiced their own faiths. However, in recent years Carl Sagan, Richard Dawkins, Daniel Dennett and Sam Harris have decided to enter the ring and fight religion face to face. The results have been mixed. I have read books by all of these authors on this subject, as well as the interesting 2007 blog exchange between Harris and Andrew Sullivan, one of the best writers active today and a practicing Catholic, and it is clear that a great deal of energy is being expended firing heavy ordnance into black holes with no likelihood of much effect.

The problem that the scientific horsemen face is that theirs is the language of is/isn’t. Their opponents (mostly Christians but by implication observant Jews and Muslims as well) don’t use the word “is” to mean the same thing. To a religious person, God is and that’s where the discussion begins. To a nonreligious scientist, God may or may not be, and that is where the discussion begins.

The two sides, postulating only two for the moment, are each on spiral staircases, but the stairs wind around each other and never connect: this is the DNA of unmeeting thoughts. Only shouting across the gap happens, and the filters of meaning are not aligned. That is why I don’t put much faith, you’ll pardon the expression, in this flying wedge of scientific lancers to change very many minds.

Dennett’s approach is quite different from the others at a basic level; he views religious people as lab rats and wants to study why they squeak the way they do. That way of looking at the issue seems insulting at first but is more honest and practical in that it doesn’t really try to change minds that are not likely to change.

But these arguments are the wrong ones at a very basic level, especially for our schools and the colleges that train our teachers. The contrapuntal force to religion, that force which is in the same family, if a different genus, speaks the same language in different patterns regarding the same issues. It is not science, it is philosophy. That is what our teachers need to understand, and this distinction is the one in which education colleges should train them.

Those of us who acknowledge the factual world of science as genuine and reject the idea of basing moral and “should” questions in the teachings of religion are left seeking an alternate source for sound guidance. Our own judgment based in experience is a strong basic source. The most likely source, the ‘respectable’ source with sound academic underpinnings that can refine, inform and burnish our judgment, is philosophy in its more formal sense.

The word “philosophy” conjures in many minds the image of dense, dismal texts written by oil lamp with made-up words in foreign languages, and far beyond mortal ken. In fact, many writers on philosophy are quite capable of writing like human beings; some of their books are noted below.

When we introduce more religious studies into our K-12 schools, as we must if people are ever to understand each other’s lives, the family of learning into which they must go also contains philosophy. It is this conversation, between the varieties of religious outlooks and their moral conclusions, and the same questions discussed by major philosophers, that needs to happen.

Philosophy is not all a dense, opaque slurry of incomprehensible language. Some excellent basic books are available that any reasonably willing reader can comprehend and enjoy. Simon Blackburn’s Think, Robert Solomon and Kathleen Higgins’ A Passion for Wisdom and Erik Wielenberg’s Value and Virtue in a Godless Universe are some recent examples.

An older text providing a readable commentary on related issues is John Jay Chapman’s Religion and Letters, still in print in his Collected Works but hard to find in the original, single volume. Chapman wrote of changes in our school system that:

“It is familiarity with greatness that we need—an early and first-hand acquaintance with the thinkers of the world, whether their mode of thought was music or marble or canvas or language. Their meaning is not easy to come at, but in so far as it reaches us it will transform us. A strange thing has occurred in America. I am not sure that it has ever occurred before. The teachers wish to make learning easy. They desire to prepare and peptonize and sweeten the food. Their little books are soft biscuits for weak teeth, easy reading on great subjects, but these books are filled with a pervading error: they contain a subtle perversion of education. Learning is not easy, but hard: culture is severe.”

This, published in 1910, is remarkably relevant to education at all levels today. The idea that philosophy is too hard for high school students, which I doubt, simply means that we need to expect more of students all through K-12. Many of them would thank us.

Paul Kurtz’s Affirmations and my brother John Contreras’s Gathering Joy are interesting “guidebooks” that in effect apply philosophical themes in an informal way to people’s real lives. There are also somewhat more academic books that integrate what amount to philosophical views into daily life such as Michael Lynch’s True to Life: Why Truth Matters, physicist Alan Lightman’s A Sense of The Mysterious and the theologian John O’Donohue’s Beauty: The Invisible Embrace.

Some of these are denser than others and not all are suited for public schools, but the ideas they discuss are often the same ideas discussed in the context of religions, and sometimes with similar language. It is this great weave of concepts that our students should be exposed to, the continuum of philosophical thought blended with the best that different religions have to offer.

The shoulds and shouldn’ts that are most important to the future of our society need to be discussed in colleges, schools and homes, and the way to accomplish this is to bring religions and philosophies back to life as the yin and yang of right and wrong. That is the great conversation that we are not having.


Alan L. Contreras has been administrator of the Oregon Office of Degree Authorization, a unit of the Oregon Student Assistance Commission, since 1999. His views do not necessarily represent those of the commission. He blogs at http://oregonreview.blogspot.com.
