In search of a rationale to avoid making any New Year’s resolutions, I was glad to see that Princeton University Press has issued a book called The Virtues of Our Vices: A Modest Defense of Gossip, Rudeness, and Other Bad Habits, by Emrys Westacott, a professor of philosophy at Alfred University.
It is not, alas, a handbook on self-improvement through creative rationalization. Two chapters started out as papers in the International Journal of Applied Philosophy, and Westacott’s project falls under a specialized heading within the humanistic division of labor: “microethics.”
The term was unfamiliar. A look at The Oxford Companion to Philosophy and similar reference works was to no avail. Searching the Library of Congress catalog turned up no book or journal titles mentioning microethics, nor was it a keyword in any subject heading. Missing from the LoC’s holdings, but locatable online, is Paul Komesaroff’s Experiments in Love and Death: Medicine, Postmodernism, Microethics and the Body (Melbourne University Press, 2008). Komesaroff, a physician and professor of medicine at Monash University in Australia, contrasts microethical decision-making to the more general level of bioethical argument.
The issues that bioethicists discuss (cloning, euthanasia, animal rights, etc.) are matters of public debate, while microethical questions arise in a clinical setting – often enough while doctor and patient are face-to-face, with a piece of bad news between them. “Microethics,” writes Komesaroff, “is in general not the terrain of arresting cases involving heroic decisions or extraordinary circumstances…. Indeed, this may be one reason for the relative lack of attention it has attracted. Rather, it is the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.”
He gives as an example the obligations of a physician when conveying unwelcome results from a biopsy. Here, the microethical question is not whether to be honest. That is a given. But the moment of truth will reverberate for the patient throughout whatever may be left of his or her life. The duty to render a prognosis is complicated by the possibility of creating false hope or absolute despair. The dilemma is both fine-grained and profoundly consequential.
By contrast, the microethical issues that interest Westacott seem like decidedly smaller beans. The subtitle of The Virtues of Our Vices mentions gossip and rudeness. In addition, there are chapters on snobbery and offensive jokes, as well as an investigation of the balancing act involved in respecting the opinions of others. On the one hand, people have a right to their beliefs. On the other hand, it is an inescapable reality that sometimes those beliefs are uninformed, irrational, or downright insane.
None of these issues are a matter of life or death, as such, though I suppose they could be, if you offended the wrong person. But they all fit Komesaroff’s definition of the microethical domain as “the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.” They are problems that flash up in the course of routine social interaction, with ambiguities that can make things even more complicated. Deciding whether a given action or remark was rude or snobbish is not always easy -- even for the person responsible for it.
Questions about right and wrong concerning everyday interaction “take up the bulk of whatever time most of us spend in moral reflection and decision making,” writes Westacott. “[O]ur everyday thinking and conduct regarding commonplace matters are the most important indicators, both to ourselves and to others, of our true moral values and character. Certainly, they count for more than purely hypothetical scenarios in which we imagine how we would handle terrible dilemmas involving lifeboats, terrorists, deathbed promises, or runaway trains.”
Quite true: It’s hard to remember the last time I had to decide if it would be okay to torture a prisoner to extract information about a ticking time bomb. The microethical questions of everyday life tend to be less stark, though not necessarily simpler. The very familiarity of an experience such as rudeness means that we ordinarily do without formal definitions of what counts as rude and what doesn’t. It is the microethicist's task to specify what is otherwise left implicit in such terms. That can take some doing, as shown by Westacott's labor to synthesize a precise definition of snobbery. It takes six attempts. (The final product: "believing without sufficient justification that you are superior to another person in certain respects because you belong to or are associated with some group that you think places you above them in a social hierarchy.")
Plenty of gray areas remain even after the terms have been clarified. It's possible to generate decision trees for judging if a piece of gossip is damaging and unethical, or whether a given violation of social norms will be assessed as rude. And so Westacott does -- sharpening the distinctions between acceptable and objectionable microethical conduct. But at the same time, the author reckons the possible benefits of various vices, as well as their costs. Gossip, for example, is typically criticized as evidence of "shallow living," writes Westacott, "something we are continually discovering new ways to achieve." But that is one-sided. "Since one of the benefits gossip can bring is a deeper understanding of human nature and social institutions ... it is more plausible to think that a willingness to talk about people -- which at times will involve gossiping -- may be an integral part of 'the examined life.' This is why we find Socrates, in Platonic dialogues like the Meno and the Gorgias, freely discussing the failings of others in the course of his philosophical inquiries."
Not to push the comparison too hard, but in Westacott's microethical analyses, as with Socratic badinage, it's the process of inquiry, as much as the result, that engages the reader's interest. His tree-chart algorithms probably won't be that useful to anyone having to make a decision. But they reveal some of the implicit choices that we often make very quickly when dealing with other people. The unexamined life may not be worth living, but it is, after all, where we spend most of our time. The Virtues of Our Vices shines a little light in that direction.
Six years ago, Yale University Press published A Little History of the World by E.H. Gombrich, which appeared to much acclaim and has by now sold 500,000 copies -- impressive for a trade publisher, and epochal for a university press. The great art historian had written it, his first book, in Austria during the Depression, mainly to pay the bills. It enjoyed some popularity in Europe over the years, though nothing like the success of his classic The Story of Art (1950). While “ostensibly written for teenagers,” says the entry on Gombrich in The Dictionary of Art Historians, it had “a huge impact on the general post-war populace.” According to an article in ArtNews, it had by 2006 sold more than 8 million copies in at least 30 languages. The Story of Art is one of the rare examples of a textbook that not only outlives its author but proves genuinely beloved by readers. “I never believed that books for young people should differ from books for adults,” Gombrich wrote in its preface, “except for the fact that they must reckon with the most exacting class of critics, critics who are quick to detect and resent any trace of jargon or bogus sentiment.”
A Little History of the World is, if anything, an even more deft feat of popularization, since its target audience is about 10 years old. Exact data are not at hand, but quite a few of the half-million copies it's sold so far were almost certainly purchased for adult consumption. And no shame in that. Better to know A Little History than to know none at all. At least Gombrich respects his public enough not to call them dummies.
Later this month, Yale is bringing out A Little History of Philosophy by Nigel Warburton -- published in a format that mimics the earlier volume in every particular, from cover design and length to the woodblock-like artwork appearing at the head of each chapter.
The word for this sort of thing is “branding.” My initial response to it was something less than joyous. In 2005, I reviewed Gombrich’s book for a newspaper, and put down additional thoughts on it for this column; and a few people have indicated they were encouraged to look for the book on the basis of my ardent tub-thumping on its behalf, which was as heartfelt as it could possibly be. But that was based on admiration for the singular generosity of Gombrich’s style. (The author was translating and revising the book himself when he died in 2001, and it reflects decades of finesse in his adopted language.) The odds of lightning striking twice did not seem good.
The dust jacket says that the author, Nigel Warburton, lectures on philosophy at The Open University and the Tate Modern, both in England, and “hosts a weekly podcast and an integrated philosophy website.” Writing for The Guardian, Julian Baggini, editor of The Philosopher’s Magazine, says that Warburton “has quietly become quite one of the most-read popular philosophers of our time. Over nearly two decades his Philosophy: The Basics has sold in excess of 100,000 copies, with no gimmicks, no literary flourishes, just admirable clarity, concision and accuracy.”
That said, I must admit that name rang no bells. My initial, spontaneous response to A Little History of Philosophy was simply that the author faced an impossible task. And in Gombrich, he also had an impossible act to follow.
And yet, the book is pretty good. Warburton has many of the Gombrichian virtues. While reading A Little History of Philosophy, I jotted down notes in an effort to characterize it -- only to realize that they were, point for point, things I'd said about A Little History of the World, six years ago. For example: “Concise but substantial, without condescension, somewhat limited in its purview (focus is on the West) but written with just enough wryness to be charming.”
Machiavelli, Darwin, Freud, and Alan Turing are all covered, although none of them was a philosopher, exactly. At the same time, Martin Heidegger makes only a very brief appearance, in a chapter on his one-time girlfriend Hannah Arendt. This would undoubtedly have bothered both of them.
Warburton's survey covers strictly European and (to a lesser extent) American philosophy. A handful of thinkers from elsewhere do turn up, but very much in passing. The Buddha gets a nod in the chapter on Schopenhauer, for example. Jewish and Arabic philosophy flourished during centuries when Christendom was anything but reflective. But the only trace of them here is the names (and only the names) of Maimonides and Avicenna.
The selection, then, is debatable, and the task itself almost unimaginable (at least by the standards of academe, where it is permissible to write a 500-page monograph containing the phrase “space does not permit me to consider….” in each chapter); but the book has a certain quality that comes from accepting a challenge under severe conditions, then taking it on without making a big deal of the whole thing. And the word for that quality is grace.
It requires more than a knack for brevity. The question of what role biography ought to play in writing the history of philosophy is not a simple one. Heidegger’s treatment of the life and times of Aristotle at the start of a lecture (“He was born. He thought. He died.”) is legendary, but not, perhaps, the final word on the matter. At the same time, reducing complex ideas to personal or social factors – as with sensationalistic treatments of Heidegger himself – is no real service to anyone trying to get some bearings on the history of philosophy.
A Little History untangles that Gordian knot in tried and true manner, because saying anything in six pages means cutting through things without hesitation. To stick with the example of Aristotle, this means discussing a single text (in this case, the Nicomachean Ethics) and just enough context to connect him with the previous chapter (on Socrates and Plato) while setting up the next (on Pyrrho, the extreme skeptic, whose work stands in a nice contrast to the authoritarian dogmatism around Aristotle in later centuries).
Warburton zeroes in on the concept of eudaimonia -- meaning “happiness” or, better, “flourishing” -- and explains that it is “pronounced ‘you-die-monia’ but mean[s] the opposite” (a fitting and even helpful play on words). He sketches the psychological, moral, and social implications of eudaimonia.
And that’s that – time to move along. Of course, it means reducing the Peripatetic’s thought to the size of a postage stamp. But no better approach seems obvious, given the circumstances. (It's not as if covering the logical or metaphysical writings in six pages is an option.) The focus on eudaimonia also helps to set up the later chapter on Kant’s very different understanding of ethics. When exhaustiveness is not an option, efficiency counts for something.
Yale University Press has more Little History titles on the way. (So I am told by a publicist, who kept their titles close to the vest.) The public should hold them to the standard set by the first two volumes. In the preface to The Story of Art, Gombrich spelled out the duties and the benefits of this kind of work: “It may serve to show newcomers the lay of the land without confusing them with details; to enable them to bring some intelligible order into the wealth of names, periods, and styles which crowd the pages of more ambitious works, and so equip them for consulting more specialized books.” A creditable ambition, and a demanding one. The only easy thing about it is how it looks.
“It is evident,” declared Aristotle, expecting no argument, that “two ways of life are the ones intentionally chosen by those human beings who are most ambitious with a view to virtue, both in former times and the present; the two I mean are the political and the philosophic.”
The strange word here is “virtue,” which carries a lot of baggage for the modern reader. Anyone too preoccupied with virtue is, by contemporary standards, presumably guilty of something until proven otherwise, and maybe not even then.
So it bears keeping in mind that, in Aristotle’s usage, “virtue” is almost a piece of technical jargon. It refers to a form of excellence that, as you pursue it, leads toward profound happiness and a richer life – a condition of human flourishing.
Being “ambitious with respect to virtue,” then, is not as grim as it may sound. Likewise, we have to shed a little cynicism in order to understand why Aristotle would single out politics and philosophy as ideal venues for pursuing that ambition. He understood them, not as professions, let alone as rackets, but rather as activities manifesting and enhancing our nature as social and rational animals.
At the same time, politics and philosophy pull in different directions -- one toward civic engagement, the other into deep and prolonged reflection. Aristotle was all about finding a happy medium, but in the final analysis he thought that intellectual contemplation was the highest form of virtue/excellence. (This is hardly surprising. He was a philosopher, after all.)
Mary Ann Glendon’s The Forum and the Tower: How Scholars and Politicians Have Imagined the World, from Plato to Eleanor Roosevelt, published by Oxford University Press, is a meditation on this theme from Aristotle by someone who has served as both an academic and a diplomat. (Glendon, a professor at the Harvard University Law School, was a U.S. Ambassador to the Vatican during George W. Bush’s second term.)
The book consists of a series of biographical essays on figures who moved between scholarship and statecraft, or at least desired to bring them together. “History provides few examples of prominent political actors who, like Cicero or Edmund Burke, are remembered for important contributions to political thought as well as for distinguished public service,” Glendon writes. “As for political theorists who have ventured into politics, some of the most eminent – Plato, Tocqueville, and Weber, for example – were strikingly ineffective in the public arena.” She devotes a chapter to each of these figures, plus a few others, drawing as much on their memoirs and private papers as their books or speeches. In style and spirit, The Forum and the Tower is much closer to a book like Will Durant's The Story of Philosophy (1926) than to a monograph.
The author describes the essays as “loosely linked,” which seems a fair description. Unlike Aristotle -- who is forever generating categorical distinctions, weighing alternatives, and lining things up neatly – Glendon is not particularly driven to analysis. Her examples from history converge on a simple point: the politician and the serious thinker embody distinct capacities, seldom found together in a single person. That principle was already recognized in ancient Athens. And Max Weber had pretty much the last word on the subject in two lectures, “Science as a Vocation” (1917) and “Politics as a Vocation” (1919). Glendon's discussion of Weber, near the end of the book, epitomizes her concern with the difficulty of bridging the distance between “the forum” (where political decisions are made) and “the tower” (as in, ivory).
In particular, Weber’s thoughts on politics as an “ethics of responsibility” seem framed as a warning. The political actor “has to be able to deal with the world as it is,” she writes, “taking human frailty into account and even using it for his purposes. He must be able to bear the irrationality of the world in which evil sometimes comes from good and good sometimes comes from evil. He has to understand that the attainment of good ends may even require using morally dubious or at least dangerous means, and that if one chases after the ultimate good, then the good he seeks may be damaged or discredited for generations…. What is decisive, said Weber, ‘is the trained relentlessness in viewing the realities of life and the ability to face such realities and to measure up to them inwardly.’”
She seems to be saying in response to Aristotle that no matter how highly you rate contemplation, the political leader's task requires the rarest virtue.
A few words about the politics of the author, and of the book itself, seem in order. In 2009, Glendon declined the Laetare Medal from the University of Notre Dame when she learned that the institution would be granting an honorary degree to President Obama at the same ceremony. I read The Forum and the Tower without knowing this, though with hindsight it is illuminating.
Glendon names two exceptional cases of leaders who also produced lasting works of scholarship, Cicero and Edmund Burke. Both, as it happens, were conservatives. She identifies Henry Kissinger as another “statesman-scholar,” which is certainly one thing you can call him, if not the one I find springing to mind. The citations from secondary literature are infrequent and tend to come from figures such as Harvey Mansfield, Thomas Pangle, Conor Cruise O’Brien, and Paul Johnson – all of them reliably conservative.
Eleanor Roosevelt appears in the subtitle of the book, rather anomalously. She makes a brief appearance in the final chapter, which is devoted to the Lebanese philosopher Charles Malik’s role on the United Nations committee that drew up the Universal Declaration of Human Rights. The former First Lady chaired the committee; its work is the subject of an earlier book by Glendon. But the whole chapter feels a bit tacked on – almost an effort to impose some balance at the last possible moment. In a way the book really ends with Max Weber’s brooding thoughts on the good and evil that men do.
But that leaves me wishing that Glendon had ventured beyond popularized history and rather broad points about the gap between statecraft and the life of the mind. It would be a better book for addressing her own experience in shuttling between forum and tower -- and for posing questions about the relationship between conservative thought and action. "If one chases after the ultimate good, then the good he seeks may be damaged or discredited for generations" would serve as a critique of various right-wing luminaries, but it's never clear whether or not Glendon means it as one.
I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty, and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.
First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.
Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.
Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet, but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling akin to Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.
Problem One: Outmoded Presentations
We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.
I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters took advantage of new technology, but presentations seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?
The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.
I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.
Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.
Problem Two: Expense
Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.
Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)
An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference lands in a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.
Problem Three: Victimized Grad Students
I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Unless they have to be there, there aren’t many junior colleagues in attendance at all because they're busy getting material into publication and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious (-sounding) international gatherings.
So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.
Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days in which there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?
The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.
As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much greater than buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the savings to the association to lobby for more tenure-track faculty.
Problem Four: No-Shows
You spend lots of money, you sit through desultory talks, and head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?
As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.
Problem Five: Urban Sprawl
What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.
In Praise of Small Conferences
There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.
Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn’t fit most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)
Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.
In the early 1970s, a French publisher issued a sort of photo album devoted to Jean-Paul Sartre, who was then the most famous philosopher in the world. He had been for some while, so the photojournalistic dossier on him was quite full. The book teems with pictures of him alongside equally famous figures from the world stage -- Camus and Castro, for example, and Simone de Beauvoir, of course. You also see him in the midst of dramatic events, as when he addressed an assembly of revolutionary students during May ’68. There are a few images of the philosopher in a less public capacity. As I recall, there is a baby portrait or two. There are also pictures of the Sartrean babes, who seemed to get younger as he got older.
The man was a philosophical action figure, to be sure. But my favorite pages in the book show him at his desk, with manuscripts piled up precariously nearby, or at a café table, scribbling away. Sartre once said that he felt like a machine while working on The Critique of Dialectical Reason, grinding out each day’s quota of concepts. And that’s what’s happening in those photographs of him with pen in hand and tobacco pipe in jaw -- tuning out everything but the hard work of philosophizing. But who knows? A photograph cannot document thought. It’s entirely possible that Sartre was updating his schedule to accommodate a new girlfriend, rather than analyzing Stalinism.
The same brain did both -- a fact that lends itself to philosophical inquiry. Just where do you draw the line between task-oriented thinking and whatever it is philosophers do while they are “doing philosophy”? It is a conundrum.
In his new book Philosophers, from Oxford University Press, the New Yorker photographer Steve Pyke assembles a portrait gallery of contemporary thinkers. It embodies a conundrum or two of its own -- beginning with the title. In 1995, the British press Zelda Cheatle issued a collection of Pyke’s photographs that was also called Philosophers, which now fetches a high price from secondhand dealers. These are, it bears stressing, completely distinct books. All but one of the pictures in the new collection were taken over the past decade. Only two images from the earlier volume appear in the new one -- in the introductory pages, separate from the hundred portraits making up the main body of the book.
So we have, in other words, two volumes of the same kind, on the same subject, by the same author. They bear the same title. And yet they are not identical. A teachable moment in metaphysics? Yes, but one with practical implications for the used-book trade: a certain percentage of people trying to buy the older volume online will end up getting really, really irritated.
The book from Oxford is quite handsome. And its status as an aesthetic object is not a minor consideration. (For that matter, its aesthetics as a status object are also pretty demanding. It feels like you should get a nicer coffee table, just to have someplace to put it.) Without going so far as to say that Pyke represents philosophers as a subcategory of the beautiful people, he certainly renders them in beautiful black and white.
Ethnography forms no part of what he has in mind: his photographs do not show subjects going about their daily routines or occupying their usual niches. It’s difficult to think of Sartre without picturing him in certain settings -- bars, cafés, lecture halls, etc. Furthermore, these places aren’t just elements of his biography; they figure into his work (the waiter in Being and Nothingness is an obvious example). Pyke’s philosophers, by contrast, hang in the void. Usually they are set against a solid black backdrop. The one conspicuous exception is the portrait of Michael Friedman, with an unreadable chalkboard diagram behind him. Their heads loom like planets in the depths of space. The camera registers the texture of skin and hair, the expression on the lips and in the eyes. Scarcely anything else enters the frame -- an earring, perhaps, or the neck of a sweater. Most of the subjects look right into the camera, or just to the side.
With Pyke, the thinker becomes, simply, a face. The effect is intimate, but also strangely abstract. The place and date of each photo session are indicated, but the book provides no biographical information about the subjects. I recognized about a quarter of them off the top of my head -- Robert Brandom, David Chalmers, Patricia Churchland, Arthur Danto, Sydney Morgenbesser, and Richard Rorty, among others. A couple are even on TV from time to time; both Harry Frankfurt and Bernard-Henri Levy have been on "The Daily Show." It is puzzling that two or three pages could not be found to list a book or two by each figure, although most of the portraits are accompanied by very brief remarks by the subjects on the nature or motivation of their work.
“Philosophy is the way we have of reinventing ourselves,” says Sydney Morgenbesser. Ruth Millikan quotes Wilfrid Sellars from Science, Perception, and Reality: “The aim of philosophy, abstractly formulated, is to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term.” Fortunately not everyone is so gnomic. The comments by Jerry Fodor seem the funniest: “To the best of my recollection, I became a philosopher because my parents wanted me to become a lawyer. It seems to me, in retrospect, that there was much to be said for their suggestion. On the other hand, many philosophers are quite good company; the arguments they use are generally better than the ones that lawyers use; and we do get to go to as many faculty meetings as we like at no extra charge.”
The ambivalence in Sally Haslanger’s statement felt more than vaguely familiar: “Given the amount of suffering and injustice in the world, I flip-flop between thinking that doing philosophy is a complete luxury and that it is an absolute necessity. The idea that it is something in between strikes me as a dodge. So I do it in the hope that it is a contribution, and with the fear that I’m just being self-indulgent. I suppose these are the moral risks life is made of.” That sounds quite a bit like Sartre, actually.
In the interview prefacing the collection, Pyke says that his intention is to make philosophers “seem more human, less of a mystery.” And that is where the true conundrum lies. Some philosophers look dyspeptic, while others have goofy smiles, but that isn’t what makes them human -- let alone philosophers. Making something “more human” precludes rendering it “less of a mystery,” since the human capacity for thought is itself an ever-deepening mystery.
Pyke thinks visually. A more interesting commentary on the figures in his portrait gallery might come indirectly, from the late Gilbert Ryle. An Oxford don and the author of The Concept of Mind, he gave a lecture that tried to sort out the relationship between deep cogitation and various other sorts of mental activity. To that end, he focused on the question of what that naked guy in Rodin's sculpture was doing -- and how it presumably differed from, say, a professor preparing to teach a class.
“The teacher has already mastered what he wants his students to master,” said Ryle. “He can guide them because he is on his own ground. But le Penseur is on ground unexplored by himself, and perhaps unexplored by anyone. He cannot guide himself through this jungle. He has to find his way without guidance from anyone who already knows it, if anyone does know it…. The teacher is a sighted leader of the blind, where le Penseur is a blind leader of the blind -- if indeed the very idea of his being or having a leader fits at all.”
That seems like a good description of what the subjects of Pyke's photographs spend their time doing. Not, of course, while the camera is turned on them. To judge by the expressions of some, their thoughts may have been something closer to, "Wow, I'm being photographed by someone from The New Yorker. How did that happen?"