Philosophy

Essay on Christine Overall, "Why Have Children?"

Intellectual Affairs

“Not to be born is, beyond all estimation, best,” chants the chorus in Sophocles’s Oedipus at Colonus, “but when a man has seen the light of day, this is next best by far, that with utmost speed he should go back from where he came.” They make a quick inventory of life’s miseries, including pain, envy, war, and old age. Which seems like rubbing it in, considering Oedipus is an ex-king who, in the earlier play Oedipus the King, tore his own eyeballs out of their sockets.

In any case, the sentiment is hardly original. Consider another king, Midas, of golden-touch fame. He kidnaps Silenus, teacher and drinking companion of the god Dionysus, and demands that he reveal the most desirable thing in the world. Silenus resists answering for a while but finally blurts it out: "Never to have been born." It's not the voice of clinical depression speaking but a nugget of grim wisdom from antiquity. It's Western civilization's way of saying that your parents did you no great favor.

I don’t see much good in arguing the point, one way or the other. Cosmic pessimism is a sensibility, not a proposition. It's not even that dour, necessarily. Silenus doesn't kill himself; in the myths, he seems to be having a pretty good time. If anything, pessimists might find life easier to bear. They’re less likely to be disappointed.

In her new book Why Have Children? The Ethical Debate (MIT Press), Christine Overall, a professor of philosophy at Queen's University in Ontario, assesses the usual grounds for having kids or deciding against it. She scrutinizes them like an IRS accountant in the middle of a ruthless audit. Few claims survive her red pen. To summarize her findings with somewhat reckless brevity: Overall maintains that most of the motivations for having children are ethically dubious, while the decision not to have them tends to be less problematic.

“Deciding whether to procreate is a moral decision,” she writes, “…because it affects so many people -- not only the prospective parent(s), but also the prospective child, other family members, and other members of the community. Although one is certainly entitled to take into account the effects of having a child on oneself, if one decides only on the basis of a gamble about one’s future well-being, then one is refusing to treat procreation as a fully moral matter.” Having a baby to boost self-esteem, or save a marriage (does that ever work?), or simply because it's expected of you, grossly underestimates the seriousness of becoming responsible for someone's existence.

Conversely, even if a person's reasons for opting out of reproduction are specious or self-interested, that doesn’t make the decision itself bad. It has little impact on anybody besides the decision-maker, apart from the occasional unhappy would-be grandparent, perhaps.

She is particularly critical of arguments that there is some obligation to have children -- duty to nation or community, for instance, or obedience to a divine command to “be fruitful and multiply.” Her guiding concern is the moral right to autonomous decision-making about whether or not to reproduce. Otherwise, we have “the compulsory and unwilled use of people’s bodies for procreative purposes, whether they are other individuals’ or the state’s purposes.”

Here the phrase “people’s bodies” is a little more gender-neutral than strictly necessary. If presidential candidates or members of Congress tried to outlaw vasectomies, or made sperm-bank donations obligatory -- well, that would be bad, but it’s not something men tend to worry over. Given the extremely asymmetrical distribution of the burdens involved in procreation, the real issue is whether women can decide not to have children. The precondition for making an ethical decision about having children is that it actually be a choice.

Perhaps I’ve made the author sound like an echo of the chorus in Sophocles. She isn’t -- very much the contrary. Overall has two children, and the final pages of her book are a testament to the distinct satisfactions of raising them and seeing them grow into adults. She recognizes that antinatalism (the philosophical brand-name for arguments that coming into the world is a horrid misfortune) tends to be implicitly misogynistic. “The idea that it is better in every case never to have been [born],” she writes, “implies that women’s reproductive labor in pregnancy, birth, breastfeeding, and even rearing children contributes to the accumulation of net harm on this planet.”

For that matter, “the theory can be interpreted to mean that both contraception and abortion should be mandatory” -- hardly an attitude consistent with autonomous decision-making.

But antinatalism isn’t a real force in the world -- while the expectation that if you can have kids, you should, remains fairly strong. Overall’s book is a welcome antidote.

“Children are not essential to all good lives,” she writes, “nor are having and rearing children prerequisites to becoming a good person. Moreover, there are many childless persons who support, love, care for, and teach other people’s children. Chosen childlessness has as much potential for the good life as chosen parenthood has.”

There is more to this passage in a similar vein. It appears on page 219. I mention it because some readers might want to photocopy it to post on the refrigerator door when the family comes around.

Making the case for dissolving the American Philosophical Association


Is it time for the American Philosophical Association to be euthanized? A philosopher appointed to a committee to look into the organization’s future asks the provocative question.

Review of Emrys Westacott, "The Virtues of Our Vices"

Intellectual Affairs

In search of a rationale to avoid making any New Year’s resolutions, I was glad to see that Princeton University Press has issued a book called The Virtues of Our Vices: A Modest Defense of Gossip, Rudeness, and Other Bad Habits, by Emrys Westacott, a professor of philosophy at Alfred University.

It is not, alas, a handbook on self-improvement through creative rationalization. Two chapters started out as papers in the International Journal of Applied Philosophy, and Westacott’s project falls under a specialized heading within the humanistic division of labor: “microethics.”

The term was unfamiliar. A look at The Oxford Companion to Philosophy and similar reference works was no help. Searching the Library of Congress catalog turned up no book or journal titles mentioning microethics, nor was it a keyword in any subject heading. Missing from the LoC’s holdings, but locatable online, is Paul Komesaroff’s Experiments in Love and Death: Medicine, Postmodernism, Microethics and the Body (Melbourne University Press, 2008). Komesaroff, a physician and professor of medicine at Monash University in Australia, contrasts microethical decision-making with the more general level of bioethical argument.

The issues that bioethicists discuss (cloning, euthanasia, animal rights, etc.) are matters of public debate, while microethical questions arise in a clinical setting -- often enough while doctor and patient are face-to-face, with a piece of bad news between them. “Microethics,” writes Komesaroff, “is in general not the terrain of arresting cases involving heroic decisions or extraordinary circumstances…. Indeed, this may be one reason for the relative lack of attention it has attracted. Rather, it is the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.”

He gives as an example the obligations of a physician when conveying unwelcome results from a biopsy. Here, the microethical question is not whether to be honest. That is a given. But the moment of truth will reverberate for the patient throughout whatever may be left of his or her life. The duty to render a prognosis is complicated by the possibility of creating false hope or absolute despair. The dilemma is both fine-grained and profoundly consequential.

By contrast, the microethical issues that interest Westacott seem like decidedly smaller beans. The subtitle of The Virtues of Our Vices mentions gossip and rudeness. In addition, there are chapters on snobbery and offensive jokes, as well as an investigation of the balancing act involved in respecting the opinions of others. On the one hand, people have a right to their beliefs. On the other hand, it is an inescapable reality that sometimes those beliefs are uninformed, irrational, or downright insane.

None of these issues are a matter of life or death, as such, though I suppose they could be, if you offended the wrong person. But they all fit Komesaroff’s definition of the microethical domain as “the field of day-to-day communication and structured, complex interactions, of subtle gestures and fine nuances of language.” They are problems that flash up in the course of routine social interaction, with ambiguities that can make things even more complicated. Deciding whether a given action or remark was rude or snobbish is not always easy -- even for the person responsible for it.

Questions about right and wrong concerning everyday interaction “take up the bulk of whatever time most of us spend in moral reflection and decision making,” writes Westacott. “[O]ur everyday thinking and conduct regarding commonplace matters are the most important indicators, both to ourselves and to others, of our true moral values and character. Certainly, they count for more than purely hypothetical scenarios in which we imagine how we would handle terrible dilemmas involving lifeboats, terrorists, deathbed promises, or runaway trains.”

Quite true: It’s hard to remember the last time I had to decide if it would be okay to torture a prisoner to extract information about a ticking time bomb. The microethical questions of everyday life tend to be less stark, though not necessarily simpler. The very familiarity of an experience such as rudeness means that we ordinarily do without formal definitions of what counts as rude and what doesn’t. It is the microethicist's task to specify what is otherwise left implicit in such terms. That can take some doing, as shown by Westacott's labor to synthesize a precise definition of snobbery. It takes him six attempts. (The final product: "believing without sufficient justification that you are superior to another person in certain respects because you belong to or are associated with some group that you think places you above them in a social hierarchy.")

Plenty of gray areas remain even after the terms have been clarified. It's possible to generate decision trees for judging whether a piece of gossip is damaging and unethical, or whether a given violation of social norms will be assessed as rude. And so Westacott does -- seeming to sharpen up the distinctions between good and bad microethical decisions. But at the same time, the author reckons the possible benefits of various vices, as well as their costs. Gossip, for example, is typically criticized as evidence of "shallow living," writes Westacott, "something we are continually discovering new ways to achieve." But that is one-sided. "Since one of the benefits gossip can bring is a deeper understanding of human nature and social institutions ... it is more plausible to think that a willingness to talk about people -- which at times will involve gossiping -- may be an integral part of 'the examined life.' This is why we find Socrates, in Platonic dialogues like the Meno and the Gorgias, freely discussing the failings of others in the course of his philosophical inquiries."

Not to push the comparison too hard, but in Westacott's microethical analyses, as with Socratic badinage, it's the process of inquiry, as much as the result, that engages the reader's interest. His tree-chart algorithms probably won't be that useful to anyone having to make a decision. But they reveal some of the implicit choices that we often make very quickly when dealing with other people. The unexamined life may not be worth living, but it is, after all, where we spend most of our time. The Virtues of Our Vices shines a little light in that direction.

A Little History of Philosophy

Six years ago, Yale University Press published A Little History of the World by E.H. Gombrich, which appeared to much acclaim and has by now sold 500,000 copies -- impressive for a trade publisher, and epochal for a university press. The great art historian had written it, his first book, in Austria during the Depression, mainly to pay the bills. It enjoyed some popularity in Europe over the years, though nothing like the success of his classic The Story of Art (1950). While “ostensibly written for teenagers,” says the entry on Gombrich in The Dictionary of Art Historians, it had “a huge impact on the general post-war populace.” According to an article in ArtNews, it had by 2006 sold more than 8 million copies in at least 30 languages. The Story of Art is one of the rare examples of a textbook that not only outlives its author but proves genuinely beloved by readers. “I never believed that books for young people should differ from books for adults,” Gombrich wrote in its preface, “except for the fact that they must reckon with the most exacting class of critics, critics who are quick to detect and resent any trace of jargon or bogus sentiment.”

A Little History of the World is, if anything, an even more deft feat of popularization, since its target audience is about 10 years old. Exact data are not at hand, but quite a few of the half-million copies it's sold so far were almost certainly purchased for adult consumption. And no shame in that. Better to know A Little History than to know none at all. At least Gombrich respects his public enough not to call them dummies.

Later this month, Yale is bringing out A Little History of Philosophy by Nigel Warburton -- published in a format that mimics the earlier volume in every particular, from cover design and length to the woodblock-like artwork appearing at the head of each chapter.

The word for this sort of thing is “branding.” My initial response to it was something less than joyous. In 2005, I reviewed Gombrich’s book for a newspaper, and put down additional thoughts on it for this column; and a few people have indicated they were encouraged to look for the book on the basis of my ardent tub-thumping on its behalf, which was as heartfelt as it could possibly be. But that was based on admiration for the singular generosity of Gombrich’s style. (The author was translating and revising the book himself when he died in 2001, and it reflects decades of finesse in his adopted language.) The odds of lightning striking twice did not seem good.

The dust jacket says that the author, Nigel Warburton, lectures on philosophy at The Open University and the Tate Modern, both in England, and “hosts a weekly podcast and an integrated philosophy website.” Writing for The Guardian, Julian Baggini, editor of The Philosopher’s Magazine, says that Warburton “has quietly become quite one of the most-read popular philosophers of our time. Over nearly two decades his Philosophy: The Basics has sold in excess of 100,000 copies, with no gimmicks, no literary flourishes, just admirable clarity, concision and accuracy.”

That said, I must admit the name rang no bells. My initial, spontaneous response to A Little History of Philosophy was simply that the author faced an impossible task. And in Gombrich, he also had an impossible act to follow.

And yet, the book is pretty good. Warburton has many of the Gombrichian virtues. While reading A Little History of Philosophy, I jotted down notes in an effort to characterize it -- only to realize that they were, point for point, things I'd said about A Little History of the World, six years ago. For example: “Concise but substantial, without condescension, somewhat limited in its purview (focus is on the West) but written with just enough wryness to be charming.”

That about covers it. Each book surveys a vast array of topics while presupposing as little as possible about the background of the audience. That would be no small trick even with long chapters. As it is, each chapter is roughly six pages long. Not 60, but six. Aristotle gets six pages. Hegel gets six pages. Karl Popper and Thomas Kuhn are roommates in a single chapter, which runs to the exceptional length of eight pages. Immanuel Kant, clearly the guest of honor, is permitted two chapters adding up to a total of 11 pages. All of French existentialism is covered by having Sartre, Beauvoir, and Camus set up a philosophical ménage à trois in the usual six pages. (Not that Warburton puts it that way, as I should make clear to children's librarians everywhere.)

Machiavelli, Darwin, Freud, and Alan Turing are all covered, although none of them was a philosopher, exactly. At the same time, Martin Heidegger makes only a very brief appearance, in a chapter on his one-time girlfriend Hannah Arendt. This would undoubtedly have bothered both of them.

Warburton's survey covers strictly European and (to a lesser extent) American philosophy. A handful of thinkers from elsewhere do turn up, but very much in passing. The Buddha gets a nod in the chapter on Schopenhauer, for example. Jewish and Arabic philosophy flourished during centuries when Christendom was anything but reflective. But the only trace of them here is the names (and only the names) of Maimonides and Avicenna.

The selection, then, is debatable, and the task itself almost unimaginable (at least by the standards of academe, where it is permissible to write a 500-page monograph containing the phrase “space does not permit me to consider….” in each chapter); but the book has a certain quality that comes from accepting a challenge under severe conditions, then taking it on without making a big deal of the whole thing. And the word for that quality is grace.

It requires more than a knack for brevity. The question of what role biography ought to play in writing the history of philosophy is not a simple one. Heidegger’s treatment of the life and times of Aristotle at the start of a lecture (“He was born. He thought. He died.”) is legendary, but not, perhaps, the final word on the matter. At the same time, reducing complex ideas to personal or social factors – as with sensationalistic treatments of Heidegger himself – is no real service to anyone trying to get some bearings on the history of philosophy.

A Little History untangles that Gordian knot in tried-and-true fashion, because saying anything in six pages means cutting through things without hesitation. To stick with the example of Aristotle, this means discussing a single text (in this case, the Nicomachean Ethics) and just enough context to connect him with the previous chapter (on Socrates and Plato) while setting up the next (on Pyrrho, the extreme skeptic, whose work stands in nice contrast to the authoritarian dogmatism around Aristotle in later centuries).

Warburton zeroes in on the concept of eudaimonia -- meaning “happiness” or, better, “flourishing” -- and explains that it is “pronounced ‘you-die-monia’ but mean[s] the opposite” (a fitting and even helpful play on words). He sketches the psychological, moral, and social implications of eudaimonia.

And that’s that -- time to move along. Of course, it means reducing the Peripatetic’s thought to the size of a postage stamp. But no better approach seems obvious, given the circumstances. (It's not as if covering the logical or metaphysical writings in six pages is an option.) The focus on eudaimonia also helps to set up the later chapter on Kant’s very different understanding of ethics. When exhaustiveness is not an option, efficiency counts for something.

Yale University Press has more Little History titles on the way. (So I am told by a publicist, who kept their titles close to the vest.) The public should hold them to the standard set by the first two volumes. In the preface to The Story of Art, Gombrich spelled out the duties and the benefits of this kind of work: “It may serve to show newcomers the lay of the land without confusing them with details; to enable them to bring some intelligible order into the wealth of names, periods, and styles which crowd the pages of more ambitious works, and so equip them for consulting more specialized books.” A creditable ambition, and a demanding one. The only easy thing about it is how it looks.

Scott McLemee (scott.mclemee@insidehighered.com)

'The Forum and the Tower'

“It is evident,” declared Aristotle, expecting no argument, that “two ways of life are the ones intentionally chosen by those human beings who are most ambitious with a view to virtue, both in former times and the present; the two I mean are the political and the philosophic.”

The strange word here is “virtue,” which carries a lot of baggage for the modern reader. Anyone too preoccupied with virtue is, by contemporary standards, presumably guilty of something until proven otherwise, and maybe not even then.

So it bears keeping in mind that, in Aristotle’s usage, “virtue” is almost a piece of technical jargon. It refers to a form of excellence that, as you pursue it, leads toward profound happiness and a richer life – a condition of human flourishing.

Being “ambitious with a view to virtue,” then, is not as grim as it may sound. Likewise, we have to shed a little cynicism in order to understand why Aristotle would single out politics and philosophy as ideal venues for pursuing that ambition. He understood them, not as professions, let alone as rackets, but rather as activities manifesting and enhancing our nature as social and rational animals.

At the same time, politics and philosophy pull in different directions -- one toward civic engagement, the other into deep and prolonged reflection. Aristotle was all about finding a happy medium, but in the final analysis he thought that intellectual contemplation was the highest form of virtue/excellence. (This is hardly surprising. He was a philosopher, after all.)

Mary Ann Glendon’s The Forum and the Tower: How Scholars and Politicians Have Imagined the World, from Plato to Eleanor Roosevelt, published by Oxford University Press, is a meditation on this theme from Aristotle by someone who has served as both an academic and a diplomat. (Glendon, a professor at Harvard Law School, was U.S. ambassador to the Vatican during George W. Bush’s second term.)

The book consists of a series of biographical essays on figures who moved between scholarship and statecraft, or at least desired to bring them together. “History provides few examples of prominent political actors who, like Cicero or Edmund Burke, are remembered for important contributions to political thought as well as for distinguished public service,” Glendon writes. “As for political theorists who have ventured into politics, some of the most eminent – Plato, Tocqueville, and Weber, for example – were strikingly ineffective in the public arena.” She devotes a chapter to each of these figures, plus a few others, drawing as much on their memoirs and private papers as their books or speeches. In style and spirit, The Forum and the Tower is much closer to a book like Will Durant's The Story of Philosophy (1926) than to a monograph.

The author describes the essays as “loosely linked,” which seems a fair description. Unlike Aristotle -- who is forever generating categorical distinctions, weighing alternatives, and lining things up neatly -- Glendon is not particularly driven to analysis. Her examples from history converge on a simple point: the politician and the serious thinker embody distinct capacities, seldom found together in a single person. That principle was already recognized in ancient Athens. And Max Weber had pretty much the last word on the subject in two lectures, “Science as a Vocation” (1917) and “Politics as a Vocation” (1919). Glendon's discussion of Weber, near the end of the book, epitomizes her concern with the difficulty of bridging the distance between “the forum” (where political decisions are made) and “the tower” (as in, ivory).

In particular, Weber’s thoughts on politics as an “ethics of responsibility” seem framed as a warning. The political actor “has to be able to deal with the world as it is,” she writes, “taking human frailty into account and even using it for his purposes. He must be able to bear the irrationality of the world in which evil sometimes comes from good and good sometimes comes from evil. He has to understand that the attainment of good ends may even require using morally dubious or at least dangerous means, and that if one chases after the ultimate good, then the good he seeks may be damaged or discredited for generations…. What is decisive, said Weber, ‘is the trained relentlessness in viewing the realities of life and the ability to face such realities and to measure up to them inwardly.’”

She seems to be saying in response to Aristotle that no matter how highly you rate contemplation, the political leader's task requires the rarest virtue.

A few words about the politics of the author, and of the book itself, seem in order. In 2009, Glendon declined the Laetare Medal from the University of Notre Dame when she learned that the institution would be granting an honorary degree to President Obama at the same ceremony. I read The Forum and the Tower without knowing this, though with hindsight it is illuminating.

Glendon names two exceptional cases of leaders who also produced lasting works of scholarship, Cicero and Edmund Burke. Both, as it happens, were conservatives. She identifies Henry Kissinger as another “statesman-scholar,” which is certainly one thing you can call him, if not the one I find springing to mind. The citations from secondary literature are infrequent and tend to come from figures such as Harvey Mansfield, Thomas Pangle, Conor Cruise O’Brien, and Paul Johnson – all of them reliably conservative.

Eleanor Roosevelt appears in the subtitle of the book, rather anomalously. She makes a brief appearance in the final chapter, which is devoted to the Lebanese philosopher Charles Malik’s role in the United Nations committee that drew up the Universal Declaration of Human Rights. The former First Lady chaired the committee; its work is the subject of an earlier book by Glendon. But the whole chapter feels a bit tacked on -- almost an effort to impose some balance at the last possible moment. In a way the book really ends with Max Weber’s brooding thoughts on the good and evil that men do.

But that leaves me wishing that Glendon had ventured beyond popularized history and rather broad points about the gap between statecraft and the life of the mind. It would be a better book for addressing her own experience in shuttling between forum and tower -- and for posing questions about the relationship between conservative thought and action. "If one chases after the ultimate good, then the good he seeks may be damaged or discredited for generations" would serve as a critique of various right-wing luminaries, but it's never clear whether or not Glendon means it as one.

Scott McLemee (scott.mclemee@insidehighered.com)

Faulty Forecast?


New analysis of "climate" for women in graduate philosophy programs -- conducted without input of actual grad students -- has infuriated many.

End Large Conferences

I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that humanities megaconferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty; and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling like Ponce de León, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters had taken advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math -- $600 for four nights’ lodging, $200 in fees, $500 in airfare, $80 for Internet access, $50 in taxis, plus meals at captive-audience prices -- and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment a conference lands in a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. There aren’t many junior colleagues in attendance at all, unless they have to be there: they're busy getting material into publication, and they can meet presentation expectations at cheaper regional meetings or save their dough and go to prestigious(-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days when there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder to grad students and search committees alike.

As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much better than the lottery's. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the association's savings to lobby for more tenure-track faculty.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and you head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give them the cachet they deserve. The big conference is like a one-size-fits-all T-shirt: it simply doesn’t fit most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)

Rob Weir (info@insidehighered.com)

Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

Inside a Search

When a philosophy department receives more than 600 applications for a tenure-track opening, how does it make a decision? Lou Marinoff describes the process.
