In recent years we’ve had quite a few books on the negative emotions – disgust, malice, humiliation, shame – from scholars in the humanities. In addition, Oxford University Press published its series of little books on the Seven Deadly Sins. Apparently envy is the most interesting vice, to judge by the sales ranks on Amazon, followed by anger – with lust straggling in third place. (A poor showing, given its considerable claims on human attention.)
The audience for monographs putting unpleasant or painful feelings into cultural and historical context probably doesn’t overlap very much with the far larger pop-psychology readership. But their interests do converge on at least one point. Negative affects do have some benefits, but most of us try to avoid them, or minimize them, both in ourselves and others, and to disguise them when necessary; or, failing that, to do damage control. And because the urge to limit them is so strong, so is the need to comprehend where the feelings come from and how they operate.
Arguably the poets, historians, and philosophers have produced richer understandings of negative emotions, in all their messiness. As for what the likes of Dr. Phil bring to the table, I have no opinion – though obviously they’re the ones leaving it with the biggest bags of money.
But the avoidance / interest dynamic really goes AWOL with the topic Chris Walsh explores in Cowardice: A Brief History (Princeton University Press). The Library of Congress catalog has a subject heading called “Cowardice — history,” with Walsh’s book being the sole entry. That’s a clerical error: Marquette University Press published Lesley J. Gordon’s “I Never Was a Coward”: Questions of Bravery in a Civil War Regiment in 2005. It is 43 pages long, making Walsh the preeminent scholar in the field by a sizable margin. (He is also associate director of the College of Arts and Sciences Writing Program at Boston University.)
“[P]ondering cowardice,” he writes, “illuminates (from underneath, as it were) our moral world. What we think about cowardice reveals a great deal about our conceptions of human nature and responsibility, about what we think an individual person can and should have to endure, and how much one owes to others, to community and cause.”
But apart from a typically thought-provoking paper by William Ian Miller a few years ago, cowardice has gone largely unpondered. Plato brought it up while en route to discussing courage. Aristotle stressed the symmetry between cowardice (too much fear, too little confidence) and rashness (too much confidence, too little fear) and went on to observe that rash men tended to be cowards hiding behind bluster.
That insight has survived the test of time, though it’s one of the few analyses of cowardice that Walsh can draw on. But in the historical and literary record, cowardice is always much more concrete. (In that regard it’s worth noting that the LOC catalog lists 44 novels about cowardice, as against just two nonfiction works.)
Until sometime in the 19th century, cowardice seems to have been equated simply and directly with fear. It was the immoral and unmanly lack of yearning for the chance at slaughter and glory. The author refers to the American Civil War as a possible turning point, or at least the beginning of a change, in the United States. By the Second World War, the U.S. Army gave new soldiers a pamphlet stating, up front, YOU’LL BE SCARED and even acknowledging their anxiety that they might prove cowards once in battle.
Courage was not an absence of fear but the ability to act in spite of it. This represented a significant change in attitude, and it had the advantage of being sane. But it did not get around a fundamental issue that Walsh shows coming up repeatedly, and one well-depicted in James Jones’s novel The Thin Red Line:
“[S]omewhere in the back of each soldier’s mind, like a fingernail picking uncontrollably at a scabby sore, was the small voice saying: but is it worth it? Is it really worth it to die, to be dead, just to prove to everybody you’re not a coward?”
The answer given by the narrator of Louis-Ferdinand Céline’s Journey to the End of the Night, looking back on the First World War (“I wasn’t very bright myself, but at least I had sense enough to opt for cowardice once and for all”), sounds a lot like Mark Twain’s considered opinion on the matter: “The human race is a race of cowards, and I am not only marching in that procession but carrying a banner.”
Both were satirists, but there may be more to the convergence of sentiment than that. In the late 19th and early 20th centuries, war became mechanized and total, with poison gas and machine guns (just a taste of improvements to come) and whole populations mobilized by propaganda and thrown onto the battlefield. The moral defect of the coward was sometimes less than obvious, especially with some hindsight.
In Twain’s case, the remark about fundamental human cowardice wasn’t an excuse for his own military record, which was not glorious. (He numbered himself among the thousands who "entered the war, got just a taste of it, and then stepped out again permanently.") Walsh provides a crucial bit of context by quoting Twain’s comment that “man’s commonest weakness, his aversion to being unpleasantly conspicuous, pointed at, shunned” is better understood as moral cowardice, “the supreme make-up of 9,999 men in the 10,000.”
I’ve indicated a few of Walsh’s themes here, and neglected a few. (The yellow cover, for example, being a reminder of his pages on the link between cowardice and that color.) Someone might well write an essay about how overwhelmingly androcentric the discussion tends to be, except insofar as a male labeled as a coward is called womanly. This is strange. When the time comes for battle, a man can try to flee, but I’ve never heard of anyone escaping childbirth that way. And the relationship between moral cowardice (or courage) and the military sort seems complex enough for another book.
Georgetown University recently announced plans for an English Ph.D. tailored to non-university careers, reflecting ongoing deliberations within the Modern Language Association over what to do about the anemic employment market.
In their important, humane contribution to the same conversation, “No More Plan B,” the American Historical Association’s Anthony Grafton and James Grossman argue that, at a time when the employment market for history Ph.D.s is dismal, historians with Ph.D.s have high-level skills that should be recognized by employers. Some evidence suggests, not surprisingly, that Ph.D.s in the humanities are already thriving in the private sector.
These conversations reflect the efforts of concerned academic leaders to find ways to deal with the human cost of declining faculty positions in the humanities (and, one might add, in the natural sciences).
These proposals are controversial because, to their detractors, they turn graduate education in the humanities into job training. At a time when the liberal arts are threatened, and when many policymakers are emphasizing narrowly vocational goals over a broad general education, this is not an unreasonable concern.
Graduate education in the humanities cannot be defended on the grounds that it prepares people for a job. That’s not what brings students to graduate school. Students enter graduate school because they love their subjects. They have had good teachers who have inspired them to see the world in new ways. They have learned to ask the kinds of questions that only the humanities can answer. They have been converted.
We therefore cannot treat the humanities Ph.D. as a high-end professional credential — an alternative to the M.B.A. When we do so we corrupt what graduate study in humanities is for. Unlike the undergraduate major, which is intended as broad preparation for life, the graduate degree is designed for those who wish to engage in deep study in order to enter professional work in the humanities.
Instead, I propose we think of graduate education in the humanities as closer to ministerial education. We must prepare students not just with the knowledge required to understand their field, but with the skills necessary to carry out their ministry in the different places to which they might be called. By imagining ministers instead of M.B.A.s, we might be able to find a language that makes it possible to reform graduate education without giving in to vocationalism.
Addressing Supply and Demand
Before reforming graduate education, however, we must not forget the primary issue faced by the humanities: the structural problems that plague the university.
On the demand side, we must expand the number of tenure-line positions in the humanities across the nation and resist the deprofessionalization of teachers and professors.
On the supply side, institutions that prepare graduate students must recognize that, too often, graduate students are valued for their cheap teaching labor. This is not to suggest that individual faculty members do not invest their hearts and souls in mentoring graduate students, but instead that universities have underinvested in tenure-line faculty. As Marc Bousquet pointed out, in some ways graduate students are the waste products of the system, their value to the university used up when they receive their degree.
Focusing on structural solutions would help those called to the humanities find university positions. If the jobs are not there, however, the answer may not be to continue to overproduce Ph.D.s and market them to private employers, but to curtail production. Unlike the undergraduate humanities major, which is part of a general liberal arts education and needs no vocational justification, the graduate program is designed to lead students to meaningful employment.
Humanities as a Calling
Students come to graduate school because of their passion for the humanities. We must respect what brings them to us. We must refuse to see them as budding entrepreneurs; they are ministers committed to spreading the gospel of the humanities. We must prepare them for the ministry they came to undertake, whether in schools and universities, in government, or in other organizations.
For most humanities Ph.D.s, the primary work will be teaching. Humanities Ph.D.s teach at the secondary and college levels, but humanities programs have been relatively disengaged from the task of preparing teachers. We have allowed teacher preparation to take place almost entirely within education schools, but there are many reasons why liberal arts programs should be more involved in preparing teachers.
Moreover, the cost of the split between secondary teachers and professors has been significant. In the history profession, as the AHA’s Robert B. Townsend makes clear in his book History’s Babel, the division between professors and other historians has devalued the daily ministry of most historians, led to an overemphasis on scholarship, and denied secondary school teachers opportunities to engage in the life of the discipline.
Even if most humanities graduates’ primary task will be teaching, we should not denigrate scholarship. Too many policy makers and commentators have suggested that humanities research does not matter. It matters greatly, both in the public sphere and in the classroom. To sustain scholarly inquiry, we need scholars around the country and world engaged in research and capable of critically assessing each other’s work. We need to ensure that humanities graduates at all levels — in K-12 schools, museums, local societies, media, universities, and government — have the space and time to engage in scholarship and be part of the conversation.
Reforming Graduate Education
If it is deemed necessary to reform graduate education, we must always keep in mind that we are preparing humanities ministers. To keep this first and foremost opens up alternative ways to reimagine graduate education.
We might, in addition to or instead of the Ph.D., offer a doctorate of humanities (like the J.D. or M.D.), a four-year program that would offer a solid academic education, require a significant work of scholarship in the form of a publication-worthy thesis, and also provide practical skills to help young humanists enter the humanities fields at various levels in different kinds of organizations. The doctorate of humanities could be interdisciplinary or field-specific, as different institutions, programs, and the needs of scholarship deem appropriate.
To get a sense of what this would look like, we need only examine the curriculum for the M.Div. at Princeton Theological Seminary, in New Jersey. The degree “is designed to prepare students for the diverse ministries of congregational leadership, for further graduate study in theology and related disciplines, for various types of chaplaincy, for mission work at home and abroad, and for other forms of church vocation. The curriculum is planned to provide the flexibility and independence consonant with a broad theological foundation.”
Students are expected to take coursework in Biblical studies, history, and theology. But academic work is insufficient. There is also a “practical theology” component to help ministerial candidates learn how to preach, educate, and perform pastoral care. Finally, the program requires “field education” under practicing ministers. At Princeton Theological Seminary, without reducing or diminishing academic preparation, candidates are taught to use their academic knowledge to carry out the very important work that they will undertake as ministers.
A similar combination of academic and practical education could prepare graduate students better for their jobs as teachers, but also for work in the public, nonprofit, or private sectors. Such a degree would be more portable, and as a result, it would also reduce the human and financial cost for those who cannot find professional humanities work and move on to other careers.
There is no reason to believe that this will reduce the quality of humanities scholarship. A four-year doctoral degree with a serious research component should prepare graduates for research as well as other kinds of work. After all, most ministers do not need Ph.D.s, nor do most lawyers or MDs. They need an education that enables them to undertake their daily work with thoughtfulness, the skills to make them effective at it, and the ability to engage in scholarship.
In many ways, that seems like what the proposed Georgetown English Ph.D. seeks to do. It would create a four-year program for students who already have an MA, provide a strong academic foundation, require a significant work of scholarship, and also provide field experience in an organization that promotes humanistic endeavors.
In conclusion, we need to continue to move forward on two fronts. The crisis of doctoral education is, to a large extent, a crisis of the university. We must continue to emphasize the need for more tenure-track hiring in the liberal arts. Nonetheless, there is a good case to be made that graduate education in the humanities could be more expansive, not because we need to bow down to the anti-intellectual forces reshaping higher education, but because we can better prepare graduates for the diverse ministries that they could serve.
Johann Neem is professor of history at Western Washington University and a visiting faculty fellow at the Institute for Advanced Studies in Culture at the University of Virginia.
This morning, after a poor night’s sleep punctuated by weird pregnancy nightmares and hourly wakings due to the discomforts of being newly behemoth, I lumbered over to my “office” (aka the other side of my apartment), and, loins girded, prepared to see what the internet held. As a freelancer with many different gigs, it’s not uncommon to have to “put out fires” first thing in the a.m., as they say, but this morning, all three rings in the circus of my life conflagrated at once. A potential dissertation-coaching client was unhappy with my original free consult, and I wanted to give my boss a rundown of what went wrong. Then, an urgent email from Germany — some editing work I was supposed to turn in yesterday wasn’t in! Ach, nein! Quick, turn on the German brain, apologize and send in the work, schnell.
Meanwhile, in my most public job, as an education columnist for Slate, I’m dealing with the fallout of my latest piece, which calls into question why the University of California System — which tells its students, faculty and staff it is one giant budget crater — feels the need to give its three “poorest” chancellors $60,000 raises. And, while I prepare to cook up my next column (maybe American academics teaching abroad? I have a friend in Kazakhstan; maybe she’ll let me interview her!), I’m making my editor’s extensive changes on my forthcoming one — with, of course, a turnaround of a few hours at most, as per the conventions of short-form Internet journalism.
All this happened today before I had a chance to eat or pee.
In the midst of what was already a wackadoodle morning, about 70 of my best “friends” linked me to a new op-ed here at Inside Higher Ed, by Cornell writing lecturer Charles Green. The 3,000-word magnum opus betrays a fairly impressive (obsessive?) attention to both my personal blog and certain contributions to my Slate work, and calls into question the research bona fides of my non-academic journalism. Green excoriates the cursory sample sizes and openly informal methodology of two of my recent columns — op-eds both, meaning that it is clear to most readers that what I am writing is indeed, in the words of the great rhetorician Jeffrey Lebowski, just, like, my opinion, man.
He even goes so far as to perform what appears to be a rhetorical exegesis of “Revise and Resubmit,” a roast of the humanities peer-review process done in my usual style, which is a mixture of dark humor, open hyperbole, and cutting truth — and which quotes, yes, a small sample of hilarious tweets about peer-review experiences from my readers, which I culled for their sharpness from a much larger “data set” of about 100.
But yes, the piece exaggerated. Every op-ed I write does. Every sentence I say at home does! My voice has, for better or worse, basically been what it is since my first turn as a columnist at the age of 17 (I appeared bi-weekly in Eugene, Oregon’s paper of record from 1993 to 1994 — kind of a big deal, I know). But it was sharpened in graduate school in a particular vein, as I fell in love with the crotchety Austrians who would come to define my research: Robert Musil, whose over-the-top satire of a bunch of rich drifters also lays bare harsh truths about the decline and fall of the Austro-Hungarian dual monarchy; the playwright Johann Nepomuk Nestroy, whose untranslatable humor involves saying something that is a massive exaggeration and an unfortunate truth at the same time; Karl Kraus, the patron saint of pithy bile and my personal hero.
Is Green correct that my 1,500-word op-eds (the appropriate length for such a medium, ahem) are not researched with the same rigor as my academic book, which took seven years to write, and for which I am receiving the standard advance of zero dollars? He is. If the 80 or so columns I’ve written for Slate in the past year were submitted to academic journals, they would all be rejected out of hand for their style, tone, and, yes, lack of scientifically perfect data sets.
But, speaking of “limited sample size” (which itself masks Green’s real critique, which is that my experience in academe is different from his, and thus incorrect), I’d like to point out that Green has done to me precisely what he claims I do to the unfairly maligned idyll that is the life of the mind (which, unlike me, he has never left, which would explain his unfamiliarity with the conventions of my medium).
In addition to his huff about “Revise and Resubmit,” he also takes issue with “Syllabus Tyrannus,” in which I trace the corporatization of the American university via the encroachment of administrative boilerplate onto once-brief college syllabuses. His main problem seems to be that the editor who wrote my subhead was also a fan of numerical exaggeration. Guilty, I suppose.
And the third and final piece he mentions is “Bring on the Sledgehammer,” in which I simply executed what we jokingly call a #Slatepitch — I took a contrarian stance to a current issue (President Obama’s college-rankings plan), and tried to argue it to the best of my ability. In the wake of that article, I was the first to admit the imperfection of my arguments, and it resulted in numerous productive conversations with readers.
Anyway, what Green conveniently neglects to mention, even with 3,000 words, is the vast majority of my work for Slate, most of which is far more akin to traditional reporting, and much of which has nothing to do with higher education at all (I’m thinking here, of course, of my vaunted German grocery store canon).
Among recent pieces are the following — none of which remotely fit Green’s characterizations:
“The Birth of the #FergusonSyllabus,” which describes the ways in which educators in my hometown of St. Louis and beyond are teaching about systemic racism and police violence;
“Don’t Extinguish the Fulbright,” which was part of a national media push that actually helped save the Fulbright program from a disastrous set of cuts;
“Nasty and Brutish,” which helped break the CU-Boulder philosophy sexual harassment scandal nationally (and brought about my first-ever hate mail from a Neo-Nazi – suck it, “Abraxas88,” whoever you are!).
I am used to people disagreeing with me, often vehemently and directly to my face. I am used to getting kicked around (also, now, from the inside — thanks, kiddo!). I understand that many academics long to write for a larger public audience, and resent the fact that I get to do so, because my experience is not indicative of theirs.
Look, I am as aghast at the modest success of my fourth-act career as anyone else. But here is why I get to do what I do: Readers can sense hedging, equivocation and cowardice from 10 miles away, and they don’t like it. At the same time, those who wish to succeed in academe must compromise what they say in public (the recent Salaita affair is but the most extreme example of the kind of systemic restraint that academia demands). As a result, a lot of “public” writing by academics is self-censored, over-equivocated, bogged down in data analysis, and thus unreadably boring to a non-academic audience. But since I am no longer beholden to some imaginary search or tenure committee, I get to hold nothing back — and that is why I get to be at Slate. If you want anyone to read your op-eds on a mainstream platform, you must take a firm, blunt stance — one that might have to oversimplify a few things for brevity, and one that will bring its share of both support and vitriol.
I guess Charles Green finally hit upon a stance — and oversimplification, though not brevity — that can bring in readers, too. Too bad it came in the form of an ad hominem attack on a person who never did anything to hurt him, and whose body of work is more complex — and, simultaneously, more banal — than he gives it credit for.
Rebecca Schuman is the education columnist for Slate.
Writing in 1860, a journalist depicted Washington as a miserable little Podunk on the Potomac, quite unworthy of its status as the nation’s capital. He called it an “out of the way, one-horse town, whose population consists of office-holders, lobby buzzards, landlords, loafers, blacklegs, hackmen, and cyprians – all subsisting on public plunder.”
"Hackmen" meant horse-powered cabbies. "Blacklegs" were crooked gamblers. And cyprians (lower-case) were prostitutes – a classical allusion turned slur, since Cyprus was a legendary birthplace of Aphrodite. Out-of-towners presumably asked hackmen where to find blacklegs and cyprians.
But sordid entertainment was really the least of D.C. vices. “The paramount, overshadowing occupation of the residents,” the newsman continued, having just gotten warmed up, “is office-holding and lobbying, and the prize of life is a grab at the contents of Uncle Sam’s till. The public-plunder interest swallows up all others, and makes the city a great festering, unbearable sore on the body politic. No healthy public opinion can reach down here to purify the moral atmosphere of Washington.”
Plus ça change! To be fair, the place has grown more metropolitan and now generates at least some revenue from tourism (plundering the public by other means). Zephyr Teachout quotes this description in Corruption in America: From Benjamin Franklin’s Snuff Box to Citizens United (Harvard University Press), a book that merits the large readership it may get thanks to the author’s recent appearance on "The Daily Show," even if much of that interview concerned her remarkable dark-horse gubernatorial campaign in New York state's Democratic primary, in which anti-corruption was one of her major themes. (Teachout is associate professor of law at Fordham University.)
The indignant commentator of 1860 could include lobbyists in the list of ne’er-do-wells and assume readers would share his disapproval. “Lobby buzzards” were about as respectable as card sharks and hookers. You can still draw cheers for denouncing their influence, of course, but Teachout suggests that something much deeper than cynicism was involved in the complaint. It had a moral logic – one implying a very different set of standards and expectations than prevails now, to judge by recent Supreme Court rulings.
Teachout’s narrative spans the history of the United States from its beginnings through Chief Justice John Roberts’s decision in McCutcheon v. FEC, less than six months ago. One of the books that gripped the country’s early leaders was Edward Gibbon’s Decline and Fall of the Roman Empire, the first volume of which happened to come out in 1776, and Teachout regards the spirit they shared with Gibbon as something like the crucial genetic material in the early republic’s ideological DNA.
To be clear, she doesn’t argue that Gibbon influenced the founders. Rather, they found in his history exceptionally clear and vivid confirmation of their understanding of republican virtue and the need to safeguard it by every possible means. A passage from Montesquieu that Thomas Jefferson copied into his notebook explained that a republican ethos “requires a constant preference of public to private interest [and] is the source of all private virtues….”
That “constant preference” required constant vigilance. The early U.S. statesmen looked to the ancient Roman republic as a model (“in creating something that has never yet existed,” a German political commentator later noted, political leaders “anxiously conjure up the spirits of the past to their service and borrow from them names, battle cries, and costumes in order to present the new scene of world history in this time-honored disguise and this borrowed language”).
But the founders also took from history the lesson that republics, like fish, rot from the head down. The moral authority, not just of this or that elected official, but of the whole government demanded the utmost scruple – otherwise, the whole society would end up as a fetid moral cesspool, like Europe. (The tendency to define American identity against the European other runs deep.)
Translating this rather anxious ideology into clear, sharp legislation was a major concern in the early republic, as Teachout recounts in sometimes entertaining detail. It was the diplomatic protocol of the day for a country’s dignitaries to present lavish gifts to foreign ambassadors – as when the king of France gave diamond-encrusted snuffboxes, with his majesty’s portrait on them, to Benjamin Franklin and Thomas Jefferson. In Franklin’s case, at least, the gift expressed admiration and affection for him as an individual at least as much as it did respect for his official role.
But all the more reason to require Congressional approval. Doing one’s public duty must be its own reward, not an occasion for private benefit. Franklin received official permission to accept the snuffboxes, as did two other figures Teachout discusses. The practice grated on American sensibilities, but had to be tolerated to avoid offending an ally. Jefferson failed to disclose the gift to Congress and quietly arranged to have the diamonds plucked off and sold to cover his expenses.
Like the separation of powers among the executive, legislative, and judicial branches (another idea taken from Montesquieu), the division of Congress into House and Senate was also designed to preempt corruption: “The improbability of sinister combinations,” wrote Madison, “will be in proportion to the dissimilarity in genius of the two bodies.” Teachout quotes one delegate to the Constitutional Convention referring sarcastically to the “mercenary & depraved ambition” of “those generous & benevolent characters who will do justice to each other’s merit, by carving out offices & rewards for it.”
Hence the need for measures such as the clause in Article 1, Section 6 forbidding legislators from serving simultaneously in an appointed government position. It also barred them, for the duration of their terms, from accepting any appointed position created while they held office. The potential for abuse was clear, but it could be contained. The clause was an effort “to avoid as much as possible every motive for corruption,” in another delegate’s words.
Corruption, so understood, clearly entails far more than bribery, nepotism, and the like – things done with an intent to influence the performance of official duties in order to yield a particular benefit. The quid pro quo is only the most obvious level of the injustice. Beyond violating a rule or law, it undermines the legitimacy of the whole process. It erodes trust in even the ideal of disinterested official power. Public service itself begins to look like private interest carried on duplicitously.
The public-mindedness and lofty republican principles cultivated in the decades just after the American revolution soon enough clashed with the political and economic realities of a country expanding rapidly westward. There were fortunes to be made, and bribes to be taken. But as late as the 1880s, states were putting laws on the books to wipe out lobbying, on the grounds that it did damage to res publica.
Clearly a prolonged and messy process has intervened in the meantime, which we’ll consider in the next column, along with some of the criticisms of Teachout’s ideas that have emerged since she began presenting them in legal journals a few years ago. Until then, consider the proposal that newspaper writer of the 1860s offered for how to clean the Augean stables of Washington: to clear out corruption, the nation’s capital should be moved to New York City, where it would be under a more watchful eye. Brilliant! What could possibly go wrong?