The current state and future prospects of the humanities are occasioning considerable anxious comment. Many humanists are sadly resigned to a belief that the humanities have irrevocably ceded pride of place to the social sciences and sciences; and, indeed, the social sciences and sciences generate and command much intellectual energy in the 21st-century university, for understandable reasons.
The usual remedies proposed for this state of affairs have seemed to me to be limited at best and perhaps even misguided. A typical argument for the utility of the humanistic disciplines is that studying them enhances critical thought and powers of expression, and one would certainly agree.
But I wonder whether such an argument will gain much traction with college-age students and especially their parents. The data suggest a clear national trend away from the humanistic disciplines toward those that seem to offer a different kind of promise or outcome: a vocational utility or practical applicability. Under such circumstances, abstract arguments about the enhancement of critical thought – no matter how skillfully they are advanced, no matter how much one might agree with them – are less likely to prevail.
I propose here one different kind of case for the humanities, one that identifies – and celebrates – their specific vocational utility.
Now, many of my fellow humanists, I suspect, will be troubled – even offended – by such an argument: the humanities ought not to be sullied by vulgar assertions about their supposed practicality. But there would be an irony in that response to my argument.
As a historian, I – like all historians – have invariably found it informative, illuminating and useful to consider the historical context and precedents for the issue at hand. And as a student of the Italian Renaissance, I have always found it ironic that, notwithstanding likely present-day resistance to evaluating the humanities in terms of their vocational utility, the humanities enjoyed their considerable prestige during the Italian Renaissance and thereafter precisely because of their perceived practical utility.
Today the humanities – relative not only to the place of the sciences but also to their own place during the Italian Renaissance – have withdrawn from a prominent role in the public arena, and this, I suspect, is one of the causes of their presently precarious state. During the Italian Renaissance, on the other hand, the humanistic disciplines were prestige subjects of study expressly because they enjoyed a relationship to the political and social order – because those with political authority saw real practical value in encouraging humanistic study and employing those who had undertaken and completed it.
The adherents of the studia humanitatis held posts in the governments of the Italian cities and courts of the 15th and 16th centuries; their skills enabled them to serve their employers effectively as speech and letter writers, historians of the state, diplomats and government magistrates. They wrote elegant prose that was then deployed in diplomatic dispatches and letters and in speeches that they or their employers – the bearers of political authority – delivered effectively and persuasively, in part due to the elegance of the language, in part to the emphasis characteristic of the humanist program on skilled oratorical delivery. If I understand correctly, this is the collective opinion of a succession of distinguished historians of the Italian Renaissance: Paul Oskar Kristeller; Lauro Martines; Anthony Grafton and Lisa Jardine; James Hankins; and others.
Precisely how were such linguistic and literary skills leveraged as professional assets? In the words of one student of Renaissance humanism, rhetoric “was ... effective in the daily encounters of the tribunal, marketplace, and political forum, not to mention in diplomatic and personal correspondence. Artful communication ... became a[n] ... instrument for gaining or maintaining power.” Grafton and Jardine have written that the skills
...inculcated had an established practical value in fifteenth-century Italy. The ability to speak extempore on any subject in classical Latin, the ability to compose formal letters to order in the classical idiom... were... valuable assets. Equipped with them the student could serve as an ambassador, or secretary to a government department... In other words, although the programme was strictly literary and non-vocational, it nevertheless opened the way to a number of careers....[T]he independence of liberal arts education from establishment values is an illusion. The individual humanist is defined in terms of his relation to the power structure, and he is praised or blamed, promoted or ignored, to just the extent that he fulfils or fails to fulfil those terms. It is ... a condition of the prestige of humanism in the fifteenth century, as Lauro Martines stresses, that “the humanists ... were ready to serve [the ruling] class.”
“In this setting,” Grafton and Jardine continue, “the rhetoric of humanism represents the power of Latinity and eloquence as actual power – as meshed with civic activity in a close and influential relationship.”
As models for their linguistic practices, the Italian Renaissance humanists turned to familiar and newly recovered classical texts, and the classicizing character of university education in the post-Renaissance European and Europeanized world is directly attributable to the influence of the Renaissance humanists, who advocated strenuously and successfully for the virtues of their particular disciplines. As late as the mid-to-late 19th century, venerable American liberal arts colleges offered a course of study for the A.B. degree that continued to feature classical texts, almost to the exclusion of other subject matter. (The course of study for the A.B. at such institutions also included some more limited course work in “geometry and conic sections,” algebra, plane and spherical trigonometry, mechanics, “general chemistry and the non-metals,” and additional subjects other than classical languages and literatures.)
So persuasive had the Italian humanists been in their advocacy that, centuries later, the course of study in the classic 18th- and 19th-century American liberal arts college continued to reveal the influence of the Italian Renaissance, notwithstanding the challenges one would have faced in arguing compellingly for the continuing utility of such an educational tradition in 18th- and 19th-century America. The Harvard historian Bernard Bailyn wrote that “[t]he classics of the ancient world are everywhere in the literature of the [American] Revolution,” “everywhere illustrative… of thought. They contributed a vivid vocabulary..., a universally respected personification...of political and social beliefs. They heightened the colonists’ sensitivity to ideas and attitudes otherwise derived.” And, indeed, James Madison, A.B., LL.D. Princeton University, 1771, 1787, mastered several ancient languages before “fathering” the American Constitution.
Harvard president and chemist James Bryant Conant could write as late as the 1950s that “[in] Europe west of the Iron Curtain, the literary tradition in education still prevails. An educated man or woman is a person who has acquired a mastery of several tongues and retained a working knowledge of the art and literature of Europe.”
Now, what does one learn from this brief primer on the historical context? First, that advocacy – the kind of advocacy characteristic of the Italian Renaissance humanists, who, according to Kristeller and those who wrote after him, wrested a temporary position of preeminence in their society precisely through the force and effectiveness of their advocacy – is perfectly acceptable, and carries no risk of coarsening the quality of the enterprise: a succession of Italian Renaissance humanists beginning with Petrarch advocated spiritedly for their program, and one could scarcely argue that their intellectual achievement was cheapened as a result of that advocacy.
And second, that such advocacy is especially successful when it legitimately emphasizes vocational utility and professional applicability, when it advances an argument that one’s field of study leads incontrovertibly to coveted careers and has concrete benefits for the state and for the political and social order. Let us be spirited advocates, therefore, and celebrate the utility of the humanities as one of the justifications for studying them.
Could a similar, and similarly effective, case be made today for the humanistic disciplines? I believe so. In what ways could one argue – reasonably, justifiably, and therefore persuasively – that the humanities have direct professional viability, and that one can therefore envision and countenance studying them not only because of the intrinsic intellectual satisfactions of doing so or merely because their study enhances critical thought or powers of expression in some abstract sense, but also because there is true, clear utility to doing so?
It would not be difficult to inventory a considerable number of coveted professions and enterprises where humanistic training is not only professionally valuable, but indispensable. I offer just a few possibilities here, and the list could easily be extended, I should imagine. (For example, Lino Pertile suggested the importance of humanistic training to careers in the growing nonprofit sector.)
And my argument is that, in our advocacy for the humanities, we should not be at all reluctant to make much fuller and more explicit reference to their career utility.
What would a 21st-century inventory of concrete vocational applications of the humanities look like? For example:
A field that embraces what was once termed bioethics and related areas. When one addresses and attempts to resolve such pressing public-policy issues as stem-cell research, abortion, the availability of health care, female genital mutilation, AIDS, epidemics and pandemics, and many others, a satisfactory resolution of the problems encountered will depend not solely on scientific and medical expertise, but also on a command of the time-honored questions of the ancient discipline of philosophy: notions of justice (for example, determining how to justly distribute a limited resource like health care); morality; and ethics. These are urgent matters that require a humanist’s expertise and the philosophers’ millennia of experience in analyzing such vexing issues. The career possibilities in international health organizations, government agencies, nongovernmental organizations, and think tanks seem promising. The indispensability of the humanities to the successful practice of this field is such that it is now often termed the medical humanities.
Architecture and urban planning. The architect and urban planner creates the built environment (an urgent and practical enterprise, in that human beings require spaces in which to live and work), and in doing so, he or she functions at the nexus of the political-economic, the social, and the aesthetic; the architect and urban planner is equal parts humanist (who deploys aesthetic sensibilities in the design work) and sensitive reader of the practical social, political, and economic contexts within which he or she necessarily operates. Enlightened city planning offices welcome colleagues with such sensibilities.
Foreign service and diplomacy. Never before has there been a more urgent need for skilled readers of cultural difference. A sensitive humanistic understanding of other cultures, acquired above all through the rigorous study of foreign languages (and literatures), will be indispensable in coming to terms with such developments as the encounter of Islam and the European and Europeanized worlds. The repercussions for so practical a consideration as American national security are obvious, and one can imagine many outlets for such skills in government service.
Various modes of public discourse (or “writing in action,” as my former Tulane colleague Molly Rothenberg has termed it). By this I mean the effective use of language in the public arena, such as journalism (both print and broadcast, and, increasingly, digital) or television and motion-picture screenwriting. But it could also be extended to embrace advertising (increasingly web-based, which entails yet another humanistic skill, the aesthetic sense required in the visual and aural material that now invariably complements text); web-page design (which, once more, will entail a fusion of the visual, aural, and textual); and related enterprises. The humanist’s command of the aesthetic complexities of text and language, visual image, and aural material, and of their simultaneous deployment, will be indispensable. Indeed, the digital technologies of the 20th and 21st centuries are so powerful, and the full reach of the transition currently under way so difficult to apprehend, that one can only speculate as to what shape human communication will take when the shift to a new paradigm is more or less complete. (Indeed, humanistic sensibilities may prove to have a salutary, tempering influence on the effects of digital technologies.) The skillful fusion of still and moving images, aural material, and text will determine the effectiveness of MOOCs, which will depend as much on humanistic skills as on scientific and technical ones.
Rhetoric and oratory. This element is related to the previous one. The electronic information technologies that emerged beginning with the invention of the telegraph in the 19th century have a characteristic that makes them unlike manuscript copying and print: they “dematerialize” information and make it possible for it to be disseminated with lightning speed across vast distances. And the invention of radio, film, and television added the elements of the aural and the moving visual to those that had characterized the medium of print (and manuscript copying before it): written text and still image. These newer technologies replicate “live” human experience much more closely than print, which freezes discourse and alters its character. As a technology, print (and the media associated with it) has been giving way to electronic technologies, with their capacity for the full integration of written and spoken language, still and moving image, and sound (music and other aural material), and for the dematerialization and dissemination of such information. The implication for colleges and universities is as follows: we have invested admirably in initiatives designed to train our students to write well and read texts critically and perceptively. But given the power of the new technologies, there is a case to be made for a return to greater instruction in rhetoric and oratory – to an equal command of the spoken word, which can be captured on audio- or videotape or broadcast over radio, television, and the computer (via Skype), in a guise that print has never demanded. The development of electronic communication technologies that permit us to communicate extemporaneously over vast distances in a conversational tone and manner suggests that we might well retool our educational system to feature once again the time-honored humanistic practice of effective oratory and refine our students’ facility in the spoken word.
One need only consider the example of Barack Obama’s skilled oratory (or Franklin Roosevelt’s, or Ronald Reagan’s, or John Kennedy’s) to appreciate the importance to the political order of a venerable humanistic skill like oratory; notably, these are political figures who postdate the development of electronic technologies. Columnist George F. Will has observed that the American presidency is “an office whose constitutional powers are weak but whose rhetorical potential is great.”
By no means do the new electronic information technologies obviate the need for continuing skill in other, more traditional and familiar humanistic modes of communication – the kind of careful, comprehensive, subtle argument that written text affords – and the close, sensitive reading and command of existing texts that inform the authorship of new texts. Henry Riecken suggested that “[t]he text of the Federalist Papers was put into machine-readable form in order to carry out an analysis that resolved questions of disputed authorship of some of the papers; but the new format did not replace the bound volumes for readers who want to absorb the thoughts and reflect on the aspirations of this stately document.”
Art conservation, and its relationship to the political economy. Nations with an exceptional legacy of monuments in the visual arts (Italy being a well-known example) face a particular challenge with respect to maintaining the condition of that legacy. And in Italy’s case, the relationship of the condition of that legacy to the economy is obvious: given the central place of tourism in the Italian economy, it is vital that the nation’s artistic patrimony be satisfactorily conserved. Sensitive art conservation is at the intersection of the humanistic (the aesthetic), the scientific and technological (an understanding of the nature of surfactants and the effects of environmental conditions), and the political-economic (the need to balance the claims of conserving the artistic patrimony acceptably against other claims on public resources).
What is interesting about this list is how closely its elements are aligned with the Italian Renaissance humanist’s earlier construction of the studia humanitatis. The kind of ethical reasoning demanded in successful practice of the medical humanities is, in its way, a modern iteration of the Renaissance humanist’s moral philosophy; 21st-century applications of writing, rhetoric, and oratory are, in their way, contemporary versions of the Renaissance humanist’s grammar, poetry, and rhetoric; the understanding of foreign cultures and languages required for effective foreign service in today’s bewilderingly complex and interdependent world is, in its way, the modern expression of the Renaissance humanist’s practice of history. The foundational elements of the core humanistic program have perhaps not changed so very much.
What is different is the explicitness with which the Renaissance humanists advocated – persuasively, compellingly, successfully – for the professional utility of their disciplines, which permitted them to secure a place of considerable prestige and authority in their world. There is warrant for their 21st-century successors’ advancing a similar argument: that one undertake the study and practice of the humanistic disciplines not only within the confines of the academic world (as intrinsically worthwhile, in a fundamental intellectual sense) but outside them as well (as critical to the successful execution of one’s expressly professional and vocational responsibilities).
Specifically, I propose that we self-consciously reframe the presentation and delivery of the humanistic offerings of the modern-day college and university to make much more explicit reference to their potential applicability: that we foreground this kind of argument for their virtues. Some of what is now being done within the university is being done absentmindedly, so to speak, without a sufficiently self-conscious articulation of why we do what we do. Were we to reframe our offerings in this way – reposition the humanities and articulate their virtues differently – we might find that the national trend away from them could be halted and perhaps even reversed.
My sense is that many students rather naturally hunger for the humanistic disciplines and are driven to make other curricular choices in part because of concerns about career viability. Were such concerns addressed – legitimately, effectively, persuasively – we might find some such students willing to study what their hearts prompt them to study. In our curriculums, were we to foreground explicit, purposeful reference to the ways in which the humanities are indispensable to the successful practice of some of the esteemed and rewarding professions identified above (rewarding in several senses of that word), we might succeed in alleviating student (and parental) anxiety about the practicality of studying such supposedly “impractical” subjects.
Only by such means, I believe, will the humanities truly be able to re-secure the place they once enjoyed, and still deserve, in the collective cultural imagination and in the great public arena. And by no means should we be hesitant about advancing such an argument, since we have the example of the Italian Renaissance before us: it would be difficult to argue that energetic advocacy on grounds of vocational viability compromised the artistic and intellectual integrity of the achievements of Petrarch and his venerated successors.
Anthony M. Cummings is professor of music and coordinator of Italian studies (and former provost and dean of the faculty) at Lafayette College.
Submitted by Jeff Rice on October 7, 2013 - 3:00am
Not all academics eat well. Often, I have found myself among a group of friends at the end of a conference, hungry for dinner, and, by some unknown force, our movement is directed toward an overpriced, chain steak house or fast food restaurant.
Conference hotels often house a Starbucks; each morning of my field’s main conference, a line 30 people deep can be found before the day’s proceedings begin. Publisher-sponsored affairs are always a big hit. Cold shrimp served with ketchup-based sauce. Small cheese-stuffed pastry hors d'oeuvres. Toast with tomatoes on top. Cheese and crackers. Hummus. Crudités. Free food. Conference lunches can be less generous as colleagues grab day-old sandwiches – made in some unknown factory – at Starbucks. Or they push coins into a machine and grab a Milky Way.
Office microwaves are often messy with the remnants of frozen pizza, ramen noodles, or reheated hot dogs. Sometimes, when I am walking from the parking garage to my campus office, I spot colleagues at 8 in the morning leaving the nearby McDonald's with bags of fried something-or-other. One of my more astute colleagues, who works extensively in cultural criticism, has stood more than once across from me in our building’s elevator, a bag of Chick-fil-A in his hand. Academic cocktail parties, at the university or at a conference, usually offer $6 bottles of Bud and Miller Lite. The $7 Sam Adams is labeled “import.”
Our department meetings take place every other Tuesday during lunch time. It is not uncommon for me to eat a sandwich or salad during the meeting. Being at work makes me hungry no matter how large or small a breakfast I’ve had. When I taught community college night classes almost 20 years ago, I ate a salad before class started. Sometimes, I pack hard-boiled eggs in my salad so that the sulfuric odors permeate the room during meetings.
No matter where I’ve worked, campus catering coffee is always bad. Order a vegetarian meal for an event, and campus catering makes something heavy in starch (pasta drenched in a bland, unseasoned red sauce) or portabello mushrooms (grilled or raw). Across the street from our campus is a restaurant with the word “ass” in its title (“huge” is another word in its name). Across the street, one can also dine at a Korean restaurant, an African restaurant, a local pizza place with a good beer list, a Middle Eastern restaurant, a regional taco chain with the word “local” in its title, a fast food restaurant that specializes in chicken fingers, and a McDonald’s. I entered the student union the other day and saw a 30-person-deep line at the Subway.
While my wife and I are members of the local co-op, not all of our colleagues know that it is located three miles south of campus. Recently, I bought local pawpaw at the co-op and posted a picture to my Facebook profile; some people mistook it for rotten avocado. The possible outsourcing of campus dining to a private company has raised faculty and student concerns that the university’s spending of almost $800,000 per year on local food will vanish. During a tour of the campus dining food warehouse last year, I was informed that when the university purchases local cattle, the chefs sneak oxtail into stews served in student housing. While most, if not all, of us housed in the humanities support same-sex marriage, the elevator in our building reeks of Chick-fil-A – whose owner opposes such marriages – on any given day. Purchasing a University of Kentucky Dining Plan allows a student to eat at Chick-fil-A and Subway in addition to campus dining facilities.
Every October, regardless of what I am teaching, I share with students my hatred of candy. "I work all year," I say to them, "to keep candy away from my kids, and two hours of walking around the neighborhood on October 31 ruins my hard work."
In the living-learning community I co-direct, we leave Tootsie Rolls and Milky Ways out in a bowl for students to snack on when they come in for academic or life advice. In the residential hall where the living-learning community is housed, for our weekly coffee chats with members of the local community or university, we provide factory-made cookies from the Kroger supermarket chain and Cheetos. The best way to get faculty or students to attend a meeting or event, common advice goes, is to serve pizza.
Several times I’ve taught a course with the word “Eating” in its title. When I was at the University of Missouri at Columbia, the course was called “Eating Missouri.” When I took a job at the University of Kentucky, the course became "Eating Kentucky." After reading Anthony Bourdain, Calvin Trillin, a profile of Whole Foods CEO John Mackey, and notable food critics and discussions, including exposés of the fast food industry and mass-produced food, students still came to class with chicken McNuggets, defrosted frozen pizzas, high-fructose corn syrup sweets, and Krispy Kreme donuts. At the four different universities that have employed me, a Subway has always been within walking distance.
Many colleagues drink 32-ounce sodas in the morning. Because of my reputation as someone who enjoys craft beer, when I’m visiting a campus for an invited talk, colleagues feel obligated to take me to a place that serves good beer. When I was on my campus visit at the University of Missouri seven years ago, colleagues took me to the local brewpub for dinner. After our last main conference in my field, I fretted over the long flight home from Las Vegas (beginning early in the morning and ending at night) and worried that I would not have time during the layover to purchase something to eat. To ease my fear of future hunger, I bought a vegetarian sandwich in the casino Subway.
I used to get excited about attending the dinners for guest speakers or job candidates. Free food. Free food at expensive restaurants. I’ve since grown tired of menus that offer only heavy meat dishes, overcooked lamb chops, bacon in everything, or scallops. The Chick-fil-A in our campus food court is "proudly" closed on Sundays. One Friday a month, the agriculture college hosts a food-related discussion for faculty and members of the community at 7:30 a.m. Local food is served for breakfast. Participants are encouraged to bring their own coffee mugs. I once gave a talk entitled "Menu Literacy" for the discussion.
I sometimes say that my casual conversational skills are limited to discussions of kids, food, and beer. My attempts to recruit job candidates often involve telling them how great the local farmer’s market is and what kind of beer they will be able to buy if they move here. For some reason, I can host a catered event with local vendors in one building on campus, but in the building next door, I must use campus catering. In my previous job, because of budget cuts, the office I directed was no longer allowed to order $15 worth of cookies from a local bakery for board meetings that took place twice a month. In my previous job, I angered campus catering by complaining about the low-quality food they served during a "Writing Across the Curriculum" event I hosted. Campus catering at my current university won’t allow me to invite a local Mexican food truck for a small event that would take place outside of the living-learning community residential hall.
I know I sound like a grump or food snob with these random observations. And I probably am as much of a food snob as I am a critic snob or rhetoric snob or teaching snob or snob of any other part of my professional life. Snobbery can simply mean valuing one thing over another to a significant, and sometimes hyperbolic, degree. I value eating.
Snobbery is not alien to academic discussion; we place value on any number of things we admire or teach. I’ve often wondered why cultural snobbery, often expressed by colleagues in regard to art, literature, music, or film, does not extend to gastronomy. I’ve often wondered why astute cultural critics or critics of the university are such poor food critics. By that, I don’t mean that we must decode every food representation we encounter in order to better understand ideology or power in the food industry. Instead, I wonder why, in our practices of everyday life, we succumb so easily to fast food, high-fructose corn syrup, chains, and other items instead of merely trying to eat outside of these problematic practices.
Pleasure, of course, is a powerful agent. Pleasure, of course, is at the heart of bad eating habits. And food writers such as Michael Pollan have demonstrated the ways fat, sugar, and salt compose and encourage a specific system of food pleasure, one encouraged by much of the fast food industry. None of us is beyond such pleasure, but that does not mean we must succumb to every instance that calls out to us.
Calvin Trillin’s best effort at food critique was to declare, "I wouldn’t throw rocks at it." My food pleasure is not another’s food pleasure, I realize. And I have no desire to preach health-conscious food habits or mindful eating to my academic colleagues. I have no overall argument to make regarding what academics should or should not eat. I have no agenda to preach. My observations merely prompt me to ask: Why don’t some academics eat well?
In asking that question, I am sketching some observations that include me, too. Among these observations I highlight, I also note that I support the local food movement known as "Kentucky Proud," and my wife and I try to buy most of our produce and meat from the Lexington Farmer’s Market. But when on the road or on campus without coffee, we succumb to Starbucks, too. Among our food purchases, we buy for our kids Arthur Pasta, dehydrated cheese and pasta shaped like the popular PBS character. We are not beyond the commercialization of food either.
Bruno Latour has warned of "purification narratives," stories that try to portray some event, movement, or way of thinking as pure or without contradiction. Roland Barthes once noted that every text is made up of contradictions, what he referred to as the pleasure of the text. That I have ordered a coffee at Starbucks or bought a box of pasta named for a cartoon character might seem to be minor contradictions of my interest in local food or my series of somewhat critical observations. Minor or major, the contradictions no doubt reveal a larger crack in any kind of purification narrative of food I might want to portray. I’m sure there are more or larger cracks in my ideological stance. After all, even after he carefully decodes the industrial meat industry in his New York Times essay “Power Steer,” Michael Pollan confesses to not caring for grass-fed beef.
My point is only to trace a type of academic eating, a series of habits and practices that run counter, at times, to our professional practices and beliefs, that suggest an untapped pleasure of the text as we build elsewhere purification narratives regarding culture or texts. For good or for bad, many academic eating practices follow similar trajectories to one another as the banal and bland overpower the local and flavorful. Professionally, we are great critics: MOOCs, corporatization of education, adjunct labor, global conflict, a fiscal crisis here or there. What about bad eating?
One type of pleasure of the text might be the relentless critic who finds fault in every representation outside of the bag of Chick-fil-A in his hand. One might surmise from this lack of critical parallelism a lack, or crack, in the overall project of critique. French fries or diet soda, it seems, may be outside of critique, the behavior change that critique is meant to promote, or even basic awareness. Such an assumption goes far beyond my simple observations of eating in the university. I can only speculate in the meantime how critical practices might better shape food practices. “Do you know what you are?” Frank Zappa asked. “You are what you is,” he responded. Or, as the popular health saying goes, you are what you eat. Either way, not all academics eat well.
Jeff Rice is Martha B. Reynolds Professor of Writing, Rhetoric, and Digital Studies at the University of Kentucky.
Very seldom in writing about scholarly publishing have I had an occasion to use the word “fun” -- actually, this may be the first time -- but with a couple of recent titles, nothing else will do.
They are not frivolous books by any means. Sober and learned reviews by experts may appear in specialist journals two or three years from now, and they will be instructive. But the books in question should generate some interest outside the field of J. Redding Ware studies.
Nobody who appreciates the history and color of the English language can fail to enjoy the University of Oxford Bodleian Library’s new facsimile edition of Ware’s masterpiece, the invaluable Passing English of the Victorian Era: A Dictionary of Heterodox English, Slang, and Phrase. First published in 1909, shortly after Ware’s death, it is now available as Ware’s Victorian Dictionary of Slang and Phrase -- a title that is more marketable, perhaps, but decidedly wanting in precision. Most of the lingo it covers is Victorian, but the dictionary itself, appearing as it did in the final month of the king’s life, is Edwardian. (A pedant’s work is never done.) The new edition contains a valuable introduction by John Simpson, who retires this month as chief editor of the Oxford English Dictionary.
It covers almost everything currently known about Ware (i.e., not much) and assesses his contribution to lexicography, which was considerable. Ware’s dictionary is cited in the OED more than 150 times, including “almost 50 times for the first recorded evidence for a word or meaning.” My earlier reference to Ware studies was a bit of a stretch, since nobody has ever written a book about him, nor a dissertation -- nor even, it seems, a journal article devoted solely to his life or works. A couple of papers by Joyce scholars identify his dictionary as a source the novelist consulted while writing Ulysses. Simpson’s introduction is the landmark in a nearly barren field.
Ware was born in 1832 and his first name was James, after his father, who was a grocer. In his teens the future author served a short jail sentence following a violent family quarrel, during which he grabbed a bacon knife and threatened to kill James the elder. He worked as a mercantile clerk while writing his first novel, published in 1860. From that point on Ware seems to have eked out a hardscrabble existence along the lines George Gissing depicts in New Grub Street, cranking out fiction, journalism, and quite a few plays, along with a handbook on playing whist and a tourist’s guide to the Isle of Wight.
Simpson unfortunately neglects to mention one other documented occasion when Ware went to court, seeking relief from a downstairs neighbor who played the piano at all hours. The outcome remains unclear but it seems no bacon-knife was involved.
Not quite a gentleman, then, nor by profession a scholar. Ware’s dictionary was commissioned by Routledge as a supplement to another volume on colloquial English. “His lexicographical method is arguably modern,” Simpson notes, “as he based his selection largely on printed evidence that he (and, one imagines, other interested parties) had collected principally from newspapers and other ephemeral sources… The latest quotations date from 1902, though the majority date from the 1880s and 1890s.”
No don would have come up with what Simpson calls Ware’s “idiosyncratic labeling system,” which identifies the provenance of a given piece of slang through categories such as “Slums,” “Italian Organ-Grinders,” “Music Hall,” or “Colloquial Imbecile.” He clearly spent a good bit of time hanging around theaters and pubs, and “was painfully aware of changes in hairstyles and fashion generally over the decades, and can with the help of his printed evidence place the introduction of a new ‘look’ to a precise year.”
Some of the expressions Ware includes have passed into accepted use. He identifies opportunism as a piece of political slang from the 1860s, explaining: “Used rather in contempt, as subserving conscience to convenience, or to personal advantage.” It turns out that flappers -- young women of an immodest sort -- were on the scene well before the 1920s. And while Susan Sontag doesn't mention it in her notes, Ware identified camp as an adjective applying to “actions and gestures of exaggerated emphasis," noting that it was “used chiefly by persons of exceptional want of character.” By that Ware undoubtedly means sissies, “effeminate men in society,” a term he indicates (citing an American newspaper) caught on in the 1890s.
I would have assumed the slang word narc -- pertaining to an informer, and used as both noun and verb -- derived from narcotics. Apparently not: copper’s nark is defined as thieves’ argot, also from the 1890s, meaning “a policeman’s civilian spy.” Ware indicates that police were called coppers from 1868 on, and you’d have found a copper-slosher (“individual with the mania for ‘going for’ policemen”) hanging around a copper’s shanty, as the station house was known. A working-class person thought to be blustering risked the taunt “Copper! Copper!” – implying that he was “giving himself the airs of police authority.”
But where did cop itself come from? “There has been more discussion over this widely applied word than any other in the kingdom of phrase,” writes Ware in one of his longer entries. It is incredibly polysemic, meaning “taken, seized, thrashed, struck, caught by disease, well-scolded, discovered in cheating,” and could also be used as a verb meaning “to take too much to drink” (hence copping the brewery). Thunderous applause for an especially good show at a music hall would cop the curtain, so that the performer could take a bow.
The vast majority of Ware’s 4,000 entries define expressions that vanished without a trace. Hence his original title: "passing English" comes and goes. It's the vigor of language that drives the vitality of the book.
One craze of the 1880s was corpse-worship – “the extreme use of flowers at funerals” – which got so bad that by the ‘90s “many death notices in the press were followed by the legend ‘No flowers.’ ” Slang words often come from the contraction of common expressions; for example, damirish, “damned Irish,” and damfino, “I am damned if I know.”
Nobody still describes an egg gone bad as suitable for electioneering purposes (derived from “the exercise of projecting them at antagonistic candidates”) and the culture is all the poorer for it. Then again, it's a relief that suggestionize -- an old bit of legal jargon meaning “to prompt,” as with a witness – never caught on. Now if we could just euthanize “finalize.”
Decades before the birth of Jerry Garcia, deadhead was the entertainment-industry label for an audience member who got in without buying a ticket. It applied to critics and “’theatrical people’… [who] never pay to enter a theatre.” Ware, as a playwright, resented them, and the dictionary vents his frustration in an amusing manner:
“The experienced eye can always divide the deadheads from the ‘plank-downers’ in a theatre. The deadheads are always dressed badly, and give themselves airs when looking at the inferior parts of the house. The plank-downers never give themselves airs, mean business, and only look at the stage. Deadheads are very emphatically described by a theatrical official: ‘Here come two more deadheads; look at their boots.’”
Many entries are no doubt the only record of a term or catchphrase, and in some cases the lexicographer can just guess what they originally signified. Who stole the goose? is an “interjection of contempt, which appears to have some hidden meaning, probably of an erotic nature.” In the case of Who took it out of you?, Ware doesn't even try. The meaning is “wholly unknown to people not absolutely of lower class.”
Of comparable subaltern origins, but easier to understand, is the slang term label for sausage: bags o’ mystery. That one should come back into circulation. Its use could be extended to the hot dog.
Speaking of mystery, another book recently reissued in facsimile is Andrew Forrester, Jr.’s The Female Detective, a collection of short fiction from 1864, reprinted by the British Library and distributed in the U.S. by the University of Chicago Press. A digital edition is available from Amazon.
You can find the book online in PDF for free -- which is also true with Ware’s dictionary, although Simpson’s introduction is not to be missed. With The Female Detective, the new material consists of a foreword by Alexander McCall Smith (best known for his No. 1 Ladies Detective Agency novels, but also the author of What W.H. Auden Can Do For You, just out from Princeton University Press) and an introduction by Mike Ashley, whose books include a biography of Algernon Blackwood.
The Female Detective is offered as a collection of reports from the files of the elusive Miss G---- (she is nothing if not discreet) as edited by Forrester. The detective’s “casebook” was a very popular genre at the time, part of the “railroad literature” that sprang up to meet the demand of commuters. Forrester later wrote at least two more such collections, but his place in the history of the genre comes from having created Miss G---- (a.k.a. Mrs. Gladden), the first female professional detective in fiction. Kathleen Gregory Klein devoted several pages to the book in The Woman Detective: Gender & Genre (University of Illinois Press, 1988) and puzzlement over Forrester's identity – was it a pseudonym? – has echoed down the scholarship ever since.
It now seems very likely that the author was, in fact, J. Redding Ware. Simpson accepts it as credible in his introduction to the dictionary, as does Mike Ashley in the opening pages of the short-story collection. The identification was proposed by Kate Summerscale in the notes to her nonfiction novel The Suspicions of Mr. Whicher: A Shocking Murder and the Undoing of a Great Victorian Detective (2008). In researching her book, Summerscale noticed that Ware published a pamphlet about the crime in question: a child murder that occurred in a country house in 1860. (The circumstances are oddly reminiscent of the JonBenet Ramsey case.) He seems to have incorporated the text into a chapter of The Female Detective.
Mrs. Gladden had keen powers of observation and deduction, and a reader can’t help thinking of the much better-remembered private eye who came on the scene later in Victoria’s reign. Ware must have felt that Arthur Conan Doyle had stolen his thunder – though that seems like a rather peculiar phrase, come to think of it.
Ware lists it in his dictionary, explaining that it means “annexing another man’s idea, or work, without remunerating him, and to your own advantage.” It was first used, he writes, by one John Dennis, “a play-writer of the 17th century, who invented stage thunder for a piece of his own which failed.” The theater manager incorporated the technique in the production of someone else’s play, prompting the enraged Dennis to yell out, “They won’t act my piece, but they steal my thunder.”
I hope J. Redding Ware studies comes into its own, or at least that others discover him. How often is it worth reading a dictionary just for the stories?
Originally published in France in 1989 and now arriving in English translation from Yale University Press, Arlette Farge’s The Allure of the Archives is a little gem of a book. A diamond, perhaps, given both its clarity and the finesse with which it’s been cut and set. It is an unmistakable classic: one of the great memoirs of the silent, day-to-day drama of research.
Yet it is not the story of a career. A reader unfamiliar with her name will soon enough recognize that Farge specializes in 18th-century social history. (She is director of research in modern history at the Centre National de la Recherche Scientifique, in Paris.) One of her endnotes refers to the volume she published in collaboration with Michel Foucault in 1982, two years before the philosopher’s death. It was an edition of selected lettres de cachet from the files of the Bastille. (In A Tale of Two Cities, Dickens portrays the notorious legal instrument as one of the aristocracy’s corrupting privileges, but Farge and Foucault showed that it was disturbingly easy for an ordinary person to take one out against a family member.)
Many an over-active ego has turned to autobiography on far slimmer grounds than that. But in spite of its personal and even intimate tone, The Allure of the Archives is never primarily about the author herself. She instead writes about a subjectivity that exists only upon entering the reading room – or perhaps while trying to find it during the first visit.
As much as an institution, the archive is, for Farge, the site of a distinct array of experiences. It is defined by rules, habits, worries (especially with fragile documents), and a kind of ambient awkwardness that she evokes perfectly. “The silence of the archives,” she writes, “is created out of looks that linger but do not see, or gazes that stare blindly. No one escapes these wandering eyes, not even the most obstinate reader, his face clouded by work.”
In the midst of that tense, watchful silence, the patrons jockey for prime desk space, or drive each other to distraction with coughs and nervous tics. Locating the documents you want requires initiation into the cabala of archival reference numbers. Even if you have one, it “often will only direct the reader to another serial number that will itself only give access to a new series where other serial numbers await.” For guidance, one must turn to the superintendent of the reading room: “She reigns, gives advice that bears a strong resemblance to orders, speaks very loudly, and does not understand what she does not wish to understand, all the while constantly ruffling the pages of her morning newspaper.”
Her descriptions are very true to life, even an ocean away from the French archives she has in mind. The part about reference numbers may sound like comic exaggeration. It isn’t, or at least it corresponds to my last visit to the Library of Congress manuscript collection some years ago.
But all of it is just the setting for the necromancy of research – calling up men and women who are long dead and utterly forgotten, speaking for the first time in two or three hundred years. The dossiers Farge explored in her research consisted of police and judicial records as well as reports to His Majesty on what the rabble of Paris were doing and saying from week to week. A whole layer of informers – called in slang mouches, “flies” – kept track of rumors and complaints making the rounds. (They were, in effect, flies on the wall.) Also in the files are slanderous and subversive posters with traces of grit clinging to the back after the authorities ripped them down.
Police interrogations (transcribed for use during an investigation, then filed away and eventually warehoused) document the everyday happenings, criminal and otherwise, of urban life. Most of those speaking left no other account of their lives. Nor could they; even those who could read a little had not always learned to write. Having become accustomed to reading the faded or erratically punctuated documents, the researcher may undergo a sort of rapture: “the sheer pleasure of being astonished by the beauty of the texts and the overabundance of life brimming in so many ordinary lives.”
At that point, a danger becomes evident: “You can begin to forget that writing history is actually a kind of intellectual exercise, one for which fascinated recollection is just not enough…. Whatever the project is, work in the archives requires a triage, separation of documents. The question is what to take and what to leave.” And the researcher answers it by coming to recognize the patterns that emerge from one document to the next: the unstated assumptions made by those on either side of the interrogation, the changes in tone or preoccupation recorded over time.
The historian learns to frame questions that the archive knows how to answer – while remaining open to the secrets and surprises to be found in the boxes of paper that haven’t been delivered to her desk yet.
“In the archives,” Farge writes, “whispers ripple across the surface of silence, eyes glaze over, and history is decided. Knowledge and uncertainty are ordered through an exacting ritual in which the color of the note cards, the strictness of the archivists, and the smell of the manuscripts are trail markers in this world where you are always a beginner.” Farge's memoir is adamantine: sharp, brilliant, perfect, and created to last.
About this time of year one invariably reads fulsome, even orgiastic essays by academics professing the exhilaration and sense of joy they feel on the first day of class each August or September. In so doing, they often blather on about limitless possibilities and rituals of renewal, etc., and wax on about frisson A and epiphany B on the quad.
I must admit that my experience is quite different. Whereas for many professors the beginning of the academic year is a time of excitement and anticipation, for me it is — indeed, has been for the 30-plus years I’ve been teaching at the university level — a time of melancholy, even gloom. Indeed, late August/early September marks the peak period of my annual bout of SAD. To most clinicians, SAD denotes "seasonal affective disorder," a condition in which normally well-adjusted people experience a range of depressive symptoms, but for me SAD means "student affective disorder." Same symptoms, different etiology.
Around the beginning of August -- even earlier now -- I begin to suffer the symptoms: heightened anxiety; enervation; difficulty concentrating; social withdrawal; increased irritability; nausea. Over time, I’ve found that the reason for the onset of such conditions is the looming return of STUDENTS into my life.
It is not August, but the end of the exam period in May that elicits in me a sense of joy and limitless possibilities. Only when my grades are turned in, the seniors graduated, and the dorms emptied out do I begin to feel a sense of excitement and anticipation and the possibility for renewal. For it is only then that I can focus on research and writing without the threat of being interrupted by tedious office hours, middle-of-the night phone calls, and "urgent" e-mails ("Can I get an extension on my book review?"), not to mention lectures, seminars, grading, meetings, committee work, etc., etc.
Rather, with May comes "summer break" and the tantalizing possibility of finally honoring long overdue writing commitments, of making headway on a scholarly monograph, and of thinking deeply about new projects down the line. If sufficiently lucky, it might mean a trip or two to an archive to immerse oneself in source materials one has waited months, if not years to dive into. And it might even give one a chance to attend a conference, present a paper, and get some useful feedback from experts in one’s field. Talk about renewal!
But, alas, before one knows it, August comes around. David M. Shribman recently wrote a beautiful essay in The Wall Street Journal entitled "Whatever Happened to August?," an elegiac piece lamenting that August, once the Platonic ideal of summer, has been turned into a "month of work, school and calendars run amok." Nowhere is this more true than at universities. At colleges and universities across the land, the ecological system in town begins to become student-centric earlier and earlier each year, with suck-up seniors, jaded juniors, sophomoric sophs, and eager-beaver fresh-faced frosh transforming the placid summer landscape on campus into a crowded, cacophonous mob scene.
Even worse, by then the seemingly “limitless possibilities” for summer, the best-laid plans, the hopes and dreams have all been dashed. Some of the overdue commitments are still outstanding. Progress has been made on the unfinished monograph, but it still sucks. The trips to the archives brought disappointing results. The professional meetings were as boring as ever. And now the students are back. Any wonder that I get depressed?
To make things even worse, it seems more and more as though “summer break” is over just after the 4th of July. That’s when the first, vague symptoms of SAD begin to appear. They accelerate through July and peak about the third week of August when classes begin, at which time I feel like I’m about to embark on the academic equivalent of a death march.
Funny, though, every year the symptoms recede. Gradually, I adapt to the new ecology and again find my niche. By mid-to-late September — usually a few weeks after Labor Day — the symptoms are gone and I begin to feel like myself. The "new" landscape has been naturalized. I again begin to appreciate students — the putative causes of my seasonal plague — suck-up seniors, jaded juniors, sophomoric sophs, fresh-faced frosh all.
Peter A. Coclanis is Albert R. Newsome Distinguished Professor of History and director of the Global Research Institute at the University of North Carolina at Chapel Hill.