English

History shows: the humanities have vocational utility (essay)

The current state and future prospects of the humanities are occasioning considerable anxious comment.  Many humanists are sadly resigned to a belief that the humanities have irrevocably ceded pride of place to the social sciences and sciences; and, indeed, the social sciences and sciences generate and command much intellectual energy in the 21st-century university, for understandable reasons.

The usual remedies proposed for this state of affairs have seemed to me to be limited at best and perhaps even misguided. A typical argument for the utility of the humanistic disciplines is that studying them enhances critical thought and powers of expression, and one would certainly agree.

But I wonder whether such an argument will gain much traction with college-age students and especially their parents. The data suggest a clear national trend away from the humanistic disciplines toward those that seem to offer a different kind of promise or outcome: a vocational utility or practical applicability. Under such circumstances, abstract arguments about the enhancement of critical thought – no matter how skillfully they are advanced, no matter how much one might agree with them – are less likely to prevail.

I propose here a different kind of case for the humanities, one that identifies – and celebrates – their specific vocational utility.

Now, many of my fellow humanists, I suspect, will be troubled – even offended – by such an argument: the humanities ought not to be sullied by vulgar assertions about their supposed practicality. But there would be an irony in that response to my argument.

As a historian, I – like all historians – have invariably found it informative, illuminating and useful to consider the historical context and precedents for the issue at hand. And as a student of the Italian Renaissance, I have always found it ironic that, notwithstanding likely present-day resistance to evaluating the humanities in terms of their vocational utility, the humanities enjoyed their considerable prestige during the Italian Renaissance and thereafter precisely because of their perceived practical utility.

The humanities today, relative not only to the current place of the sciences but also to their own place during the Italian Renaissance, have withdrawn from a prominent role in the public arena, and this, I suspect, is one of the causes of their presently precarious state. During the Italian Renaissance, on the other hand, the humanistic disciplines were prestige subjects of study expressly because they enjoyed a relationship to the political and social order – because those with political authority saw real practical value in encouraging humanistic study and employing those who had undertaken and completed it.

The adherents of the studia humanitatis held posts in the governments of the Italian cities and courts of the 15th and 16th centuries; their skills enabled them to serve their employers effectively as speech and letter writers, historians of the state, diplomats and government magistrates. They wrote elegant prose that was then deployed in diplomatic dispatches and letters and in speeches that they or their employers – the bearers of political authority – delivered effectively and persuasively, in part due to the elegance of the language, in part to the emphasis characteristic of the humanist program on skilled oratorical delivery.  If I understand correctly, this is the collective opinion of a succession of distinguished historians of the Italian Renaissance: Paul Oskar Kristeller; Lauro Martines; Anthony Grafton and Lisa Jardine; James Hankins; and others.

Precisely how were such linguistic and literary skills leveraged as professional assets? In the words of one student of Renaissance humanism, rhetoric “was ... effective in the daily encounters of the tribunal, marketplace, and political forum, not to mention in diplomatic and personal correspondence. Artful communication ... became a[n] ... instrument for gaining or maintaining power.” Grafton and Jardine have written that the skills

...inculcated had an established practical value in fifteenth-century Italy. The ability to speak extempore on any subject in classical Latin, the ability to compose formal letters to order in the classical idiom... were... valuable assets. Equipped with them the student could serve as an ambassador, or secretary to a government department... In other words, although the programme was strictly literary and non-vocational, it nevertheless opened the way to a number of careers....[T]he independence of liberal arts education from establishment values is an illusion. The individual humanist is defined in terms of his relation to the power structure, and he is praised or blamed, promoted or ignored, to just the extent that he fulfils or fails to fulfil those terms. It is ... a condition of the prestige of humanism in the fifteenth century, as Lauro Martines stresses, that “the humanists ... were ready to serve [the ruling] class.”

“In this setting,” Grafton and Jardine continue, “the rhetoric of humanism represents the power of Latinity and eloquence as actual power – as meshed with civic activity in a close and influential relationship.”

As models for their linguistic practices, the Italian Renaissance humanists turned to familiar and newly recovered classical texts, and the classicizing character of university education in the post-Renaissance European and Europeanized world is directly attributable to the influence of the Renaissance humanists, who advocated strenuously and successfully for the virtues of their particular disciplines. As late as the mid-to-late 19th century, venerable American liberal arts colleges offered a course of study for the A.B. degree that continued to feature classical texts, almost to the exclusion of other subject matter. (The course of study for the A.B. at such institutions also included some more limited course work in “geometry and conic sections,” algebra, plane and spherical trigonometry, mechanics, “general chemistry and the non-metals,” and additional subjects other than classical languages and literatures.)

So persuasive had the Italian humanists been in their advocacy that, centuries later, the course of study in the classic 18th- and 19th-century American liberal arts college continued to reveal the influence of the Italian Renaissance, notwithstanding the challenges one would have faced in arguing compellingly for the continuing utility of such an educational tradition in 18th- and 19th-century America. The Harvard historian Bernard Bailyn wrote that “[t]he classics of the ancient world are everywhere in the literature of the [American] Revolution,” “everywhere illustrative… of thought. They contributed a vivid vocabulary..., a universally respected personification...of political and social beliefs. They heightened the colonists’ sensitivity to ideas and attitudes otherwise derived.” And, indeed, James Madison (A.B., 1771, and LL.D., 1787, Princeton University) mastered several ancient languages before “fathering” the American Constitution.

Harvard president and chemist James Bryant Conant could write as late as the 1950s that “[in] Europe west of the Iron Curtain, the literary tradition in education still prevails. An educated man or woman is a person who has acquired a mastery of several tongues and retained a working knowledge of the art and literature of Europe.”

Now, what does one learn from this brief primer on the historical context? First, that advocacy – the kind of advocacy characteristic of the Italian Renaissance humanists, who, according to Kristeller and those who wrote after him, wrested a temporary position of preeminence in their society precisely through the force and effectiveness of their advocacy – is perfectly acceptable, and carries no risk of coarsening the quality of the enterprise: a succession of Italian Renaissance humanists beginning with Petrarch advocated spiritedly for their program, and one could scarcely argue that their intellectual achievement was cheapened as a result of that advocacy.

And second, that such advocacy is especially successful when it legitimately emphasizes vocational utility and professional applicability, when it advances an argument that one’s field of study leads incontrovertibly to coveted careers and has concrete benefits for the state and for the political and social order.  Let us be spirited advocates, therefore, and celebrate the utility of the humanities as one of the justifications for studying them.

Could a similar, and similarly effective, case be made today for the humanistic disciplines?  I believe so. In what ways could one argue – reasonably, justifiably, and therefore persuasively – that the humanities have direct professional viability, and that one can therefore envision and countenance studying them not only because of the intrinsic intellectual satisfactions of doing so or merely because their study enhances critical thought or powers of expression in some abstract sense, but also because there is true, clear utility to doing so?

It would not be difficult to inventory a considerable number of coveted professions and enterprises where humanistic training is not only professionally valuable, but indispensable. I offer just a few possibilities here, and the list could easily be extended, I should imagine. (For example, Lino Pertile suggested the importance of humanistic training to careers in the growing nonprofit sector.) 

And my argument is that, in our advocacy for the humanities, we should not be at all reluctant to make much fuller and more explicit reference to their career utility.

What would a 21st-century inventory of concrete vocational applications of the humanities look like? For example:

A field that embraces what was once termed bioethics and related areas. When one addresses and attempts to resolve such pressing public-policy issues as stem-cell research, abortion, the availability of health care, female genital mutilation, AIDS, epidemics and pandemics, and many others, a satisfactory resolution of the problems encountered will depend not solely on scientific and medical expertise, but also on a command of the time-honored questions of the ancient discipline of philosophy: notions of justice (for example, determining how justly to distribute a limited resource like health care); morality; and ethics. These are urgent matters that require a humanist’s expertise and the philosophers’ millennia of experience in analyzing such vexing issues. The career possibilities in international health organizations, government agencies, nongovernmental organizations, and think tanks seem promising. The indispensability of the humanities to the successful practice of this field is such that it is now often termed the medical humanities.

Architecture and urban planning. The architect and urban planner creates the built environment (an urgent and practical enterprise, in that human beings require spaces in which to live and work), and in doing so, he or she functions at the nexus of the political-economic, the social, and the aesthetic; the architect and urban planner is equal parts humanist (who deploys aesthetic sensibilities in the design work) and sensitive reader of the practical social, political, and economic contexts within which he or she necessarily operates. Enlightened city planning offices welcome colleagues with such sensibilities.

Foreign service and diplomacy. Never before has there been a more urgent need for skilled readers of cultural difference. A sensitive humanistic understanding of other cultures, acquired above all through the rigorous study of foreign languages (and literatures), will be indispensable in coming to terms with such developments as the encounter of Islam and the European and Europeanized worlds. The repercussions for so practical a consideration as American national security are obvious, and one can imagine many outlets for such skills in government service.

Various modes of public discourse (or “writing in action,” as my former Tulane colleague Molly Rothenberg has termed it). By this I mean the effective use of language in the public arena, such as journalism (both print and broadcast, and, increasingly, digital) or television and motion-picture screenwriting. But it could also be extended to embrace advertising (increasingly web-based, which entails yet another humanistic skill, the aesthetic sense required in the visual and aural material that now invariably complements text); web-page design (which, once more, will entail a fusion of the visual, aural, and textual); and related enterprises. The humanist’s command of the aesthetic complexities of text and language, visual image, and aural material, and of their simultaneous deployment, will be indispensable. Indeed, the digital technologies of the 20th and 21st centuries are so powerful, and the full reach of the transition currently under way so difficult to apprehend, that one can only speculate as to what shape human communication will take when the shift to a new paradigm is more or less complete. (In fact, humanistic sensibilities may prove to have a salutary, tempering influence on the effects of digital technologies.) The skillful fusion of still and moving images, aural material, and text will determine the effectiveness of MOOCs, which will depend as much on humanistic skills as on scientific and technical ones.

Rhetoric and oratory. This element is related to the previous one. The electronic information technologies that emerged beginning with the invention of the telegraph in the 19th century have a characteristic that makes them unlike manuscript copying and print: they “dematerialize” information and make it possible for it to be disseminated with lightning speed across vast distances. And the invention of radio, film, and television added the elements of the aural and moving visual to those that had characterized the medium of print (and manuscript copying before it): written text and still image. These newer technologies replicate “live” human experience much more closely than print, which freezes discourse and alters its character. As a technology, print (and the media associated with it) has been giving way to electronic technologies, with their capacity for the full integration of written and spoken language, still and moving image, and sound (music and other aural material), and for the dematerialization and dissemination of such information. The implication for colleges and universities is as follows: we have invested admirably in initiatives designed to train our students to write well and read texts critically and perceptively. But given the power of the new technologies, there is a case to be made for a return to greater instruction in rhetoric and oratory, to an equal command of the spoken word, which can be captured on audio- or videotape or broadcast over radio, television, and the computer (via Skype), in a guise that print has never demanded. The development of electronic communication technologies that permit us to communicate extemporaneously over vast distances in a conversational tone and manner suggests that we might well retool our educational system to feature once again the time-honored humanistic practice of effective oratory and refine our students’ facility in the spoken word.

One need only consider the example of Barack Obama’s skilled oratory (or Franklin Roosevelt’s, or Ronald Reagan’s, or John Kennedy’s) to appreciate the importance to the political order of a venerable humanistic skill like oratory; notably, these are political figures whose careers postdate the development of electronic technologies. Columnist George F. Will has observed that the American presidency is “an office whose constitutional powers are weak but whose rhetorical potential is great.”

By no means do the new electronic information technologies obviate the need for continuing skill in other, more traditional and familiar humanistic modes of communication – the kind of careful, comprehensive, subtle argument that written text affords – and the close, sensitive reading and command of existing texts that inform the authorship of new texts. Henry Riecken suggested that “[t]he text of the Federalist Papers was put into machine-readable form in order to carry out an analysis that resolved questions of disputed authorship of some of the papers; but the new format did not replace the bound volumes for readers who want to absorb the thoughts and reflect on the aspirations of this stately document.”

Art conservation, and its relationship to the political economy. Nations with an exceptional legacy of monuments in the visual arts (Italy being a well-known example) face a particular challenge with respect to maintaining the condition of that legacy. And in Italy’s case, the relationship of the condition of that legacy to the economy is obvious: given the central place of tourism in the Italian economy, it is vital that the nation’s artistic patrimony be satisfactorily conserved. Sensitive art conservation sits at the intersection of the humanistic (the aesthetic), the scientific and technological (an understanding of the nature of surfactants and the effects of environmental conditions), and the political-economic (the need to balance the claims of conserving the artistic patrimony acceptably against other claims on public resources).

What is interesting about this list is how closely its elements are aligned with the Italian Renaissance humanist’s earlier construction of the studia humanitatis. The kind of ethical reasoning demanded in successful practice of the medical humanities is, in its way, a modern iteration of the Renaissance humanist’s moral philosophy; 21st-century applications of writing, rhetoric, and oratory are, in their way, contemporary versions of the Renaissance humanist’s grammar, poetry, and rhetoric; the understanding of foreign cultures and languages required for effective foreign service in today’s bewilderingly complex and interdependent world is, in its way, the modern expression of the Renaissance humanist’s practice of history. The foundational elements of the core humanistic program have perhaps not changed so very much.

What is different is the explicitness with which the Renaissance humanists advocated – persuasively, compellingly, successfully – for the professional utility of their disciplines, which permitted them to secure a place of considerable prestige and authority in their world. There is warrant for their 21st-century successors’ advancing a similar argument: that one undertake the study and practice of the humanistic disciplines not only within the confines of the academic world (as intrinsically worthwhile, in a fundamental intellectual sense) but outside them as well (as critical to the successful execution of one’s expressly professional and vocational responsibilities).

Specifically, I propose that we self-consciously reframe the presentation and delivery of the humanistic offerings of the modern-day college and university to make much more explicit reference to their potential applicability: that we foreground this kind of argument for their virtues. Some of what is now being done within the university is being done absentmindedly, so to speak, without a sufficiently self-conscious articulation of why we do what we do. Were we to reframe our offerings in this way – reposition the humanities and articulate their virtues differently – we might find that the national trend away from them could be halted and perhaps even reversed. 

My sense is that many students rather naturally hunger for the humanistic disciplines and are driven to make other curricular choices in part because of concerns about career viability.  Were such concerns addressed – legitimately, effectively, persuasively – we might find some such students willing to study what their hearts prompt them to study. In our curriculums, were we to foreground explicit, purposeful reference to the ways in which the humanities are indispensable to the successful practice of some of the esteemed and rewarding professions identified above (rewarding in several senses of that word), we might succeed in alleviating student (and parental) anxiety about the practicality of studying such supposedly “impractical” subjects.

Only by such means, I believe, will the humanities truly be able to re-secure the place they once enjoyed, and still deserve, in the collective cultural imagination and in the great public arena. And by no means should we be hesitant about advancing such an argument, since we have the example of the Italian Renaissance before us: it would be difficult to argue that energetic advocacy on grounds of vocational viability compromised the artistic and intellectual integrity of the achievements of Petrarch and his venerated successors.
 

Anthony M. Cummings is professor of music and coordinator of Italian studies (and former provost and dean of the faculty) at Lafayette College.

Essay on J. Redding Ware, philologist of slang and detective-fiction pioneer

Very seldom in writing about scholarly publishing have I had an occasion to use the word “fun”  -- actually, this may be the first time -- but with a couple of recent titles, nothing else will do.

They are not frivolous books by any means. Sober and learned reviews by experts may appear in specialist journals two or three years from now, and they will be instructive. But the books in question should generate some interest outside the field of J. Redding Ware studies.

Nobody who appreciates the history and color of the English language can fail to enjoy the University of Oxford Bodleian Library’s new facsimile edition of Ware’s masterpiece, the invaluable Passing English of the Victorian Era: A Dictionary of Heterodox English, Slang, and Phrase. First published in 1909, shortly after Ware’s death, it is now available as Ware’s Victorian Dictionary of Slang and Phrase -- a title that is more marketable, perhaps, but decidedly wanting in precision. Most of the lingo it covers is Victorian, but the dictionary itself, appearing as it did in the final month of the king’s life, is Edwardian. (A pedant’s work is never done.) The new edition contains a valuable introduction by John Simpson, who retires this month as chief editor of the Oxford English Dictionary.

It covers almost everything currently known about Ware (i.e., not much) and assesses his contribution to lexicography, which was considerable. Ware’s dictionary is cited in the OED more than 150 times, including “almost 50 times for the first recorded evidence for a word or meaning.” My earlier reference to Ware studies was a bit of a stretch, since nobody has ever written a book about him, nor a dissertation -- nor even, it seems, a journal article devoted solely to his life or works. A couple of papers by Joyce scholars identify his dictionary as a source the novelist consulted while writing Ulysses. Simpson’s introduction is the landmark in a nearly barren field.

Ware was born in 1832 and his first name was James, after his father, who was a grocer. In his teens the future author served a short jail sentence following a violent family quarrel, during which he grabbed a bacon knife and threatened to kill James the elder. He worked as a mercantile clerk while writing his first novel, published in 1860. From that point on, Ware seems to have eked out a hardscrabble existence along the lines George Gissing depicts in New Grub Street, cranking out fiction, journalism, and quite a few plays, along with a handbook on playing whist and a tourist’s guide to the Isle of Wight.

Simpson unfortunately neglects to mention one other documented occasion when Ware went to court, seeking relief from a downstairs neighbor who played the piano at all hours. The outcome remains unclear but it seems no bacon-knife was involved.

Not quite a gentleman, then, nor by profession a scholar. Ware’s dictionary was commissioned by Routledge as a supplement to another volume on colloquial English. “His lexicographical method is arguably modern,” Simpson notes, “as he based his selection largely on printed evidence that he (and, one imagines, other interested parties) had collected principally from newspapers and other ephemeral sources… The latest quotations date from 1902, though the majority date from the 1880s and 1890s.”

No don would have come up with what Simpson calls Ware’s “idiosyncratic labeling system,” which identifies the provenance of a given piece of slang through categories such as “Slums,” “Italian Organ-Grinders,” “Music Hall,” or “Colloquial Imbecile.” He clearly spent a good bit of time hanging around theaters and pubs, and “was painfully aware of changes in hairstyles and fashion generally over the decades, and can with the help of his printed evidence place the introduction of a new ‘look’ to a precise year.”

Some of the expressions Ware includes have passed into accepted use. He identifies opportunism as a piece of political slang from the 1860s, explaining: “Used rather in contempt, as subserving conscience to convenience, or to personal advantage.” It turns out that flappers -- young women of an immodest sort -- were on the scene well before the 1920s. And while Susan Sontag didn’t mention it when she wrote her notes on the subject, Ware identified camp as an adjective applying to “actions and gestures of exaggerated emphasis,” noting that it was “used chiefly by persons of exceptional want of character.” By that Ware undoubtedly means sissies, “effeminate men in society,” a term he indicates (citing an American newspaper) caught on in the 1890s.

I would have assumed the slang word narc -- pertaining to an informer, and used as both noun and verb -- derived from narcotics. Apparently not: copper’s nark is defined as thieves’ argot, also from the 1890s, meaning “a policeman’s civilian spy.” Ware indicates that police were called coppers from 1868 on, and you’d have found a copper-slosher (“individual with the mania for ‘going for’ policemen”) hanging around a copper’s shanty, as the station house was known. A working-class person thought to be blustering risked the taunt “Copper! Copper!” – implying that he was “giving himself the airs of police authority.”

But where did cop itself come from? “There has been more discussion over this widely applied word than any other in the kingdom of phrase,” writes Ware in one of his longer entries. It is incredibly polysemic, meaning “taken, seized, thrashed, struck, caught by disease, well-scolded, discovered in cheating,” and could also be used as a verb meaning “to take too much to drink” (hence copping the brewery). Thunderous applause for an especially good show at a music hall would cop the curtain, so that the performer could take a bow.

The vast majority of Ware’s 4,000 entries define expressions that vanished without a trace. Hence his original title: "passing English" comes and goes. It's the vigor of language that drives the vitality of the book.

One craze of the 1880s was corpse-worship – “the extreme use of flowers at funerals” – which got so bad that by the ‘90s “many death notices in the press were followed by the legend ‘No flowers.’ ” Slang words often come from the contraction of common expressions; for example, damirish, “damned Irish,” and damfino, “I am damned if I know.”

Nobody still describes an egg gone bad as suitable for electioneering purposes (derived from “the exercise of projecting them at antagonistic candidates”) and the culture is all the poorer for it. Then again, it's a relief that suggestionize -- an old bit of legal jargon meaning “to prompt,” as with a witness – never caught on. Now if we could just euthanize “finalize.”

Decades before the birth of Jerry Garcia, deadhead was the entertainment-industry label for an audience member who got in without buying a ticket. It applied to critics and “‘theatrical people’… [who] never pay to enter a theatre.” Ware, as a playwright, resented them, and the dictionary vents his frustration in an amusing manner:

“The experienced eye can always divide the deadheads from the ‘plank-downers’ in a theatre. The deadheads are always dressed badly, and give themselves airs when looking at the inferior parts of the house. The plank-downers never give themselves airs, mean business, and only look at the stage. Deadheads are very emphatically described by a theatrical official: ‘Here come two more deadheads; look at their boots.’”

Many entries are no doubt the only record of a term or catchphrase, and in some cases the lexicographer can just guess what they originally signified. Who stole the goose? is an “interjection of contempt, which appears to have some hidden meaning, probably of an erotic nature.” In the case of Who took it out of you?, Ware doesn’t even try. The meaning is “wholly unknown to people not absolutely of lower class.”

Of comparable subaltern origins, but easier to understand, is the slang term label for sausage: bags o’ mystery. That one should come back into circulation. Its use could be extended to the hot dog.

Speaking of mystery, another book recently reissued in facsimile is Andrew Forrester, Jr.’s The Female Detective, a collection of short fiction from 1864, reprinted by the British Library and distributed in the U.S. by the University of Chicago Press. A digital edition is available from Amazon.

You can find the book online in PDF for free -- which is also true of Ware’s dictionary, although Simpson’s introduction is not to be missed. With The Female Detective, the new material consists of a foreword by Alexander McCall Smith (best known for his No. 1 Ladies’ Detective Agency novels, but also the author of What W.H. Auden Can Do For You, just out from Princeton University Press) and an introduction by Mike Ashley, whose books include a biography of Algernon Blackwood.

The Female Detective is offered as a collection of reports from the files of the elusive Miss G----- (she is nothing if not discreet) as edited by Forrester. The detective’s “casebook” was a very popular genre at the time, part of the “railroad literature” that sprang up to meet the demand of commuters. Forrester later wrote at least two more such collections, but his place in the history of the genre comes from having created Miss G----- (a.k.a. Mrs. Gladden), the first female professional detective in fiction. Kathleen Gregory Klein devoted several pages to the book in The Woman Detective: Gender & Genre (University of Illinois Press, 1988), and puzzlement over Forrester’s identity – was it a pseudonym? – has echoed down the scholarship ever since.

It now seems very likely that the author was, in fact, J. Redding Ware. Simpson accepts it as credible in his introduction to the dictionary, as does Mike Ashley in the opening pages of the short-story collection. The identification was proposed by Kate Summerscale in the notes to her nonfiction novel The Suspicions of Mr. Whicher: A Shocking Murder and the Undoing of a Great Victorian Detective (2008). In researching her book, Summerscale noticed that Ware published a pamphlet about the crime in question: a child murder that occurred in a country house in 1860. (The circumstances are oddly reminiscent of the JonBenet Ramsey case.) He seems to have incorporated the text into a chapter of The Female Detective.

Mrs. Gladden had keen powers of observation and deduction, and a reader can’t help thinking of the much better-remembered private eye who came on the scene later in Victoria’s reign. Ware must have felt that Arthur Conan Doyle had stolen his thunder – though that seems like a rather peculiar phrase, come to think of it.

Ware lists it in his dictionary, explaining that it means “annexing another man’s idea, or work, without remunerating him, and to your own advantage.” It was first used, he writes, by one John Dennis, “a play-writer of the 17th century, who invented stage thunder for a piece of his own which failed.” The theater manager incorporated the technique in the production of someone else’s play, prompting the enraged Dennis to yell out, “They won’t act my piece, but they steal my thunder.”

I hope J. Redding Ware studies comes into its own, or at least that others discover him. How often is it worth reading a dictionary just for the stories?


Essay on building audience for the humanities

It’s been a few weeks since "The Heart of the Matter," the congressionally ordered report on the state of the humanities and social sciences, was issued by the American Academy of Arts and Sciences. And while there is little to object to in the actual text, which brims with veracities on the importance of education and good citizenship, the smarting hasn’t stopped.

In The New York Times alone, we heard from three pundocratic naysayers. David Brooks, a member of the commission, bemoaned the collective suicide of the humanities professoriate, Verlyn Klinkenborg lamented the decline and fall of the English major, and Stanley Fish excoriated the report itself for its "bland commonplaces" and "recommendations that could bear fruit only in a Utopia" (and as a Miltonist, he knows that’s not the world we live in).

There have been valiant attempts to parry this latest volley of gloom regarding the "crisis of the humanities."

But even if “our classrooms remain packed,” that still leaves the population at large. And there, a basic contention remains unanswered: the by-now self-evident truth that the humanities are a “public relations failure” (Fish ventriloquizing Klinkenborg). Humanists are just no good at explaining why their work matters. No wonder, then, that the public has little use for their efforts.

The marketing problem is such a commonplace in the debate, it goes virtually unremarked these days. But to me, it raises a very basic question: why is it the job of a humanist to be her own advertiser? Other creative types have managers, agents, and publicists for that task, to say nothing of film studios, theaters, galleries, and museums – infrastructures, in other words, that allow them to focus on what they do best.

What would such an institution look like for humanists? As it happens, I help lead one. The Chicago Humanities Festival, which I serve as artistic director, is the largest organization of its kind in the United States. In fact, we were name-checked as such in "The Heart of the Matter," praised for our success in "inviting academics and artists to share their passions and expertise with new audiences."

We’re in our 24th year of doing so. Our annual fall festival features about 100 events, always organized around a theme. Attendance is around 50,000, with hundreds of thousands more consuming our content digitally. Not your average public relations failure, we like to think.

But how do we do it? How do we get thousands of people to come out to hear humanities professors lecture?

It’s really quite simple: we treat our presenters as stars and our events as performances.

It’s probably best to think about this mode of operation in contrast to your typical university programming. Sure, those lectures are free and open to the public. But how would that public even know? Dreary leaflets on campus bulletin boards aren’t likely to draw general audiences. Nor would such crowds feel particularly engaged by often dry and specialized presentations. (I should hasten to add that in my other life as a university professor, I can get quite excited about an earnest announcement circulating on a listserv; but then again, I’m already sold on the product.)

Everything we do at the Chicago Humanities Festival is designed to break those kinds of barriers. We write tantalizing copy for our events, place our ads in all the local media, and hustle for coverage in those same publications.

And we make sure that our featured talent appears in the best light possible. We think about stage sets and backdrops and fuss endlessly with our sound systems. Even more importantly, we spend a huge amount of time thinking about the best way to showcase a speaker. Is the great Harvard historian we invited known to go on and on when lecturing? No problem, we feature him in conversation with that gentle, yet firm journalist who has a deep passion for his subject. Is there worry about the density of a speech on continental philosophy? We coach the presenter in the joys of multi-media.

When it all works out, we get big, expectant audiences who, after basking in the erudition of our speakers (and asking sometimes stunningly insightful questions), can’t wait to come back for more.

This isn’t so different from the way other cultural organizations operate. Let me take an example from the world of opera, another domain whose utility could seem suspect in the glare of neoliberal scrutiny. When soprano Anna Netrebko was scheduled to make her Chicago debut in the 2012-13 season, it brought shivers of anticipation to local opera fanatics. But for a good chunk of Lyric Opera’s patrons, she was just a Russian-sounding name. It was the task of the company to create the appropriate excitement for her bow as Mimì, to say nothing of providing her with all the theatrical props needed for an optimal performance. The result: a sold-out run and something approaching collective hysteria (of the good kind).

Why should it be so different for humanists? Sure, in the academy we recognize folks like Julia Kristeva, Frans de Waal, and Maria Tatar as stars (to mention three of the speakers who will join us for this fall’s festival on Animal: What Makes Us Human). But just like your average opera diva, they need a bit of professional marketing and a decent production to find and captivate their audience. As a performing organization in the humanities, that’s our job. And after 24 years, we know how to do it.

Imitators wanted!

Matti Bunzl is artistic director of the Chicago Humanities Festival and professor of anthropology at the University of Illinois. 


Essay defending the way creative writing is taught

Creative writing has its share of detractors, those who believe that the study and teaching of creative writing produce deleterious effects for students and for literature. For example, in “Poetry and Ambition,” Donald Hall worries that invention exercises (writing warm-ups that help writers find their subject) in writing classrooms “reduce poetry to a parlor game,” producing “McPoems” on assembly lines. “Abolish the M.F.A.!” states Hall with an exclamation point, and then, echoing Cato’s Latin refrain, he cries, “The Iowa Writers’ Workshop must be destroyed.”

Hall’s essay reflects a particularly unproductive strand of criticism aimed at creative writing that has arisen of late. Anis Shivani is perhaps its most recent practitioner, with a soapbox on which to stand, but certainly not the only detractor. Indeed, when Shivani’s critiques of creative writing programs emerged, preceding the publication of his book Against the Workshop: Provocations, Polemics, Controversies (2011), many of us were approached by creative writing colleagues wanting to know what we thought of this brash new critique of the workshop. What they didn’t understand was that his assertions weren’t new at all, but by boldly ignoring the scholarly conversations that had been going on about this subject for many years — a fact made plain by what is missing in the book, specifically in the works cited — Shivani was able to create a scholarly stance that acted as if they were.

In fact, what Shivani, Hall and others practice is better termed a "new old criticism" — new because it proliferates in electronic social media, old because it rehashes simplistic assertions that have been around for decades. This argument has three major problems: First, its rhetorical stance is far more appropriate for talk radio than for a serious scholarly or public debate; its practitioners refuse to engage with the actual arguments of those with whom they disagree. Second, this new old criticism is rooted in dated and limited assumptions about what creative writing is and can be.

In reading this criticism, one can see that it is aimed at the Iowa Writers’ Workshops as they are said to have existed in the 1950s; there is no admission of the diversification and complexity of creative writing that has flourished in the decades since then. Third, this new old criticism is stuck within a narcissistic worldview. It perceives an age-old challenge — the difficulty of writers finding readers in a world where print technology proliferates — as a personal affront. The new old criticism drapes itself in a narrow banner of great art, adopting the hubristic stance that a writer can actually know with certainty that he or she is producing such great art in the moment of creating it.

So should everyone associated with creative writing programs pack up and shut our doors? This isn’t going to happen. The horse is out of the barn. Creative writing classes are more popular than ever, in part because they offer not just a means of expression but an alternative to theory-laden literary analysis.

The real questions are: How can we best design our curriculums and our classes to best serve the needs of our students? What can we do to ensure that creative writing — the teachers, the students, the courses, the programs — has a positive impact on contemporary literature? Which aspects of creative writing — the writing itself as a process and as a product of our efforts — can be taught, and what are the best practices for such teaching? How does creative writing fit into English departments, into the liberal arts generally, and into the colleges and universities where it is housed? Finally, in our breathtakingly tight economy, what kinds of careers and lives are creative writing students being prepared for?

Given its critical mass, creative writing stands as a knowledge-based discipline. Rather than associating knowledge with certainty, as traditional academic models often do, creative writing locates knowledge in the discovery that takes the writer beyond the routines and in the questions that arise and are answered through the writing process. The study of writing through reading and writing is the methodology we use; this mode of knowledge acquisition leads to new conclusions: knowledge through practice, through doing, through thinking about and talking about what we’re doing. Consider, too, that the “flipped” classroom, in which students absorb lectures online outside of class and come to class to work hands-on with the material, has become the latest trend in college teaching.

By engaging students in hands-on work on their own writing and that of others, the oft-maligned “workshop,” which has evolved over the years to suit varying constituencies (undergraduates, graduate students, general education students, and majors), has modeled a “flipped” classroom almost since its inception. This conversation about creative writing also speaks to what has recently become known as the crisis in the humanities. Helene Moglen, in the latest issue of the Modern Language Association’s Profession, gives a convincing overview of a crisis that goes back to the 1980s, with the report called “A Nation at Risk.” Among the causes of the crisis in the humanities that Moglen identifies are “internal disagreements about the appropriate development of our disciplines” and “prevalent social attitudes toward education that assume irrelevance of humanistic study.”

David Fenza, of the Association of Writers and Writing Programs, points out, in his history of creative writing in higher education, that "creative writing classes have become among the most popular classes in the humanities." To meet demand, creative writing programs have at least tripled in number in the last 30 years, and many of us are housed in English departments. If the humanities are in crisis, creative writing is not. In fact, creative writing is not only healthy within the academy but has relevance beyond it. Contemporary literature, after all, is written by creative writers, whether or not they have earned an M.F.A. This relevance offers an example for other disciplines, a way to resist prevalent social attitudes that overlook the value and contributions of the arts and humanities to our culture and our daily lives.

Finally, many creative writing programs have recognized that while most students won’t necessarily go on to become the next Jonathan Franzen (just like most violin students won’t play with the National Symphony, or sculpture students exhibit their work at the Hirshhorn), they do want to work in creative industries. A survey of the curriculums of many of these programs, which usually offer courses not only in creativity and craft but also in new media, editing and publishing, reveals that they prepare students to do just that.

The sniping about what’s appropriate for our discipline or whether creative writing should even be an academic discipline emerges, however, from within our ranks. Hall has taught workshops and visited creative writing programs to read his work, and Shivani is a creative writer as well as a critic. Airing our internal disagreements and pitting writers against each other — outside or within the academy — does few of us any good and invites a sense of crisis in creative writing when there isn’t one. Let’s do our research. Let’s have productive conversations.

People who shoot the occasional salvo at creative writing aren’t really interested in taking part in a conversation, but we are, and we invite others in our midst to join us in this ongoing conversation about our discipline. This discussion can shape the healthy development of creative writing, position us positively within academe, and shift social attitudes toward a better future for literature and learning.

Tim Mayers is author of (Re) Writing Craft: Composition, Creative Writing and the Future of English Studies. He teaches at Millersville University of Pennsylvania.

Dianne Donnelly is the author of Establishing Creative Writing Studies as an Academic Discipline, editor of Does the Writing Workshop Still Work? and co-editor of Key Issues in Creative Writing.

Tom Hunley is an associate professor at Western Kentucky University. His books include The Poetry Gymnasium, Teaching Poetry Writing, and Octopus.

Anna Leahy is the author of Constituents of Matter. She edited Power and Identity in the Creative Writing Classroom. She teaches at Chapman University.

Stephanie Vanderslice is professor of writing and director of the Arkansas Writers M.F.A. Workshop at the University of Central Arkansas.

 


New Academy of Arts and Sciences report stresses importance of humanities and social sciences


Amid talk of outcomes-based education, a new report from the Commission on the Humanities and Social Sciences stresses the disciplines' role in long-term career success and international competitiveness.

Review of Ted Anton, 'The Longevity Seekers: Science, Business, and the Fountain of Youth'

Standing in line at the drugstore a couple of weeks ago, I spied on the magazine rack nearby this month’s issue of National Geographic – conspicuous as one of the few titles without a celebrity on the cover. Instead it showed a photograph of an infant beneath a headline saying "This Baby Will Live to Be 120."

The editors must have expected disbelief, because there was a footnote to the headline insisting that the claim was not hype: "New science could lead to very long lives." When was the last time you saw a footnote in a popular periodical, on the cover, no less? It seemed worth a look, particularly after the septuagenarian in front of me had opened complex, in-depth negotiations with the pharmacist.

The headline, one learns from a comment on the table of contents, alludes to a traditional Jewish birthday wish or blessing: "May you live to be 120." This was the age that Moses was said to have reached when he died. The same figure appears -- not so coincidentally perhaps – at an important moment in the book of Genesis. Before sending the Flood, Jehovah announces that man’s lifespan will henceforth peak at 120 years. (I take it there was a grandfather clause for Noah. When the waters recede, he lives another 350 years.)

The cap on longevity, like the deluge itself, is ultimately mankind’s own fault, given our tendency to impose too much on the Almighty’s patience and good humor. He declares in about so many words that there is a limit to how much He must endure from any single one of us. Various translations make the point more or less forcefully, but that’s the gist of it. Even 120 years proved too generous an offer – one quietly retracted later, it seems. Hence the Psalmist’s lament:

“The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labor and sorrow; for it is soon cut off, and we fly away.”

Nursing homes are full of people who passed the fourscore marker a while ago. If you visit such places very often, as I have lately, “May you live to be 120” probably sounds more like a curse than a blessing. Not even a funeral obliges more awareness of mortal frailty. There is more to life than staving off death. The prospect of being stranded somewhere in between for 30 or 40 years is enough to make an atheist believe in hell.

Meanwhile, in science…. The medical and biological research surveyed in that NatGeo article promises to do more than drag out the flesh’s “labor and sorrow” a lot longer. The baby on the magazine cover will live his or her allotted span of six score years with an alert mind, in a reasonably healthy body. Our genetic inheritance plays a huge but not absolutely determinate role in how long we live. In the wake of the mapping of the genome, it could be possible to tinker with the mechanisms that accelerate or delay the aging process. It may not be the elixir of youth, but close enough.

Besides treating the same research in greater depth, Ted Anton’s The Longevity Seekers: Science, Business, and the Fountain of Youth (University of Chicago Press) emphasizes how profound a change longevity research has already wrought. It means no longer taking for granted the status of aging as an inescapable, biologically hardwired, and fundamentally irreversible process of general decline. Challenging the stereotypes and prejudices about the elderly has been a difficult process, but longevity engineering would transform the whole terrain of what aging itself entails.

Anton, a professor of English at DePaul University, tells the story in two grand phases. The first bears some resemblance to James Watson’s memoir The Double Helix, which recounts the twists and turns of laboratory research in the struggle to determine the structure of DNA – work for which he and Francis Crick received a Nobel Prize in medicine in 1962. Watson’s book is particularly memorable for revealing science as an enterprise in which personalities and ambitions clash as much as theories ever do. (And with far more rancor, as Watson himself demonstrated in the book’s vicious and petty treatment of Rosalind Franklin, a crystallographer whose contribution he downplayed as much as possible.)

A practitioner of long-form journalism rather than a longevity researcher, Anton writes about conflicts in the field with some detachment, even while remaining aware that the discoveries may change life in ways we can’t yet picture. The initial phase of the research he describes consisted largely of experiments with yeast cells and microscopic worms conducted in the 1990s. Both are short-lived, meaning that the impact of biochemical adjustments to their genetic “thermostats” for longevity would register quickly.

During the second phase of Anton’s narrative, lab research involved more complex organisms. But that was not the most important development. The public began hearing news flashes that scientists had discovered that the key to a longer life was, say, restricted caloric intake, or a chemical called resveratrol found in red wine. Findings presented in scientific journals were reported on morning news programs, or endorsed on Oprah, within days or even hours of publication. Hypotheses became hype overnight.

This generated enthusiasm (more for drinking red wine than restricting calories, if memory serves) as well as additional confidence that biotechnological breakthroughs were on the way. Everybody in longevity research, or almost everybody, started a company and ran around looking for venture capital. Models, evidence, and ideas turned into proprietary information -- with the hurry to get one’s findings into professional journals looking more and more like the rush to issue a press release.

So far, no pharmaceutical has arrived on the market that boosts our lifespans as dramatically as those of the yeast cells and worms in the laboratory. “The dustbin of medical breakthroughs,” Anton reminds us, “bears the label ‘It Worked in Mice.’ ” On the other hand, the research has been a boon to the cosmetics industry.

As it is, we’re nowhere near ready to deal with the cumulative effect of all the life-extending medical developments from the past few decades. The number of centenarians in the world “is expected to increase tenfold between 2010 and 2050,” the author notes, “and the number of older poor, the majority of them women,” is predicted “to go from 342 million today to 1.2 billion by that same year.”

But progress is ruthless about doing things on its own terms. Biotech is still in its infancy, and its future course -- much less its side effects -- is beyond imagining. The baby on the magazine cover might well live to see the first centenarian win an Olympic medal. I wish that prospect were more cheering than it is.


Review of Matthew L. Jockers, 'Macroanalysis: Digital Methods & Literary History'

“A poem,” wrote William Carlos Williams toward the end of World War II, “is a small (or large) machine of words.” I’ve long wondered if the good doctor -- Williams was a general practitioner in New Jersey who did much of his writing between appointments – might have come up with this definition out of weariness with the flesh and all its frailties. Traditional metaphors about “organic” literary form usually imply a healthy and developing organism, not one infirm and prone to messes. The poetic mechanism is, in Williams’s vision, “pruned to a perfect economy,” and there is “nothing sentimental about a machine.”

Built for efficiency, built to last. The image this evoked 70 years ago was probably that of an engine, clock, or typewriter. Today it’s more likely to be something with printed circuits. And a lot of poems in literary magazines now seem true to form in that respect: The reader has little idea how they work or what they do, but the circuitry looks intricate, and one assumes it is to some purpose.

I had much the same response to the literary scholarship Matthew L. Jockers describes and practices in Macroanalysis: Digital Methods & Literary History (University of Illinois Press). Jockers is an assistant professor of English at the University of Nebraska at Lincoln. The literary material he handles is prose fiction -- mostly British, Irish, and American novels of the 18th and 19th centuries -- rather than poetry, although some critics apply the word “poem” to any literary artifact. In the approach Jockers calls “macroanalysis,” the anti-sentimental and technophile attitude toward literature defines how scholars understand the literary field, rather than how authors imagine it. The effect, in either case, is both tough-minded and enigmatic.

Following Franco Moretti’s program for extending literary history beyond the terrain defined by the relatively small number of works that remain in print over the decades and centuries, Macroanalysis describes “how a new method of studying large collections of digital material can help us to understand and contextualize the individual works within those collections.”

Instead of using computer-based tools to annotate or otherwise explore a single work or author, Jockers looks for verbal patterns across very large reservoirs of text, including novels that have long since been forgotten. The author notes that only “2.3 percent of the books published in the U.S. between 1927 and 1946 are still in print” (even that figure sounds high, and may be inflated by the recent efforts of shady print-on-demand “publishers” playing fast and loose with copyright) while the most expansive list of canonical 19th-century British novels would represent well under 1 percent of those published.

Collections such as the Internet Archive and HathiTrust Digital Library make enormous bodies of such text available for analysis. Add to this the capacity to analyze the metadata about when and where the books were published, as well as available information on the authors, and you have a new, turbocharged sort of philology -- one covering wider swaths of literature than even the most diligent and asocial researcher could ever read.

Or would ever want to, for that matter. Whole careers have been built on rescuing “unjustly neglected” authors, of course, but oblivion is sometimes the rightful outcome of history and a mercy for everyone involved. At the same time, the accumulation of long-unread books is something like a literary equivalent of the kitchen middens that archeologists occasionally dig up – the communal dumps, full of leftovers and garbage and broken or outdated household items. The composition of what’s been discarded and the various strata of it reveal aspects of everyday life of long ago.

Jockers uses his digital tools to analyze novels by, essentially, crunching them -- determining what words appear in each book, tabulating the frequency with which they are used, likewise quantifying the punctuation marks, and working out patterns among the results according to the novel’s subgenre or publication date, or biographical data about the author such as gender, nationality, and regional origin.
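To make those mechanics concrete, here is a minimal sketch, in Python, of the kind of feature extraction such text-crunching begins with: counting relative word and punctuation frequencies per novel and pairing them with metadata. This is an illustration of the general idea, not Jockers's own code (the book does not reproduce his tools), and the tiny corpus shown is hypothetical.

```python
import re
from collections import Counter

def extract_features(text):
    """Tabulate relative word and punctuation frequencies for one novel."""
    words = re.findall(r"[a-z']+", text.lower())
    puncts = re.findall(r"[.,;:!?]", text)
    total = len(words) or 1  # avoid division by zero on empty input
    counts = Counter(words) + Counter(puncts)
    # Normalize so that novels of very different lengths can be compared.
    return {feature: n / total for feature, n in counts.items()}

# Hypothetical corpus: (metadata, text) pairs standing in for whole novels.
corpus = [
    ({"genre": "Bildungsroman", "decade": 1850, "gender": "F"},
     "The little town seemed so young, so like her own restless heart."),
    ({"genre": "Gothic", "decade": 1790, "gender": "M"},
     "Under the ruined tower, within the vault, a passage lay hidden."),
]

features = [(meta, extract_features(text)) for meta, text in corpus]
```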

The findings that the author reports tend to be of a very precise and delimited sort. The words like, young, and little “are overrepresented in Bildungsroman novels compared to the other genres in the test data.” There is a “high incidence of locative prepositions” (over, under, within, etc.) in Gothic fiction, which may be “a direct result of the genre’s being ‘place oriented.’” That sounds credible, since Gothic characters tend to find themselves moving around in dark rooms within ruined castles with secret passageways and whatnot.
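The notion of “overrepresented” can itself be sketched simply, continuing the hypothetical example above: compare a word’s mean relative frequency inside a genre with its mean frequency everywhere else. (Jockers’s actual tests are more statistically careful than this toy ratio.)

```python
def overrepresentation(features, genre, word):
    """Ratio of a word's mean relative frequency inside a genre
    to its mean relative frequency in the rest of the corpus."""
    inside = [f.get(word, 0.0) for meta, f in features if meta["genre"] == genre]
    outside = [f.get(word, 0.0) for meta, f in features if meta["genre"] != genre]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(inside) / (mean(outside) or 1e-9)

# Ratios well above 1.0 flag words such as "little" in the Bildungsroman
# subset, or locative prepositions such as "within" in the Gothic subset.
print(overrepresentation(features, "Bildungsroman", "little"))
print(overrepresentation(features, "Gothic", "within"))
```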

After about 1900, Irish-American authors west of the Mississippi began writing more fiction than their relations on the other side of the river, despite being fewer and thinner on the ground. Irish-American literature is Jockers’s specialty, and so this statistically demonstrable trend proves of interest given that “the history of Irish-American literature has had a decidedly eastern bias…. Such neglect is surprising given the critical attention that the Irish in the West have received from American and Irish historians.”

As the familiar refrain goes: More research is needed.

Macroanalysis is really a showcase for the range and the potential of what the author calls “big data” literary study, more than it is a report on its discoveries. And his larger claim for this broad-sweep combination of lexometric and demographic correlation-hunting – what Moretti calls “distant reading” -- is that it can help frame new questions about style, thematics, and influence that can be pursued through more traditional varieties of close reading.

And he’s probably right about that, particularly if the toolkit includes methods for identifying and comparing semantic and narrative elements across huge quantities of text. (Or rather, when it includes them, since that’s undoubtedly a matter of time.)

Text-crunching methodologies offer the possibility of establishing verifiable, quantifiable, exact results in a field where, otherwise, everything is interpretive, hence interminably disputable. This sounds either promising or menacing. What will be more interesting, if we ever get it, is technology that can recognize and understand a metaphor and follow its implications beyond the simplest level of analogy -- a device capable of, say, reading Williams’s line about the poem as machine and then suggesting something interesting about it, or formulating a question about what it means.


Essay on how to keep humanities vibrant by rejecting elite universities' models

In "Howl," a blistering poetical rant and perhaps the most important poem of the 60’s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.

But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.

The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.

The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.

The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flush with this success, the project of knowledge production -- and the 19th-century scientific methodology that was its seed -- spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology and the like -- grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th-century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.

As the natural and social sciences divided the world between themselves, the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby to forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.

The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.

In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for the humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.

At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.

Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.

The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.

Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward-looking -- humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.

When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.

The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits’ end, their place has always been out in the world, making connections between the business of living and the higher reaches of one’s own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation -- in Ginsberg’s howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.

De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the breadth and depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly-educated generalist zeros in on the vital nub, the living heart of a subject that drives students to study.

While a scholarly specialist is lecturing on the ins and outs of Frost’s irony, the student sweats out his future, torn between embracing his parents’ dream of having a doctor in the family and taking the road less traveled to become a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.

De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline -- with scientists, engineers, and policy makers -- to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights in a variety of rhetorical formats, for different social, political, and economic purposes, audiences, and time constraints.

At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.

In it they learn the historical and intellectual roots of Catholic social justice teaching within the context of performing ten hours of community service learning. They work in a variety of social service fields (with children, the elderly, the homeless, and others), which exposes them to the real-life, street-level experience of social challenges. Before, during, and after their service, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.

High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.

Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began -- in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens -- the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.

The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.

We are not all Harvard, nor should we want to be.

Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
 


Essay on how the public humanities reshaped a dean's thinking about academic humanities

With so much focus on higher education's obligations to job preparation, the humanities are perpetually playing defense, especially in public higher education. We academic defenders of the humanities generally take one of two lines: we argue that 1) our majors ARE work force preparation -- we develop strong analytical skills, good writing, problem-solving, etc., or 2) we have no need to justify what we teach because the value of the humanities, the study of what makes us human, is self-evident.

These arguments over the value of degrees in the humanities run parallel to a set of arguments I find myself making as part of a role I occupy, as a board member for my state council for the humanities. The National Endowment for the Humanities allocates about a third of its funding through the state councils, and the councils in turn fund humanities initiatives at the state level.

State humanities councils such as mine (Rhode Island's) re-grant our NEH allocation as well as the money we raise locally to community humanities projects. We've funded research on communities of Cape Verdean longshoremen in Providence, oral histories of Second World War vets in hospice care, talk-back events at local theaters, seashore sound archives, a documentary film about a female 19th-century life-saving lighthouse-keeper, and lots of fascinating digital work, from archiving to app development. All the projects must involve humanities scholars — some of those scholars are affiliated with universities, and others aren’t. All of it aims at helping Rhode Islanders to understand ourselves, our histories, and our many cultures.

When economic times are tough, an agency such as the NEH is vulnerable unless legislators understand and value the role of the humanities in a strong democracy -- just as university humanities programs are vulnerable in state funding contexts when legislators, boards of trustees, or voters don't have a clear understanding of the value of the humanities in the culture and in the workplace.

In a career spent in higher education in the humanities, most of it at a liberal arts college, I rarely had to justify teaching what I taught. The value of an English major was self-evident to my colleagues and my students. Sure, the occasional parent would squeak, "But how will she make a living?" But I never hesitated to reassure the anxious check-writers of the value of our product. Having worked in the worlds of both journalism and Washington nonprofits, I knew how many good jobs demanded only a bachelor's degree, writing skills, research and analytic abilities, and common sense.

But then came the Great Recession and what many are calling the end of the higher education bubble. Questions about tuition increases, student debt, and colleges' lack of accountability (that is, the paucity of data on employment for recent graduates) get attached, in public perception, to the unemployment rate and to a re-emergence of the old post-Sputnik fears that the nation is not training enough folks in STEM fields.

Organizations such as the Association of American Colleges and Universities have been proactive in making the case for liberal learning as preparation for good citizenship, pointing to their surveys of employers. Those surveys find that employers believe the skills colleges should focus on improving are written and oral communication; critical thinking and analytic reasoning; the application of knowledge and skills in real-world settings; complex problem solving; ethical decision making; and teamwork. These skills are not exclusive to the humanities, but they certainly line up with the student learning outcomes in humanities instruction at my institution.

It's not as if defenders of the values of a liberal arts education are ignoring economic realities: many liberal arts colleges are adding business majors, humanities fields are requiring internships and experiential learning, and colleges and universities are scrambling to make contact with successful alumni and to gather post-graduation employment data.

There's nothing wrong with linking liberal arts education in general, and the humanities in particular, to work. The humanities are inextricably linked to work and to U.S. civic life. When Lyndon Johnson signed legislation to bring the NEH into existence in 1965, it was in a context in which the federal government was pushed to invest in culture, as it had in science. NEH's account of its own history explains that the head of the Atomic Energy Commission told a Senate committee: "We cannot afford to drift physically, morally, or esthetically in a world in which the current moves so rapidly perhaps toward an abyss. Science and technology are providing us with the means to travel swiftly. But what course do we take? This is the question that no computer can answer."

Through my role in public humanities, I have come to understand that the humanities are what allow us to see ourselves as members of a civic community. Public history, public art, and shared cultural experiences make us members of communities. This link has not been stressed enough in defense of the academic humanities. It's past time to make this important connection -- to help our boards of trustees, our communities, and our legislators know what the humanities bring to civil society and give to students as they enter the workforce.

In the first class I ever taught as a teaching assistant, I did my first lecture on Death of a Salesman. My topic was work -- how Willy's job is his identity. I pointed to a student I knew in the 150-student lecture hall and told him that his surname, Scribner, probably indicated the employment of some ancestor of his, a "scrivener," like Bartleby. Then I asked who else had last names that might have indicated a job. We had Millers and Coopers and Smiths, and many more.

When those students' ancestors worked as barrel-makers or at their forges, they worked those jobs for life, and their sons afterward did the same. But how many of us do the job our parents did? How many of our students will do the same job in their 30s that they will do in their 20s? Narrow ideas about work force preparation will not prepare our students for the work of the rest of their lives. Each job they take will train them in the skills they need to succeed in that particular industry. But a broad, liberal education will have been what made them people worth hiring, people who have learned the value of curiosity, initiative, problem-solving. Students in STEM fields and students in arts, social sciences, and humanities all will become members of communities, and a good background in the humanities will enrich their membership.

I loved the humanities as an English professor. But it was only when I became involved in public humanities that I began to understand their value not just for individuals but for communities. That's the public good. And that's why we cannot afford to let a narrow rhetoric of work force preparation push the humanities from our curriculums or defund the work of the National Endowment for the Humanities.

Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts, and a member of the board of directors of the Rhode Island Council of the Humanities.


Essay on digital humanities

Last year Temple University Press published Toby Miller's Blow Up the Humanities, a book that starts straining for provocation with its title and never lets up. The author is a professor of media and cultural studies at the University of California at Riverside. His preferred rhetorical stance is that of the saucy lad -- pulling the nose of Matthew Arnold and not fooled for a minute by all that “culture as the best which has been thought and said” jazz, man.

What we must recognize, his argument goes, is that there are two forms of the humanities now. What the author calls "Humanities One" (with literature, history, and philosophy at their core) is just the privileged and exclusionary knowledge of old and dying elites, with little value, if any, to today’s heterogeneous, globalized, wired, and thrill-a-minute world. By contrast, we have studies of mass media and communications making up “Humanities Two,” which emerged and thrived in the 20th century outside “fancy schools with a privileged research status.”

In the future we must somehow establish a third mode: “a blend of political economy, textual analysis, ethnography, and environmental studies such that students learn the materiality of how meaning is made, conveyed, and discarded.” Enough with the monuments of unaging intellect! Let the dead bury the dead; henceforth, culture must be biodegradable.

What I chiefly remember about Blow Up the Humanities, a few months after reading it, is exclaiming “What a cheeky monkey you are!” every few pages -- or at least feeling like this was expected of me. Otherwise it mostly seemed like vintage cultural-studies boilerplate. But one passage in the book did strike me as genuinely provocative. It takes the form of a footnote responding to Google’s claim of a "commitment to the digital humanities." Here it is, in full:

“In the United States, ‘the digital humanities’ can mean anything from cliometric analysis to ludic observation. It refers to a method of obtaining funds for conventional forms of Humanities One, dressed up in rather straightforward electronic empiricism. So counting convicts in law reports or references to Australia in Dickens becomes worthy of grant support because it is archival and computable.”

A scrawl in the margin records my immediate response upon reading this: “Cute but misleading.” But now, on second thought… Well, actually “cute but misleading” pretty well covers it. The caricature of the digital humanities might have been recognizable a dozen years ago, though just barely even then. What makes Miller’s polemical blast interesting is the angle of the assault. For once, a complaint about the digital humanities isn’t coming from traditionalist, semi-Luddite quarters -- “traditionalist” with regard to the objects of study (i.e., books, manuscripts, paintings) if not necessarily the theories and methods for analyzing them.

On the contrary, Miller regards video games as a rich cultural medium, both profitable and profound. To shore up his claims for Humanities Two (or, fingers crossed, Three) he finds it useful to pretend that the digital humanities will, in effect, take us back to the era of professors tabulating Chaucer’s use of the letter “e.” The scholarship will be more efficient, if no less dull.

Now, I have no interest in impeding the forward march of Angry Birds studies, but there is no way that Miller doesn’t know better. The days when humanities computing was used to count dead convicts are long gone. Much more likely now would be a project in which all of the surviving files of Victorian prisons are not simply rendered searchable but integrated with census data, regional maps, and available documentation of riots, strikes, and economic trends during any given year.

My griping about Miller’s griping has as its catalyst the recent appearance of Literary Studies in the Digital Age: An Evolving Anthology, which is part of the Modern Language Association’s “commons” site.  (Anyone can read material published there; only members can contribute.)

MLA is a major component of the Humanities One infrastructure, of course, but has enough Humanities Two people in it to suggest that the distinction is anything but airtight. And while Miller pillories the digital humanities as nothing but “a method of obtaining funds for conventional forms of Humanities One,” even old-school philological practice takes on new valences in a digital environment.

“In the humanities,” write Charles Cooney, Glenn Roe, and Mark Olsen in their contribution, “scholars are primarily concerned with the specifics of language and meaning in context, or what is in the works. [Textbases] tend to represent specific linguistic or national traditions, genres, or other characteristics reflecting disciplinary concerns and scholarly expertise.… [T]extbases in the digital humanities are generally retrospective collections built with an emphasis on canonical works in particular print traditions.”

So far, so Humanities One-ish -- with only the neologism “textbase” to show that much has changed since Isaac Casaubon’s heroic proof that the Corpus Hermeticum wasn’t as ancient as everybody thought. Textbase just means “collection,” of course. For that matter, the options available in textbase design (the ways of annotating a text, of making it searchable, of cross-referencing it with other items in the textbase or even in other textbases) are basically high-tech versions of what scholars did four hundred years ago.
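As a toy illustration of that continuity -- my own sketch, not any actual project's design -- a textbase at its simplest is just a set of documents plus an inverted index that makes them searchable, the high-tech descendant of the hand-built concordance. All the names and the sample entry below are hypothetical.

```python
from collections import defaultdict

class Textbase:
    """A minimal searchable text collection: documents plus an inverted index."""

    def __init__(self):
        self.docs = {}
        self.index = defaultdict(set)  # word -> ids of documents containing it

    def add(self, doc_id, text, metadata=None):
        """Store a document and index every word in it."""
        self.docs[doc_id] = {"text": text, "meta": metadata or {}}
        for word in text.lower().split():
            self.index[word.strip(".,;:!?")].add(doc_id)

    def search(self, word):
        """Return the ids of documents containing the word --
        in effect, one entry of a concordance."""
        return sorted(self.index.get(word.lower(), set()))

# Illustrative entry; real textbases add annotation layers and
# cross-references on top of this basic structure.
tb = Textbase()
tb.add("summa-1", "Grace does not destroy nature but perfects it.",
       {"author": "Thomas Aquinas"})
print(tb.search("nature"))  # -> ['summa-1']
```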

Alas, what Casaubon could do alone in his study now requires an interdisciplinary team, plus technicians. But he did not have the distractions we do.

If digital humanists were limited to converting cultural artifacts of the print era into textbases, that would still be useful enough, in its way. The classics aren’t going to annotate themselves. But the warehouse is much larger than that. Besides the inherited mass of documents from the past 5,000 years, more and more texts are now “born digital.” Besides warehousing and glossing such material, the digital humanities incorporate the changes in how people receive and engage with cultural material, as Alan Liu discusses in “From Reading to Social Computing,” his essay for the MLA anthology.

What Liu calls “the core circuit of literary activity” – the set of institutions, routines, and people involved in transmitting a poem (or whatever) from the author’s notebook to the reader's eyeballs – has been reconfigured dramatically over the past two decades. Besides making it possible to publish or annotate a text in new ways, the developing communication system transforms the culture itself. The digital humanist has to map, and remap, the very ground beneath our feet.

Nor is that a new development. Other papers in the anthology will give you a sense of how the digital humanities have developed over the long term -- beginning when Roberto Busa started using a computer to prepare an exhaustive concordance of Thomas Aquinas in the 1940s. At some point the most important change in the digital humanities will be to drop the word "digital."

(Note: This essay has been updated from an earlier version to correct Toby Miller's name.)

