
Essay on Michel Foucault's posthumous publications

Franz Kafka left explicit directions concerning the journals, letters and manuscripts that would be found following his death: they were to be burned -- all of them -- unread. Whether he expected Max Brod, the executor of his estate, to follow through with his instructions is a matter of some debate. In any case, Brod refused, and the first volume of Kafka’s posthumous works came out in 1925, the year after the author’s death.

The disregard for his wishes can be explained, if not justified, on a couple of grounds. For one thing, Kafka was a lawyer, and he must have known that expressing his intentions in a couple of notes wouldn’t be binding -- it takes a will to set forth a mandate in ironclad terms. And, too, Brod was both Kafka’s closest friend and the one person who recognized him as a writer of importance, even of genius. Expecting Brod not to preserve the manuscripts -- much less to leave them unread! -- hardly seems realistic.

On the other hand, Kafka himself destroyed most of his own manuscripts and did so in the same way he told Brod to do it, by setting them on fire. It is reasonable to suppose he meant what he said. If so, world literature has been enriched by an act of blatant disloyalty.

“Don’t pull the Max Brod trick on me,” Michel Foucault is said to have admonished friends. The philosopher and historian did Kafka one better by including a blunt, categorical line in his will: “No posthumous publications.”

Be that as it may, in late spring the University of Minnesota Press issued Language, Madness, and Desire: On Literature, a volume of short texts by Foucault originally published in France two years ago and translated by Robert Bononno. The same press and translator also turned the surviving pages of an autobiographical interview from 1968 into a little book with big margins called Speech Begins After Death. The title is kind of meta, since Foucault, like Kafka, seems to be having an unusually wordy afterlife.

Foucault died in June 1984, the very month that the second and third volumes of The History of Sexuality appeared. He left a fourth volume in manuscript, but given the circumstances, it was destined only for the archives. And so things stood for about a decade. There was the occasional lecture or transcript of an interview he had given permission to publish, accompanied by claims that it was the “final” or “last” Foucault. After a while this started to get kind of silly, and it only made the thinker’s absence more palpable. Daniel Defert, the administrator of his estate, had also been Foucault’s lover for many years, and he seems to have taken the ban on posthumous works to heart in a way that Max Brod never did.

But by 1994, Defert relented enough to allow a four-volume collection of Foucault’s essays and interviews to be published in France. (A few years later, the New Press brought out an abridged translation as the three-volume Essential Works of Michel Foucault.) By the 20th anniversary of the thinker’s death in 2004, the situation had changed dramatically. Six of Foucault’s 13 courses of lectures at the Collège de France had been published and the rest were on the way. In September, Palgrave Macmillan is bringing out On the Punitive Society, at which point the whole series will be available in English. That adds another shelf’s worth of stout, dense and rich volumes to the corpus of Foucault’s work -- overlapping in various ways with the books he published (e.g., the Punitive Society lectures were given as he was working on Discipline and Punish) but developing his ideas along different trajectories and in front of an audience, sometimes in response to its questions.

In a paper published last year, John Forrester, a professor of history and philosophy at the University of Cambridge, expresses mingled appreciation and dismay at how what he calls Foucault’s “pithy and ultra-clear command, ‘Pas de publication posthume,’” has been breached in the case of the Collège de France courses. The paper appears in Foucault Now: Current Perspectives in Foucault Studies (Polity).

“Because these were public lectures,” writes Forrester, “they had already been placed in the public domain ‘dans son vivant,’ as the French language says, in his lifetime. Their transcription and editing therefore is not the production of posthumous texts, but the translation from one already published medium -- for instance, the tape recorder -- to another, the book.” While grateful that Brod and Defert “found a way to publish what Kafka and Foucault forbade them to publish,” he says, “that doesn’t mean to say I think they were right. They did right by me and many, very many, others. But I can’t see how they obeyed the legal injunction placed on them.”

Language, Madness, and Desire consists of six items it was not difficult to squeeze through that dans son vivant loophole, since they were delivered to audiences as radio broadcasts or lectures between 1963 and 1970. Speech Begins After Death is another matter entirely. It consists of the opening exchanges from a series of interviews Foucault gave to Claude Bonnefoy, a literary critic, in 1968. The plan had been to produce a book. It never came together for some reason (1968 was a big year for getting distracted), none of it was published and most of the transcript has been lost.

In short, there’s no real wiggle room for rationalizing Speech Begins After Death as permissible under the terms of Foucault’s will. And this is where things get interesting. To be blunt about it, Language, Madness, and Desire is not going to come as much of a revelation to anyone who has read, say, the literary essays in Language, Counter-Memory, Practice (the Cornell University Press anthology of Foucault’s work from the 1960s and early 1970s that’s still one of the best things out there). It would not be surprising if it turns out there are dozens of other such pieces which could slip past Foucault’s ban without adding much to the body of work he saw through the press.

By contrast, Speech Begins After Death is (1) a clear violation of the author’s wishes and (2) a pretty good example of why violating them might be a good idea. In later years Foucault grew accustomed to giving interviews, but in 1968 he was uncomfortable with the whole process. Being treated as an author or a literary figure (rather than an academic) only made him more nervous. As sometimes happens, the performance anxiety, once he got it under control, inspired him to think out loud in a way that seems to have surprised him.

One passage almost jumps off the page:

“As long as we haven’t started writing, it seems to be the most gratuitous, the most improbable thing, almost the most impossible, and one to which, in any case, we’ll never feel bound. Then, at some point -- is it the first page, the thousandth, the middle of the first book, or later? I have no idea -- we realize that we’re absolutely obligated to write. This obligation is revealed to you, indicated in various ways. For example, by the fact that we experience so much anxiety, so much tension if we haven’t finished that little page of writing, as we do each day. By writing that page, you give yourself, you give to your existence, a form of absolution. That absolution is essential for the day’s happiness.”

Like Kafka's demand for a book that “must be the ax for the frozen sea within us,” these lines are worth whatever guilt was incurred by whoever rescued them for us.


Review of Adam Mack, "Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers"

The most distracting thing about costume dramas set in any period before roughly the turn of the 20th century -- in my experience, anyway -- is the thought that everything and everyone on screen must have smelled really bad. The most refined lords and gentry on Wolf Hall did not bathe on anything we would regard today as a regular basis.

No doubt there were exceptions. But until fairly recently in human history, even the most fastidious city dweller must have grown accustomed to the sight of human waste products from chamber pots that had been emptied in the street. (And not just the sight of it, of course.) Once in a while a movie or television program will evince something of a previous era’s ordinary grunge, as in The Return of Martin Guerre or Deadwood, where almost everything looks soiled, fetid and vividly uncomfortable. But that, too, is exceptional. The audience for costume drama is often looking for charm, nostalgia or escapism, and so the past usually wears a deodorant.

The wider public may not have heard of it, but a “sensory turn” among American historians has made itself felt in recent years -- an attention, that is, to the smells, tastes, textures and sounds of earlier periods. I refer to just four senses, because the importance of sight was taken for granted well before the turn. In their more polemical moments, sensory historians have even referred to “the tyranny of the visual” within their discipline.

That seems a little melodramatic, but point taken: historians have tended to scrutinize the past using documents, images, maps and other artifacts that chiefly address the eye. Coming in second as the organ of perception most likely to play a role in historical research would undoubtedly be the ear, thanks to the advent of recorded sound. The remaining senses tie for last place simply because they leave so few traces -- which, in any case, are not systematically preserved the way audiovisual materials are. We have no olfactory or haptic archives; it is difficult to imagine a library of flavors.

Calls to overcome these obstacles -- to analyze whatever evidence could be found about how everyday life once sounded, smelled, felt, etc. -- came from American historians in the early 1990s, with a few pioneers at work in Europe even before that. But the field of sensory history really came into its own over the past decade or so, with Mark M. Smith’s How Race Is Made: Slavery, Segregation and the Senses (University of North Carolina Press, 2006) and Sensing the Past: Seeing, Hearing, Smelling, Tasting and Touching in History (University of California Press, 2007) being among the landmarks. Smith, a professor of history at the University of South Carolina, also convened a roundtable on the sensory turn published in the September 2008 issue of The Journal of American History. A number of the contributors are on the editorial board of the Studies in Sensory History series published by the University of Illinois Press, which launched in 2011.

The series’ fifth and most recent title is Sensing Chicago: Noisemakers, Strikebreakers and Muckrakers by Adam Mack, an assistant professor of history at the School of the Art Institute of Chicago. Beyond the monographic focus -- it covers about fifty years of the city’s history -- the book demonstrates how much of the sensory field of an earlier era can be reconstructed, and why doing so can be of interest.

Overemphasis on the visual dimension of an urban landscape “mirrors a set of modern cultural values that valorize the eye as the barometer of truth and reason,” we read in the introduction, “and tend to devalue the proximate, ‘lower’ senses as crude and less rational.” Having thus recapitulated one of sensory history’s founding premises, the author wastes no time before heading to one site that must have forced its way deep into the memory of anyone who got near it in the 19th century: the Chicago River.

“A bed of filth,” one contemporary observer called it, where manure, blood, swill and unusable chunks of carcass from the slaughterhouses ended up, along with human sewage and dead animals -- all of it (an editorialist wrote) “rotting in the sun, boiling and bubbling like the lake of brimstone, and emitting the most villainous fumes,” not to mention drawing clouds of flies. A letter writer from 1862 mentions that the water drawn from his kitchen hydrant contained “half a pint or more of rotten fish.” Many people concluded that it was safest just to drink beer instead.

Laws against dumping were passed and commissions appointed to investigate the problem, for all the good it did. The poorest people had to live closest to the river, so disgust at the stench combined in various ways with middle- and upper-class attitudes towards them, as well as with nativist prejudices.

The horrific odor undermined efforts to construct a modern, rationally organized city. Imposing a grid of streets on the landscape might please the eye, but smell didn’t respect geometry. The same principle applied to the Great Fire of 1871, the subject of Mack’s next chapter. The heat and sheer sensory overload were overwhelming, and the disaster threw people from all walks of life together in the streets in a way that made social status irrelevant, at least for a while. The interplay between social hierarchy and sensory experience (exemplified in references to “the roar of the mob”) is the thread running through the rest of the book. Thinking of the “‘lower’ senses as crude and less rational” -- to quote the author’s phrase again -- went along with assumptions about refinement or coarseness as markers of class background.

The sources consulted by the author are much the same as any other historian might use: newspapers, civic records, private or otherwise unpublished writings by long-forgotten people, such as the recollections of the Great Fire by witnesses, on file at the Chicago History Museum. The contrast is at the level of detail -- that is, the kinds of detail the historian looks for and interprets. Perhaps the next step would be for historians to enhance their work with direct sensory documentation.

A prototype might be found in the work of John Waters, who released one of his movies in Odorama. Audience members received cards with numbered scratch-and-sniff patches, which they consulted when prompted by a message on the screen.

On second thought, it was difficult enough to read Mack’s account of the Chicago River in the 19th century without tickling the gag reflex. Olfactory realism might push historical accuracy farther than anyone really wants it to go.


Commentary on American mass shootings

Only satire can look certain horrible realities in the eye, as The Onion did with its article from last year about a lone-wolf mass shooting of random strangers. Its headline cut to the quick: “‘No Way To Prevent This,’ Says Only Nation Where This Regularly Happens.”

It’s the real American exceptionalism. Rampage shootings do take place in other countries (the 1996 Dunblane school massacre in Scotland, for example), but rarely. They remain distinct events in the public memory, rather than blurring together. In the United States the trauma is repetitive and frequent; only the location and the number of victims seem to change.

With Charleston we have the additional grotesquerie of a presidential candidate calling Dylann Roof’s extremely deliberate act an “accident” while the director of the Federal Bureau of Investigation made a point of denying that it was terrorism. (The shooter was an avowed white supremacist who attacked an African-American church and took care to leave one survivor to tell the tale. By no amount of semantic weaselry can it be described as anything but “[an] act of violence done or threaten[ed] in order to try to influence a public body or citizenry,” to quote the director's own definition of terrorism.) But American rampage shootings do not always express an ideological agenda, or even a motive intelligible to anyone but the gunman. The meaninglessness of the violence, combined with its regularity, is numbing. So with time our scars grow callused, at least until the next spree rips them open again.

A few years ago Christopher Phelps, an intellectual historian who happens to be my closest friend, moved with his family to England, where he is now a senior lecturer (i.e., an associate professor) in American and Canadian studies at the University of Nottingham. At some point the British media began turning to him for commentary on life in these United States. “I tend to be asked on radio shows when there's a need for American expertise -- and implicitly an American accent, which adds an air of authenticity,” he wrote in an email when I asked him about it.

Among the topics he’s been asked about are “the re-election of Obama, the anniversary of JFK's death, and even what comprises the East Wing of the White House, since one only ever hears about the West Wing.” Of late, America’s everyday mayhem keeps coming up. In 2013 he discussed the Trayvon Martin case. Last August, it was the girl whose training in automatic weapons on an Arizona firing range ended when she lost control and sprayed her instructor with bullets. Phelps appeared on a nationally broadcast talk show hosted by Nick Ferrari, which seems like the perfect name for a bigger-than-life radio personality.

Ferrari wasted no time: “What is it with Americans and guns?” he asked. A fair question, though exceedingly blunt.

“I should have anticipated that, I suppose,” Phelps says now, “but I froze like the proverbial deer in the headlights, stuttering away.” Since then, unfortunately, he has gained experience in answering variations of the question. “The producers need people to do it,” he explains, “the university media team work hard to set up the gigs, and you feel as an American you should step in a bit to help modulate the conversation, but it sweeps away my life for a day or two when I have other plans and some psychopath shoots up America.” (The BBC program for which he was interviewed following the Charleston shootings can be found here.)

“It is still depressing,” Phelps continues, “in fact draining, to be put in the position of explaining my people through this kind of event, but reflection has prompted some better ways of answering.”

A one-sentence question about the American pattern of free-range violence takes many more to address at all concretely. Phelps's assessment bears quoting at length:

“While I'm as drawn to generalities as anyone -- I've always thought there was something to H. Rap Brown's declaration that ‘violence is as American as cherry pie’ -- it’s important to realize that most American households do not possess guns, only a third do. So gun owners do not comprise all Americans but a particular demographic, one more white, male and conservative than the general population.

“The shooters in mass killings, likewise, tend to be white men. So we need to explain this sociologically. My shorthand is that white men have lost a supreme status of power and privilege, given a post-’60s culture claiming gender and racial equality as ideals, yet are still socialized in ways that encourage aggressiveness.

“Of course, that mix wouldn't be so dangerous if it weren't easy to amass an arsenal of submachine guns, in effect, to mow people down. Why do restrictions that polls say Americans consider reasonable always get blocked politically, if gun-owning households are a minority? For one thing, the gun manufacturing corporations subsidize a powerful lobby that doubles as a populist association of gun owners. That, combined with a fragmented federalist system of government, a strongly individualist culture and the centrality of a Constitution that seems to inscribe ‘the right to bear arms’ as a sacred right, makes reform very difficult in the United States compared to similarly positioned societies. This suggests the problem is less cultural than political.”                      

Following the massacre of 26 people, most of them children, at Sandy Hook Elementary School in Connecticut in 2012, National Rifle Association executive vice president Wayne LaPierre waited several days before issuing a statement. Whether he meant to let a decent interval pass or just needed time to work up the nerve, his response was to blame our culture of violence on… our culture of violence.

He condemned the American entertainment industry for its constant output of “an ever more toxic mix of reckless behavior and criminal cruelty” in the form of video games, slasher movies and so forth. The American child is exposed to “16,000 murders and 200,000 acts of violence by the time he or she reaches the ripe old age of 18” -- encouraging, if not spontaneously generating, LaPierre said, a veritable army of criminals and insane people, just waiting for unarmed victims to cross their paths. “The only way to stop a monster from killing our kids,” he said, “is to be personally involved and invested in a plan of absolute protection.”

The speech was a marvel of chutzpah and incoherence. But to give him credit, LaPierre’s call for “a plan of absolute protection” had a sort of deluded brilliance to it -- revealing a strain of magical thinking worthy of… well, when you get right down to it, a violent video game. Even in a society full of people presumably eager to act out their favorite scenes from Natural Born Killers and American Psycho, having enough firepower will, on this logic, give you “absolute protection.”

On many points, Firmin DeBrabander’s book Do Guns Make Us Free? Democracy and the Armed Society (Yale University Press) converges with the analysis quoted earlier from my discussion with Christopher Phelps. But DeBrabander, an associate professor of philosophy at Maryland Institute College of Art, places special emphasis on the corrupting effect of treating the Second Amendment as the basis for “absolute protection” of civil liberties.

The vision of democracy as something that grows out of the barrel of a gun (or better yet, a stockpile of guns, backed up with a ready supply of armor-piercing bullets) involves an incredibly impoverished understanding of freedom. And it is fed by a paranoid susceptibility to “unmanageable levels of fear,” DeBrabander writes, and “irrationalities that ripple through society.”

He turns to the ancient Stoic philosophers for a more robust and mature understanding of freedom. It is, he writes, “a state of mental resolve, not armed resolve. Coexisting with pervasive threats, Seneca would say, is the human condition. The person who lives with no proximate dangers is the exception. And it’s no sign of freedom to live always at the ready, worried and trigger-happy, against potential threats; this is the opposite of freedom.” It is, on the contrary, “a form of servitude,” and can only encourage tyranny by demagogues.

“Freedom,” DeBrabander goes on to say, “resides in the ability to live and thrive in spite of the dangers that attend our necessarily tenuous social and political existence -- dangers that are less fearsome and debilitating to the extent that we understand and acknowledge them.” It is only one of many good points the author makes. (See also his recent essay “Campus Carry vs. Faculty Rights” for Inside Higher Ed.) And the certainty that another mass shooting will take place somewhere in the United States before much longer means we need all the stoicism we can get.


A competing manifesto on the value of academic conferences (essay)

Last week, the New York Times's “Opinionator” published an essay in which Christy Wampole decried the present state of humanities scholarship by holding up the worst forms of conference behavior to ridicule.

Let’s be honest: all academics have groaned at plenary papers that go over the time limit or at senior colleagues who assume their listeners will fully absorb their arguments even when delivered in a monotone with no attention to rhetorical context. Such inconsiderate academics are certainly not the norm, however, just as misbehavior and nastiness are not the norm in any other professional arena.

Wampole herself admits to engaging in the narcissistic habit of answering emails during plenaries and having “listened for the first five minutes of the talk, just long enough to seize upon a word around which [she’ll] construct a pseudoquestion in the Q and A.” She includes herself among those who sometimes give a paper and then spend the rest of the conference at the pool bar.

To suggest as she does, however, that we should judge the quality or the future of the humanities by these unfortunate instances of a professional lack of grace is irresponsible. It is judging a profession by its lowest common denominator, and it obscures the good, important exchange of ideas and generation of knowledge that occurs at academic conferences year in and year out, throughout most academic careers.

It also feeds the worst stereotypes about academics that subsequently become fuel for political agendas across the country seeking to defund education at the great expense of America’s future.

Of late, public critiques of the humanities have taken the explicit form of assertions that the disciplines have no practical value or contemporary relevance in a technological world. Implicitly, such critiques also manifest in persistent funding cuts to arts programs, in calls for exclusively STEM-based initiatives to improve our educational system, in claims about the unemployability of humanities graduates, even in assertions by some defenders of the disciplines that humanities knowledge is primarily good for business, economics or public-policy makers -- which imply that such knowledge and experience has no value if it cannot be turned to moneymaking.

But in the last few years, there has also been what feels like an exponential increase in those willing to engage in national conversations that ask, and attempt to answer, tough questions about these issues. Academics and nonacademics alike have filled the pages of The New York Times, Inside Higher Ed, The Chronicle of Higher Education, The Atlantic and scores of other outlets with meditations on the costs of humanities study in higher education, on who is served and who is left out, and on the role of the humanities in shaping young minds or good citizens or brilliant scientists or desirable employees.

We are encouraged by this general willingness to engage in these tough intellectual conversations. At the same time, we are disheartened by the propensity of so many, both within higher education and outside of it, to rely upon the dismissive premise that academics largely exist in a secluded world in which they care only about their own infinitesimal research interests, which are esoteric at best, incomprehensible and a waste of taxpayers’ money at worst.

This is not to suggest we should not all, as professionals, always strive to make our practices better, to keep pace with the times, to question our own assumptions and habits, to identify honestly what is not working, and to change it where we can. However, it is to suggest that perhaps a better model for doing so is one that is based on the notion that academics -- as teachers, researchers, mentors and institutional colleagues -- go into their chosen profession with the desire to advance knowledge through collaborative means.

Contrary to the misrepresentation of academic conferences as attended only by dreary caricatures of the out-of-touch professor rambling on about irrelevant ideas, most conferences we attend are places where we try out ideas among our colleagues, launch collaborations, consider the pedagogical and public import of our findings, mentor graduate students, and participate in the transformations of our fields in ways that make us better teachers and better researchers.

Many of us value conferences for both private and professional reasons, as David M. Perry points out in his May 6 Chronicle of Higher Education response to Wampole’s essay, and as Devoney Looser has recently enumerated in her Chronicle guide to conference etiquette. We, like both of them, encourage thinking about conferences as an important means of entry into our disciplinary communities.

Conferences help to provide what many faculty members cannot find at their home institutions: a community of minds focused on a particular issue. For faculty members everywhere but the Ivy League or a very well-funded public university, inviting speakers to campus who can give lectures and seminars on the latest research ideas or programmatic innovations is not a given, nor is access to a world-class research library. This is especially true in the context of many states’ perilous hollowing out of financial support for public colleges and universities.

As faculty numbers continue to shrink, academics often find themselves a party of one in their departments, working as the sole representative of a particular field, without immediate access to colleagues in their fields of expertise. Done well, an academic conference offers a chance for collegial dialogue of the sort that can lead to tangible progress. When faculty members attend conferences, students and their institutions also directly reap the benefits.

Conferences can be particularly important for scholars of color and others who find themselves disenfranchised by administrations and by institutionalized injustices on their own campuses. Although we recognize that unfortunately many conferences have a long way to go to truly support marginalized academic communities, we are encouraged by those we have seen working explicitly to foster this kind of inclusivity.

For many faculty members struggling with the isolation of being seen as a “representative” member of an underrepresented group, conference networking can be a crucial path for figuring out how to navigate their own institutions, for dealing with the microaggressions of students, administrators and other faculty, and for coping with the additional and unique responsibilities they often face alone of mentoring minority student populations or administering programs. Conferences also have the potential to be sites for the birth of activism, where communities both formal and informal unite to make changes in how things are done, how people are treated and how certain ideas are valued.

Conferences are, in other words, even more important for those not privileged by mainstream academic cultures than they are for the elites. A researcher at Princeton has regular access to communities of scholarship that are simply unavailable to most attendees of a scholarly conference. Perhaps most depressingly, such intellectual communities are often nonexistent for the contingent faculty who are rarely fully integrated members of their own departments and who, despite being engaged in rigorous research, cannot attend conferences even when they want to because their institutions do not support the professional development of these integral members of their communities.

Wampole submits that “conferences feel necessary, but their purpose is unclear.” While the exact form that conference collaborations take might usefully be retooled, their purpose in supporting innovations in research, teaching, administration or activism could not be more clear. The process of making a productive contribution to research depends upon knowing what people already know, and this is significantly aided by the feedback of other scholars working on similar or related questions. Even as we acknowledge the legitimate problem of the environmental impact of that much travel, we don’t think anything can fully substitute for the intellectual experience of hearing a good plenary talk followed by a vigorous debate that is the catalyst for deeper conversations throughout the conference. Published scholarship is essential, but it takes time to develop, and face-to-face conversations and the accountability conferences provide are a great way to incubate ideas that are just being formed.

Could conferences be better? Of course they could, but they are organized and run by groups of committed faculty members or the staff they have hired to help them, who do their best despite inevitable budget constraints and competing time demands. Instead of focusing on the problem of boredom, how about addressing truly meaningful problems, like the economic barriers to participation for graduate students and less financially privileged researchers, or lessening the impact of mass travel on the environment, or the lack of child care resources, or the way such conferences are misrepresented in the anti-intellectual popular media?

Here is the bottom line: conferences are created by the faculty they serve. They are not merely events where we put ourselves on display or where we criticize from an outside position -- they are collaborative ventures. Faculty researchers do not just attend their conferences; they own them. And so, we offer the following countermanifesto.

A Conference Manifesto for the Rest of Us:

(1) We will consider the quality of the conferences we attend as our own responsibility. If we are unhappy with the structure, we will contact the organizing committee or form a coalition to initiate changes to the obstacles that limit the conference’s success. (We know of one such coalition currently forming in response to a lack of female presenters at a major conference, and this is not an isolated example.)

(2) We will strive to be precise and productive. We will offer meaningful rather than petty critiques, strive not to generalize from extreme examples and, as much as possible, focus on useful alternatives rather than finger pointing.

(3) If we are not in a position of power, or we feel too disaffected to contribute to positive solutions at a structural level, then we will be the change we seek in our individual interactions. If a scholar presents a paper in which the larger purpose is not clear, we will ask him about that purpose during the Q and A. If it is clear that a speaker is having trouble articulating an argument, we will help her see what it is. We will attend as many events as we can, offer real feedback and participate in real discussions. Put more simply, we will continue to be generous.

(4) We will acknowledge academic generosity where we find it, namely:

  • in the organizers who laboriously put together meeting programs, speakers and events to foster collaborative dialogue and the exchange of ideas;
  • in the keynote speakers, senior colleagues and established scholars who routinely engage more junior members of the profession in meaningful conversations;
  • in the conference-goers who ask thoughtful questions;
  • in the professors who mentor students;
  • and, institutionally beyond the world of conferences, in the faculty who work to improve conditions on their campuses, in the anonymous reviewers who provide constructive feedback on essays and in the adjuncts who spend endless unremunerated hours facilitating learning.

(5) We will be humble. We will recognize that although humanists are excellent at being critical, we are fortunate to have these communities to help us improve our research.

(6) We will attempt always to get over ourselves. Our presentations may be great, but they aren’t perfect.

(7) And finally, we will be aware. We will continue to think carefully about how we use the resources invested in us as scholars. It appears to us that the humanities are at least beginning to be recognized as having both intrinsic and extrinsic values, and it is up to us to communicate those values to people who doubt both, rather than to reinforce stereotypes through exclusionary rhetoric or condescension. We posit that there is real value in the thoughtful public intellectual, and we will work to be scholars who are willing to ask hard questions about our own work, to engage in thorny debates about priorities, to radically reimagine what higher education might look like in the 21st century and to challenge the parameters or privileges of our own positions. We will make sure that we can clearly show why our work matters, because no matter how frustrating conferences can be, they are places where humanities scholarship does some of its most important work.

Cora Fox is an associate professor of English and associate director of the Institute for Humanities Research at Arizona State University. Andrea Kaston Tange is a professor of English at Eastern Michigan University and editor of the Journal of Narrative Theory. Rebecca A. Walsh is an assistant professor of English at North Carolina State University and co-chair of The H. D. International Society.

Scholar discusses his book on the creation of the research university and disciplines

Author of new book on the creation of the research university discusses the role of disciplines and information overload -- from the 18th century to the rise of MOOCs.

Essay criticizes studies that claim to show Shakespeare is ignored by English departments

Were it so… that some little profit might be reaped (which God knows is very little) out of some of our playbooks, the benefit thereof will nothing near countervail the harm that the scandal will bring unto the library, when it shall be given out that we stuff it full of baggage [i.e., trashy] books.

-- Sir Thomas Bodley, founder of the University of Oxford’s Bodleian Library, explaining why he did not wish to keep English plays in his library (1612).

On William Shakespeare’s birthday this year, the American Council of Trustees and Alumni (ACTA) issued a report, “The Unkindest Cut: Shakespeare in Exile in 2015,” which warned that “less than 8 percent of the nation’s top universities require English majors to take even a single course that focuses on Shakespeare.” Warnings about the decline of a traditional literary canon are familiar from conservative academic organizations such as ACTA and the National Association of Scholars. What increasingly strikes me, however, is how frozen in amber these warnings are.

In a nation obsessed with career-specific and STEM education, there is scant support for humanities in general. Where are the conservative voices advocating for the place of English and the humanities in the university curriculum? One would think this advocacy natural for such academics and their allies. After all, when Matthew Arnold celebrated the “best that has been thought and known,” he was proposing cultural study not only as an antidote to political radicalism but also to a life reduced, by the people he called philistines, to industrial production and the consumption of goods.

We have our modern philistines. Where are our modern conservative voices to call them out? Instead, on the shrinking support for the liberal arts in American education -- the most significant issue facing the humanities -- organizations such as ACTA and NAS mistake a parochial struggle over particular authors and curricula for the full-throated defense of the humanities.

Worse, these organizations suggest that if one does not study Shakespeare or a small set of other writers in the traditional literary canon (moreover, in only certain ways), then literature and culture are not worth studying -- hardly a way to advocate for literary studies.

The requirements at my own institution suggest how misleading the ACTA position is, and how thin a commitment to the humanities it represents. With no Shakespeare requirement in the George Mason University English department, it is true that some of our majors won’t study Shakespeare. However, because our majors must take a course in pre-1800 literature -- nearly all the departments ACTA examined have a similar requirement -- they’ll study Chaucer, or medieval intellectual history, or Wyatt, Sidney, Donne, Jonson, Milton, etc. (The study of Spenser, however, appears to me somewhat in decline; ACTA, if you want to take up the cause of The Faerie Queene, let me know.)

How can writers as great as these be off ACTA’s map? Is it because ACTA doesn’t really value them? Its Bardolatry is idolatry -- the worship of the playwright as wooden sign rather than living being, a Shakespeare to scold with, but no devotion to the rich literary and cultural worlds of which Shakespeare was a part. Hence, too, the report maintains that a course such as Renaissance Sexualities is no substitute for what it calls the “seminal study of Shakespeare” -- though certainly such a course might feature the Renaissance sonnet tradition, including Shakespeare’s important contribution to it, not to mention characters from Shakespeare’s plays such as Romeo and Juliet or Rosalind and Ganymede.

ACTA also warns that rather than Shakespeare, English departments are “often encouraging instead trendy courses on popular culture.” This warning similarly indicates the narrowness of ACTA’s commitment to literary study. As anyone who’s ever taken a Shakespeare course should know, not only were Shakespeare’s plays popular culture in his own day (English plays were scandalous trash, thought Thomas Bodley), but also the very richness of Shakespeare’s literary achievement comes from his own embrace of multiple forms of culture. His sources are not just high-end Latin authors but also translations of pulpy Italian “novels,” English popular writers, folktales, histories and travelogues, among others. The plays remain vibrant today because Shakespeare allows all these sources to live and talk to one another.

Indeed, the literary scholars William Kerrigan and Gordon Braden point out that in this quality Shakespeare was typical of his age, for the vibrancy of the Renaissance derives in part from its hybridity. The classical was a point of departure, but neither Shakespeare nor Renaissance culture was slavishly neoclassical. Modern English departments, in their embrace of multiple literary cultures, in their serious study of our human expression, evince the same spirit. 

Conservatives have suggested that the hybridity of the modern English major is responsible for declining interest in the major. That claim cannot be proved: anecdotes and intuitions are insufficient to do so, and data on trends in the number of majors over time can show only correlation, not causation.

And in terms of correlation, here are four more likely drivers of the decline in the percentage of students majoring in English: students are worried about finding jobs and are being told (wrongly, according to the actual statistics) that the English major is not a path to one; students now have many new majors to choose from, many no longer in the liberal arts; English has traditionally had more female than male majors, and women now pursue majors, such as in business or STEM fields, from which they used to be discouraged (a good change); political leaders have abandoned the liberal arts in favor of STEM and career-specific education and are advising students to do the same (even President Obama jumped on this bandwagon, though he later apologized).

Regarding this last cause, the voices of organizations such as ACTA and NAS could particularly help, since many of these politicians are conservatives, and leaders of these academic organizations have ties to conservative political circles. In doing so, conservatives could help reclaim a legacy. In 1982, William Bennett, as chair of the National Endowment for the Humanities, urged colleges to support the humanities against “more career-oriented things.” By 1995, Bennett had become disgusted with what he saw as an overly progressive agenda in the humanities. Picking up his marbles and going home, Bennett urged Congress to defund the NEH. More recently, Bennett agreed with North Carolina Governor Pat McCrory that the goal of publicly funded education should be to get students jobs. “How many Ph.D.s in philosophy do I need to subsidize?” Bennett asked.

Shakespeare was generous in his reading and thinking. We can be, too. Literary scholars may disagree on many things -- on the values to be derived from a particular literary work, on the ways it ought to be framed, on which literary works are most worthy of classroom reading. But such disagreements are just part of the study of the humanities in a democratic society. When we support the humanities, we support an important public space to have these disagreements. We also support Shakespeare -- who really isn’t going away from the English curriculum -- and the study of literature more generally.

The ACTA study, as far as I can tell, was mainly met with silence. That’s because the study is a rehash of an earlier one from 2007, itself a rehash of the culture wars of the 1980s and ’90s. No one cared, because most people have moved on from the culture wars, and for many of our political leaders, culture itself doesn’t much matter anymore. Culture wars have become a war on culture. In that battle, all lovers of literature should be on the same side. Advocating for the humanities, even as we argue about them, is walking and chewing gum. We should be able to do both at the same time. I appeal to conservative academic organizations that we need to. The one-sided emphasis on majors that lead directly to careers and the blanket advocacy of STEM fields are far greater threats to the humanities than sustainability studies. And without the humanities, there is no institutionalized study of Toni Morrison. Or pulp fiction. Or Sidney. Or Shakespeare.

Robert Matz is professor of English, with a focus on English Renaissance literature, at George Mason University. He currently serves as senior associate dean for George Mason’s College of Humanities and Social Sciences.

