In her president’s column in the spring 2006 Modern Language Association newsletter, Marjorie Perloff focuses on the expansion of Ph.D. programs in creative writing (including doctorates in English that allow for a creative dissertation). Perloff argues that the growth of creative-writing doctorates was a reaction to politicization and specialization within the English discipline: “An examination of the catalogues of recently established Ph.D. programs in creative writing suggests that, in our moment, creative writing is perhaps best understood as the revenge of literature on the increasingly sociological, political, and anthropological emphasis of English studies.”
She also cites recent job advertisements in English calling for candidates specializing in a range of theoretical approaches, which relegate the teaching of literature to “a kind of afterthought, a footnote to the fashionable methodologies of the day.”
Perloff is right on both counts: These are central factors that have led to the growth of creative writing Ph.D.s. But she also misses an important element, one that grows out of, but also underlies, the others. It is that people want what they think and write to matter, not just to their colleagues, but also to the world at large. Creative work, and the doctorate in creative writing, holds out this hope.
The doctorate in creative writing comes in various forms, but most are very similar to literary studies doctorates. I myself am a doctoral candidate in English at the University of Denver, writing a creative dissertation -- a book of poetry, accompanied by a critical introduction. As a graduate student, I’ve fulfilled the same coursework requirements as my literary studies peers, with the addition of four writing workshops. I’ve taken comprehensive exams in the same format as my literary studies peers. I’ve taught more or less the same courses as my literary studies peers. The only significant difference between my doctoral work and that of my literary studies colleagues is the dissertation.
Sometimes, in fact, it strikes me as a bit comic to be doing the creative dissertation, but then I think about the fate of my work. I want my work to find its audience, though I realize that poetry has lost more readers, perhaps, than scholarship has over the last 50 years. Yet I believe that creative writing holds out more hope of finding readers, and of gaining readers back, than scholarship does. Hundreds or thousands of poetry books are published each year, and are more likely to find their way onto the shelves of bookstores than are scholarly studies. For fiction writers, the prospects are even better -- after all, there’s still a market for novels and short fiction.
However, it’s not just for readerly recognition that I want to do this creative work. It is because literature matters to how people live their lives, not just emotionally but intellectually. I speak here specifically of literature, but I think the principle holds true for any kind of creative work, even works we wouldn’t ordinarily think of as artistic, such as historical or psychological or anthropological studies.
Just a few days ago I was talking with a good friend of mine, a fellow graduate student working on her dissertation. My friend’s enthusiasm for the work and the discoveries that she is making, her eloquence on her subject, and her physical animation in talking about it were obvious, even if some of the nuance of her project was lost on me. But then she stopped herself and said, “Of course, nobody really cares about this.”
She described the frustration of talking about her project with non-academic friends and family members, how it takes too long to explain the work she is doing to people outside her specialty area, how their faces fall blank as she goes on too long in explaining the foundations of the debate in which she is involved. She laughed and said archly, “It’s not so bad once you get used to the idea that no one is ever going to read your dissertation except your committee.”
I have had similar conversations with other friends working on dissertations, not just in English but across the humanities, though the sense of writing into the void is particularly marked among those in my discipline. Let me say here that I don’t want to challenge the value of discipline-specific, specialized scholarship. After all, it would be foolish to say that the intellectual work of teaching and writing does not require specialist knowledge, or that the ideas formulated in scholarly work don’t find their way to non-specialists through good teaching, popularizers, or public intellectuals -- though we could stand a few more of the latter. Those academics who write for an extra-disciplinary audience, as Mark Oppenheimer pointed out in a recent essay in The Chronicle of Higher Education, play an important part in connecting the academy with the non-academic world and in shaping common conceptions of disciplines such as history. He wrote: “They have the influence that comes with writing for journals at the intersection of academe and the culture at large. They interpret scholarship for people who prefer to read journalism, and their opinions reverberate and multiply, if in ways that we cannot measure.”
This is not a plea for greater “accessibility” or for a return to a “generalist” approach to English. Nor will I rehearse the yearly mocking that the titles of papers at the MLA convention get in major newspapers across the country. But I do think that the sense that nobody’s listening or reading the work of scholars outside their specialized communities points to a real problem of the contemporary humanities department: the loss of audience, and with it, the loss of a sense that the work should matter to a larger, educated, non-academic audience.
There’s no doubt that scholars have produced critical work in humanities subjects that does matter. I think of Raymond Williams, never the easiest of writers, but one who rewards the effort made in engaging with his work and who, perhaps because of his quasi-academic status, writes in such a way that his ideas could be understood outside the academy. I also think of John Berger, Susan Sontag and Fredric Jameson. These are writers who can be exciting for readers coming from outside the academy, and who can influence the way readers experience texts, and even life.
However, with the increasing professionalization of the university, the potential audience for scholarly work has diminished as scholarly writing has become more specialized and jargon-ridden. None of what I say is news, I know. But the creative doctorate as an approach to making scholarly research and thinking matter in the world is news, and very good news.
I think it is important that what we do, literary scholars and creative writers both, makes a difference to how people outside academy walls think. In the history of rhetorical theory, there is a recurring, commonplace idea that the person trained in rhetoric will, through the ethical training in that discipline, constitutionally be able to contribute only to good actions or ideas that improve the state or the community. Cicero put the idea most succinctly, and most famously, in his definition of the ideal orator/citizen as “the good man speaking well.” Learning to “speak well,” however, required years of intense training in the minutiae of the discipline, a close study of the history of oratory.
While this ideal resolutely -- and somewhat courageously -- ignores what we know about human behavior, I do think that as an ideal it offers an important model to live up to. I see the Ph.D. in creative writing as an opportunity to undertake the same kind of close study of literature and writing as ancient rhetoricians would have undergone in their study of oratory, and as a way to position myself to bring that knowledge and experience into both my writing and the classroom without having to give up, or shelve for a long period, my creative work. In fact, it was in support of my creative work that I took up doctoral study.
Cicero’s formulation of the ideal citizen leads me back to my own ideals about the creative dissertation. The creative writer makes a difference not by telling people how to vote, or by engaging in the public sphere with anti-government screeds. Rather, the way literature can matter is by offering a model of the world as it is, in the hope that readers will be moved to action in the real world. Literature is a form of epideictic rhetoric, perhaps the form par excellence of the epideictic: a poem or a novel or a film argues for the values that its authors believe are important to the way we live our lives.
For example, Bertolt Brecht, in his essay “The Modern Theater is the Epic Theater,” makes a list of what it is that epic theater does. According to Brecht’s list, the epic theater:
turns the spectator into an observer, but arouses his capacity for action
forces him to take decisions
[provides him with] a picture of the world
he is made to face something
… brought to the point of recognition
I read Brecht’s description of epic theater’s functions as a modernist reworking of the ideal orator tradition, the tradition of the artist offering his readers more than polemic -- offering his readers an experience from which they can learn about their own lives.
The creative Ph.D. is vital to making this possible, if it is possible, because literature (any art, in fact) does not come from nowhere. Or, more importantly, it should not come from nowhere. Good writing comes from intense study and reading, the kind of reading that people don’t typically have time for in the frenetic world of contemporary business or the professions. Moreover, what I would call good writing, the kind of writing that, regardless of genre, has something in common with Brecht’s epic theater, requires its author to have a sense of its location between the past and the present.
The Ph.D. in creative writing gives writers the time and training to explore their fields that they may not get in M.F.A. programs, no longer get as undergraduates, and certainly do not get in high school. At the very least, doctoral work exposes writers and artists to a liberal education that prepares them for analyzing, framing and being in the world in any number of different ways. Doctoral-level reading, doctoral-level thinking, doctoral-level writing will make possible the art that creative Ph.D.s will produce. I think here of Flaubert’s quip that, in preparation for Bouvard and Pécuchet, he had to read 300 books to write one (though the reference might cut both ways, as Julian Barnes has described that book as challenging in being “a vomitorium of pre-digested book learning”). I could call on Matthew Arnold and T.S. Eliot as well, were I eager to lay myself open to misguided charges of cultural conservatism.
But the human need for learning through art goes beyond liberal or conservative approaches to writing and teaching. The experience of literary study at the highest level gives writers the cognizance of literary history they need to produce the epic theater, the epideictic, of our time -- to be good men and women speaking well, writing well, leading and teaching.
The issue for creative writing is that of quality. The value of the creative doctorate is in the opportunity it offers to unite the best elements of the scholarly study of literature or art with the best elements of the study of craft. The writers and artists who come out of creative Ph.D. programs will not only be better guardians of our various and multiform cultural heritage, but they will be better teachers, better thinkers, better innovators. Their research and learning, in the form of creative and critical work, will matter both in the academy and beyond.
In her column, Perloff poses the rhetorical question of where the doctorate in creative writing leaves the idea of the doctorate as such. “Hasn’t the doctorate always been a research degree?” her concerned professor asks in the face of invading creative writers. Yes, it has been, and for creative writers, it remains vitally so.
David Gruber is assistant to the director of the University Writing Program and a graduate teaching assistant in English at the University of Denver.
The title of this column is the title of a manuscript three of us dreamed up some eight years ago. I liked the "how-I-spent-my-summer-vacation" jangle of the words, suggesting something at once so obvious as to be dumb and so dumb as to seem clever. Narrative essays on how a group of people actually wrote their dissertations! Who would have thought? And yet, who could not have thought? The very idea seemed to fit into a mood of exploring all sorts of unconsidered academic practices, a few seemingly invisible.
So we drew up a call for papers. Meanwhile, my two colleagues set about writing their own narratives, as we all canvassed our friends. Gradually, contributions appeared. Organizing principles took shape. Editing began. We actually had a manuscript! Not all of the contributions were as strong as we'd hoped. But most were. And at least the whole didn't suffer from a problem I had been warned plagues all essay collections: sounding as if each essay has been written in the same voice.
Finally, the existential moment drew nigh -- the pitch to a publisher. I began with one whose senior editor I chanced to know. He called for the manuscript, he secured a reader. Was our idea actually going to see the light of published day? Could the process be so smooth? Alas, no. The reader was cool. The idea, it seemed, was interesting. But not all the individual contributions were up to it (excepting a couple of the ones I thought weakest, though including a couple I thought strongest). Worse, the manuscript needed the sort of heft that can only be provided by big names.
This last objection especially maddened me. A section of our introductory rationale explicitly addressed this question. None of us believed in big names for this project because writing a dissertation abides in the profession as something you do in order to get past it (and ideally on to the next stage, publication as a book). The only people who would be interested in writing about how they wrote their dissertations would be people who were not destined to be "names."
The subsequent fate of this manuscript is simply told. It never got published. It never even got a reading from another publisher. Was our pitch letter unsatisfactory? Was the whole idea just a non-starter? In my pitch experience, you never know why a publisher's door doesn't swing open. Your manuscript is "just not right for our list." This is usually as specific as a letter of response will be, although sometimes there will be something additional about financial exigencies, worthy manuscripts, and the parlous state of academic publishing today, not to say life itself.
I tell this story for a complicated knot of reasons, having to do with a belief in the power of narrative, a horror of wasted effort, and an acquiescence to the enduring prospect of rejection in professional life. The nice thing about writing a dissertation -- as opposed to writing about writing it -- is that it appears at first to swing free of any of these things, beginning with the fact that nobody ever reads an account of not successfully writing a dissertation; to write one is perforce to complete it -- and to defend it successfully and finally to receive the doctorate.
What if you fail, and then attempt to write about it? Does anybody actually do this? Whether or no, good luck trying to publish it. Bad enough to try concerning a successful dissertation. Although an account of an unsuccessful one might reveal more about the conditions of writing a dissertation in the first place -- according to a logic whereby failure (or defeat) reveals more about success than success (or victory) itself -- the whole power of the disciplinary narrative embedded in the dissertation is that you complete it, period. Then, perhaps, an individual story begins, albeit again one only possible to relate as a story of success; "How I Wrote My Book," though, is less promising a title than "How I Wrote My Dissertation."
Yet I continue to believe that a narrative -- carefully conceived, creatively organized, and searchingly set out -- about virtually anything possesses an undeniable power of its own. Moreover, some of the best narratives have to do with subjects heretofore disdained, marginalized, or suppressed. Within academic life, a narrative of how you wrote your dissertation constitutes, I think, one of those subjects. How else to demonstrate why to date the story of actual dissertation writing appears to be such an unworthy one?
It's long been a fancy of mine that anything to do with dissertations participates very deeply and mysteriously with waste. Even to complete one efficiently is to have had to keep at bay all manner of false starts, misconceived research, sloppy organization, and other things dissertated flesh is heir to, including inflexible dissertation committees and absent dissertation directors. It's as if to begin in the first place is to have to ignore all this. Many can't. These include people who get to dissertation stage and stop as well as those who never get started.
Another fancy: How I Wrote My Dissertation failed as a project in part because it aimed to explore the waste implicit in writing a dissertation. This was not our intention. (Nor was it the purpose of any of the individual essays.) Yet one reason the very subject appears unworthy is because it cannot avoid bringing to light factors that the profession prefers be suppressed. These include everything from how much time the writing of a dissertation actually takes to how idle is the relation between the completed doctoral degree and a job -- any job.
Writing a dissertation is of course in large part a ritual. It was a ritual when the research it takes to write one could still be expected to inaugurate a scholarly career. Today, when even those who still have some legitimate claim to such a career (because of their institutional pedigree or the disciplinary networks of their directors) can easily wind up as adjuncts, the research seems more hollow than ever. How I Wrote My Dissertation becomes Why I Wrote My Dissertation -- and the reasons emerge as so individual or distinctive (at least this was so in our collection) that ritual efficacy itself is threatened.
Everybody in higher education has an investment in maintaining this efficacy, which is ostensibly so crucial that it cannot be exposed to the vicissitudes of personal experience, as any personal narrative is bound to do. Indeed, personal experience lies at one end of a division encompassing the whole of academic life, at the other end being impersonal professional authority. This authority can of course be questioned -- and personally -- at many levels. But there are levels below which no questioning goes.
A dissertation apparently occupies one of these levels. We don't care Why I Wrote Mine because we care so much instead about the dissertation itself -- whether as the means of authorized entry into a career in higher education or just as a criterion for sorting out prospective adjuncts in terms of their highest degrees. To care about the dissertation is not to care why you or anybody else either did or didn't write one. To care about the dissertation means to believe that even the individual waste involved in writing one can be in some way recuperated.
Curiously, not one of the contributors to How I Wrote My Dissertation would disagree with the last statement. (As I once secretly hoped a few would.) To each, writing a dissertation was worth it, even if it took too long, cost too much, and did or didn't matter with respect to a job. Yet, alas, in the public forum that only publication can command, everybody got rejected together anyway. This brings me to a final point: rejection itself. You've got to be prepared for it in professional life -- the article you can't get published, the class with which you can't connect, the tenure you are denied, the position for which you did not get an interview. Arguably, in the construction of a career, the dissertation represents its initial moment, because a dissertation can be rejected.
How I Wrote My Dissertation didn't -- or doesn't -- disturb this moment. And yet in presuming to tell a group of individual stories of how dissertations were accepted, the manuscript does implicitly comport with another story, about how each one could have been rejected. Once more, I think, it is apparently central to the profession that the actual basis of rejection or acceptance not be explored too closely, lest the line between the two grow indistinct or arbitrary. (Was this why the publisher's reader called for narratives of "names," as if to guarantee the boundary?) Part of caring about the importance of a dissertation means upholding both the standards it presumes and the integrity of these standards.
Nobody wants to hear about rejection. Not only because it is always judged to smack of "sour grapes," but because virtually each time rejection threatens to edge up uncomfortably beside acceptance -- and then, although all is not lost, much might well become confounded. The profession after all is full of people who have been rejected in some significant way. (Or in the case of people who choose not to attempt to write a dissertation, effectively self-rejected.) We teach right alongside them. They are part of who we are. No, they are who we are, whether, for starters, we have written dissertations or not. But we don't know many of their -- our -- stories, especially those that courted, or continue to court, rejection.
I've lost touch with the majority of the contributors (and one of the editors) to How I Wrote My Dissertation. I don't know if the rejection of our manuscript bothers any of them; most of the rest I do know seem to have forgotten about it or at least don't bring it up. Why bother? Anyway, in academic publishing, collections of essays especially constitute a crapshoot. (At the moment, I have, let's see, four essays out with four proposed collections, and haven't so much as heard from the editors of two of them for a couple of years.) You lose, you move on. What else to say? Not all rejection is worth pondering. Not all rejection is worth narrating.
There are two reasons, though, for offering this story of mine. One is that the subject of this rejection marks perhaps the profoundest disconnect in higher education between a professionally authorized project (writing a dissertation) and a personally imagined one (writing about how you wrote it). The second reason follows from the first: Anything to do with dissertations -- ranging from how their content has changed and how they are monitored through what functions they serve -- occupies one of the great mystified spaces. It is mystified because it is uncontested. And it is uncontested, I believe, because it is still not subject to narrative.
People often ask me why I serve on so many committees. I usually tell them a story about my grandfather. When I was a young child, I often saw him in a T-shirt that read, “Don’t ask me, I’m not on a committee.” Beneath this motto was a trail of enigmatic paw prints. To my young eyes, the paw prints seemed to indicate a level of playfulness and mischief, but also perhaps an element of dehumanization. Even before I knew what a committee was, I made up my mind that I wouldn’t make the same mistake. I would be on a committee.
For most of my life, this thought remained dormant. All that changed, however, when I finished my M.A. at Chicago Theological Seminary, having submitted a translation and commentary on an essay Derrida added to the French edition of The Gift of Death, and made the decision to stay for my Ph.D. as well. Perhaps unexpectedly, given the connotations that a “seminary” calls to mind, my motive in staying there was the intellectual freedom provided by the interdisciplinary Ph.D. program, which would allow me to pursue my interest in contemporary continental philosophy and to seek out resources in the Christian tradition that would resonate with the interest in St. Paul shown by Badiou, Agamben, and others. (My interest has since shifted somewhat, but of course that is one of the benefits of intellectual freedom.)
Having made a significant commitment to the institution, I decided that I would become more involved. The easiest way to do that seemed to be to volunteer as a student representative to Academic Council. I was one of several student representatives, and though there was a place on the agenda for us to bring up matters of student concern, we most often had very little to contribute. I attended very faithfully, though, as a way of getting a feel for how faculty self-governance works in an independent seminary.
The following fall I signed up for a second term on Academic Council. Starting the previous spring and continuing into the fall, there was considerable controversy at the seminary over the decision to convert student housing into a commercial rental property, and I worked with some of my fellow students to put together an “open letter” to the Board of Trustees from the student representatives to Academic Council and the leaders of student groups, expressing our concern about the situation. Because of my involvement, the dean named me as one of two student representatives to the Academic and Student Affairs Committee of the Board of Trustees.
My service on Academic Council also made me eligible to serve on the search committee for an open faculty position in New Testament. That same year, I began a two-year term as the seminary’s student liaison to the American Academy of Religion, which required submitting various reports and -- of course -- serving on a committee at the national meeting, which that year largely served as an opportunity for us to ask a high-ranking administrator in the academy questions about the organization and its future.
As I reflect on the events of the last year, then, one thing seems pretty obvious: I’ve served on a lot of committees. Now that I am making the transition toward my comprehensive exams and dissertation, I am planning on retiring from student leadership (with the exception of serving the remainder of my term as student liaison), and this seems like an appropriate time to reflect on what I’ve done in the course of serving on these committees. First, I’ve become acquainted with some of the routine tasks of faculty self-governance and with the role of the board of trustees. I couldn’t have chosen a better time to be involved -- the seminary was in the process of adopting a new strategic plan and going through its periodic re-accreditation. I’ve also served my primary professional organization at the national level and seen a faculty search from the inside. The search committee in particular was truly a great opportunity for me. I got to look through applicants’ files, giving me a chance to see what kinds of qualifications applicants for a competitive position generally have, to assess what seemed to be effective cover letters, and to see what kinds of things recommenders say. Beyond that, I was able to sit in on a few informal interviews with our most promising candidates at the national meeting of the American Academy of Religion and the Society of Biblical Literature.
All of this was very valuable experience, and although it sounds like a lot of work, it really wasn’t. Much of the actual decision-making, for both the faculty and the board, took place in the closed executive sessions. Thus the responsibilities of students, and so also the expectations of outside preparation work, were limited: Our primary role was to allow student voices to be involved in the conversation. Even at the peak of my involvement, I was averaging under two hours a week, and most of the time it was considerably less. Since I was in my coursework stage, I was normally on campus anyway on the days when the committees met.
Several of my fellow student representatives complained that Academic Council seemed to be a waste of time because we never “did” anything, but I came to view it as a kind of informal apprenticeship, somewhat similar to the two teaching assistantships that I held that same year. Unlike at some institutions, where the TA is expected basically to teach the entire class, I served as a true assistant, taking care of grading and other clerical tasks and also attending all the class sessions. At first, I viewed the class sessions as a boring ordeal, but gradually my perspective changed and I realized that it was a great opportunity to shift my focus away from the course content and observe the professor’s teaching style -- what works, what doesn’t, what I’d want to adopt, what I’d do differently.
In addition to providing a service to the professor, then, the teaching assistantship helped me to shift gradually from the mindset of a student to that of a teacher. Most grad students are aware of the need for this process, but few seem to be very conscious of the fact that teaching and research are not the entirety of what an academic does -- the nuts and bolts of administration are a major factor as well. Certainly few pursue an academic career because they want to do committee work, but it is an integral part of what it means to be part of a self-governing faculty. Taking the opportunity to participate, by necessity largely as an observer, in the various administrative processes was a very helpful way of getting a realistic view of what the professional life of an academic is really like.
A big part of that for me was simply observing how much time faculty had to devote to meetings and to preparing for them, particularly during the re-accreditation process. Perhaps more important, though, was the kind of informal “ethnography” of committees that I developed over time -- the politics of what is said and what remains unsaid, the role of the moderator in keeping the meeting moving and setting the tone, and a whole variety of other factors that an observer is able to pick up on in a way that someone suddenly thrown into the midst as a more active participant might not be able to.
Above all, I became convinced that patience and a sense of humor are the most important qualities to have in committee work. Patience allows one to see the value of periodic meetings as a way of checking in and making sure that even matters that might be taken for granted are explicitly addressed -- that is, to appreciate how regular committees help keep things functioning smoothly, and how not having to “do” anything can often be a positive sign. A sense of humor helps maintain that patience by keeping what can easily become a tedious process from growing too burdensome.
For my part, I often found humor in observing the small details of what was going on -- the way that certain seemingly simple decisions could be indefinitely deferred, the people who seemed to enjoy the sound of their own voice and the people who made it their goal to say as little as possible in each meeting, the occasional surreptitious piece of reading material smuggled into the meeting. Much of the time, these small observations served only as an occasion to chuckle to myself, but on rare occasions, I have experienced moments that approach the sublime.
The best such moment came in the course of the meeting of the student liaisons at the AAR. The administrator who had come to our meeting was discussing concerns about how certain institutions were conducting their interview processes, including meeting in inappropriate settings, asking inappropriate questions (particularly about sexual orientation), and basically engaging in a wide panoply of inappropriate behaviors. He assured us all that the AAR was doing everything possible to crack down on such behavior among the users of its job listing service, and speaking on behalf of the AAR more generally, he said, “We are committed to being appropriate.”
“We are committed to being appropriate” -- it is a line I have treasured in my heart and meditated upon ever since. Perhaps I should make a T-shirt.
The first anthology of criticism I read in college was a low-budget volume edited by David Lodge entitled 20th-Century Literary Criticism. It was for an undergraduate class, the first one that spotlighted interpretation and opened a window onto graduate topics. A year later, this time an M.A. student at the University of California at Los Angeles, I took a required course on literary theory, with the anthology Critical Theory Since Plato (1971), edited by Hazard Adams. In a seminar not long after, we toiled through Critical Theory Since 1965 (1986), edited by Adams and Leroy Searle, and another class selected Contemporary Literary Criticism: Literary and Cultural Studies (1989), edited by Ron Schleifer and Robert Con Davis. After I left graduate school, more literary/cultural criticism anthologies appeared along with various dictionaries and encyclopedias. The process seems to have culminated in The Norton Anthology of Theory and Criticism (ed. Vincent Leitch et al.), whose publication in 2001 was momentous enough to merit a long story by Scott McLemee in The Chronicle of Higher Education that included the remark, “An anthology stamped with the Norton brand name is a sure sign of the field’s triumph in English departments.”
For McLemee to speak of “stamping” and “branding” was apt, more so than he intended, for every anthology assigned in class carries institutional weight. From the higher floors in the Ivory Tower, anthologies may look like mere teaching tools, and editing them amounts to service work, not scholarship. But while professors may overlook them except at course adoption time, for graduate students just entering the professional study of literature and culture, anthologies serve a crucial guiding function. Students apply to graduate school in the humanities because of their reading, the inspiration coming usually from primary texts, not critical works -- Swift not Barthes, Austen not Bhabha. They go into English because they like to read novels, or history because the past intrigues them, or philosophy because they want to figure out the great questions of life. Soon enough, they realize that joy, appreciation, moral musing, and basic erudition don’t cut it, and the first year or two entails an adjustment in aim and focus. The discourse is more advanced and specialized, critical and ironic. New and exotic terms emerge -- “hyperreal,” “hegemony,” “postcolonial” -- and differences between contemporary academic schools of thought matter more than differences between, say, one epoch and another.
Fresh students need help. What the anthologies do is supply them with a next-level reading list. The tables of contents provide the names to know, texts to scan, topics to absorb. In spite of the radical and provocative nature of many entries, the volumes mark a canon formation, a curriculum-building activity necessary for doctoral training. Plowing through them is not only a course of study but also a mode of professionalization, a way to join the conversation of advanced colleagues. As tutelage in up-to-date thinking, they strive for coverage, and to help young readers take it all in, they arrange the entries by chronology and by different categories. The Norton, for instance, contains an “Alternative Table of Contents” that divides contributors up by 42 classifications including “The Vernacular and Nationhood,” “Gay and Lesbian Criticism and Queer Theory,” and “The Body.”
As a poor and insecure 25-year-old in the mid-80s, I slogged through the selections one by one, and I thought that completing them would acquaint me with every respectable and serious current thread in literary and cultural thinking. But when I look back at them today, the anthologies look a lot less comprehensive. In fact, in one important aspect, they appear rather narrow and depleted. The problem lies in the sizable portion of the contributions that bear a polemical or political thrust. These pieces don’t pose a new model of interpretation, redefine terms, outline a theory, or sharpen disciplinary methods. Instead, they incorporate political themes into humanistic study, emphasize race/class/gender/sexuality topics, and challenge customary institutions of scholarly practice. When they do broach analytical methods, they do so with larger social and political goals in mind.
The problem isn’t the inclusion of sociopolitical forensics per se. Rather, it is that the selections fall squarely on the left side of the ideological spectrum. They are all more or less radically progressivist. They trade in group identities and dismantle bourgeois norms. They advocate feminist perspectives and race consciousness. They highlight the marginalized, the repressed, the counter-hegemonic. And they eagerly undo disciplinary structures that formed in the first half of the 20th century.
Reading through these titles (in the Norton: “On the Abolition of the English Department,” “Enforcing Normalcy,” “Talking Black,” “Compulsory Heterosexuality and Lesbian Existence,” etc.), one would think that all decent contemporary criticism stems from adversarial leftist impulses. There is nothing here to represent the conservative take on high/low distinctions, or its belief that without stable and limited cultural traditions a society turns vulgar and incoherent. Nothing from the libertarian side about how group identities threaten the moral health of individuals, or how revolutionary dreams lead to dystopic policies. The neoconservative analysis of the social and economic consequences of 1960s countercultural attitudes doesn’t even exist.
And yet, outside the anthologies and beyond the campus, these outlooks have influenced public policy at the highest levels. Their endurance in public life is a rebuke to the humanities reading list, and it recasts the putative sophistication of the curriculum into its opposite: campus parochialism. The damage it does to humanities students can last a lifetime, and I’ve run into far too many intelligent and active colleagues who can rattle off phrases from “What Is an Author?” and Gender Trouble, but who stare blankly at the mention of The Public Interest and A Nation at Risk.
This is a one-sided education, and the reading list needs to expand. To that end, here are a few texts to add to this fall’s syllabus. They reflect a mixture of liberal, libertarian, conservative, and neoconservative positions, and they serve an essential purpose: to broaden humanistic training and introduce students to the full range of commentary on cultural values and experience.
T.E. Hulme, “Romanticism and Classicism” (first published 1924). This essay remains a standard in Anglo-American modernist fields, but it seems to have disappeared from general surveys of criticism. Still, the distinctions Hulme draws illuminate fundamental fissures between conservative and progressive standpoints, even though he labels them romantic and classical. “Here is the root of romanticism: that man, the individual, is an infinite reservoir of possibilities; and if you can so rearrange society by the destruction of oppressive order then these possibilities will have a chance and you will get progress,” he says. The classicist believes the opposite: “Man is an extraordinarily fixed and limited animal whose nature is absolutely constant. It is only by tradition and organization that anything decent can be got out of him.” That distinction is a good start for any lecture on political criticism.
T.S. Eliot, “Tradition and the Individual Talent” (1919). Eliot’s little essay remains in all the anthologies, but its central point about the meaning of tradition often goes overlooked. Teachers need to expound why tradition matters so much to conservative thinkers before they explain why progressives regard it as suspect. Furthermore, their students need to understand it, for tradition is one of the few ideas that might help young people get a handle on the youth culture that bombards them daily and nightly. They need examples, too, and the most relevant traditionalist for them I’ve found so far is the Philip Seymour Hoffman character (“Lester Bangs”) in the popular film Almost Famous.
F.A. Hayek, The Counter-Revolution of Science (U.S. edition, 1952). Most people interested in Hayek go to The Road to Serfdom, but the chapters in Counter-Revolution lay out in more deliberate sequence the cardinal principles behind his philosophy. They include 1) the knowledge and information that producers and consumers bring to markets can never be collected and implemented by a single individual or “planning body”; and 2) local customs and creeds contain values and truths that are not entirely available to “conscious reason,” but should be respected nonetheless. Such conceptions explain why in 1979 Michel Foucault advised students to read Hayek and other “neoliberals” if they want to understand why people resist the will of the State. We should follow Foucault’s advice.
Leo Strauss, “What Is Liberal Education?” (1959). For introductory theory/criticism classes, forget Strauss and his relation to the neoconservatives. Assign this essay as both a reflection on mass culture and a tone-setter for academic labor. On mass culture and democracy, let the egalitarians respond to this: “Liberal education is the necessary endeavor to found an aristocracy within democratic mass society. Liberal education reminds those members of a mass democracy who have ears to hear, of human greatness.” And on tone, let the screen-obsessed minds of the students consider this: “life is too short to live with any but the greatest books.”
Raymond Aron, The Opium of the Intellectuals (English trans. 1957). Aron’s long diagnosis of the intellectual mindset remains almost as applicable today as it was during the Cold War. Why are Western intellectuals “merciless toward the failings of the democracies but ready to tolerate the worst crimes as long as they are committed in the name of the proper doctrines”? he asks, and the answers that emerge unveil some of the sources of resentment and elitism that haunt some quarters of the humanities today.
Francis Fukuyama, The End of History and the Last Man (1992). First formulated just as the Berlin Wall came tumbling down, Fukuyama’s thesis sparked enormous admiration and contention as the interpretation of the end of the Cold War. When I’ve urged colleagues to read it, though, they’ve scoffed in disdain. Perhaps they’ll listen to one of their heroes, Jean-Francois Lyotard, who informed people at Emory one afternoon that The End of History was the most significant work of political theory to come out of the United States in years.
Irving Kristol, Neoconservatism: The Autobiography of an Idea (1995). With the coming of the Bush administration, the term neoconservative has been tossed and served so promiscuously that reading Kristol’s essay is justified solely as an exercise in clarification. But his analyses of the counterculture, social justice, the “stupid party” (conservatives), and life as a Trotskyist undergraduate in the 1930s are so clear and antithetical to reigning campus ideals that they could be paired with any of a dozen entries in the anthologies to the students’ benefit. Not least of all, they might blunt the aggressive certitude of political culture critics and keep the students from adopting the same attitude.
David Horowitz, Radical Son: A Generational Odyssey (1997). Many people will recoil at this choice, which is unfortunate. They should not let their reaction to Horowitz’s campus activism prevent them from appreciating the many virtues of this memoir. It is a sober and moving account of America’s cultural revolution from the moral high points to the sociopathic low points. At the core lies the emotional and ethical toll it took on one of its participants, who displays in all nakedness the pain of abandoning causes that gave his life meaning from childhood to middle age. Students need an alternative to the triumphalist narrative of the Sixties, and this is one of the best.
Professors needn’t espouse a single idea in these books, but as a matter of preparing young people for intelligent discourse inside and outside the academy, they are worthy additions to the syllabus. Consider them, too, a way to spice up the classroom, to make the progressivist orthodoxies look a little less routine, self-assured, and unquestionable. Theory classes have become boring enough these days, and the succession of one progressivist voice after another deadens the brain. A Kristol here and a Hayek there might not only broaden the curriculum, but do something for Said, Sedgwick & Co. that they can’t do for themselves: make them sound interesting once again.
Mark Bauerlein is professor of English at Emory University.
Thirteen years ago I began graduate school, and 24 years ago I was commissioned a second lieutenant in the U.S. Army. Of the two institutions -- graduate school and the Army -- perhaps surprisingly, my military experience has been the more important in shaping my practices in the classroom. That may be because I teach survey courses at a community college rather than upper-level classes to interested majors at a research university. But it is also because the military has honed the delivery of training over many decades, and, as I’ve discovered, military training methodology can work well outside of a military environment.
Every year, the Army recruits, at great expense, tens of thousands of young men and women. Given the costs of recruitment (and the dearth of eligible recruits), the Army cannot afford to lose many of these new soldiers. Army training is designed to take recruits who may know nothing about military life, discipline, or maneuvers, and mold them into warriors. Likewise, my task is to mold nascent scholars out of the under-performing, ill-prepared students who frequently show up in my community college classroom. I’ve found three Army practices most useful: making expectations explicit, the “crawl-walk-run” methodology, and formal evaluation of training.
Too often, we as instructors fail to adequately communicate our expectations to our students. Yes, we want a five-page analytic essay, but what does that look like? What are the components of a successful paper? And how do those components fit together? What sort of material should students use as sources? And how will students be assessed on this assignment? The Army uses two tools to help its soldiers understand what’s expected of them in a specific task. First, an Army trainer performs the task correctly in front of soldiers so that they “see” what success looks like. In my classroom, students see -- when the assignment is given -- what success looks like. In the case of a formal essay assignment, I hand out a similar assignment that has received an A, and we, as a class, discuss what makes it worthy of an A. At this point, I also hand students a rubric that delineates exactly how I will grade the assignment.
After doing this, I deploy the Army’s second tool for communicating expectations -- a checklist to make sure that the task or assignment is completed properly. This list tells students exactly what they need to do to ensure their work meets the specifications of the assignment. Giving out a checklist may seem like it inhibits students’ creativity, and I would agree in part with this criticism. But my students are more likely to leave key components of a task out than they are to be extraordinarily creative -- and for me, making sure students have a “cheat sheet” that spells out how to meet the standard is a fair trade-off. My students need to build their self-confidence, and this checklist gives them that needed boost, visibly letting them know they are meeting the requirements of the class.
“Crawl, walk, run” is both a philosophical and practical approach to assignments that works as well in my college classroom as it did for small-unit and individual training in the Army. In terms of Army training, doing a task at “crawl” speed means moving slowly and methodically through all steps, perhaps using a sand table to show individual or small unit movement through a field problem. In the classroom, it may mean taking a class through the steps of a research assignment -- going to the library, using the search tools, writing a thesis statement, assessing primary sources, evaluating the utility of secondary sources, and preparing footnotes and a bibliography.
Once students understand the individual steps, then they are ready for the “walk” phase. In Army terms, the walk iteration means that soldiers perform the task on their own, at a slow speed, with careful evaluation by leaders. In my classroom, the walk phase usually means that students, working in groups, do several of the component tasks -- select or analyze sources, write a thesis statement, or outline an argument, for instance -- and then present their work to me and the rest of the class. This practice enables students to learn from each other’s work while allowing me to critique each group’s efforts in detail, so students also get the benefit of extensive feedback from me.
Now students are ready for the “run” -- performing the assignment to standard on their own. “Crawl, walk, run” methodology allows under-prepared students the chance to build necessary skills incrementally, and it allows students who are already proficient to focus on individual steps that they may not have learned as well. Programmed correctly within the context of the course, such a methodology can also enhance a student’s understanding of course content -- the crawl and walk phases can be used in earlier sections of the course so that students are working with different material each time while still honing academic skills.
Finally, the Army stresses constant evaluation of training effectiveness. Likewise in my college classroom, I constantly evaluate my own performance as well as that of my students. I evaluate myself in several ways -- evaluating questions I get from students to see what was unclear, actively soliciting feedback from students about what was effective and what could be restructured for clarity or efficiency, and asking trusted colleagues to critique both assignments and my classroom delivery. The Army taught me to have a thick skin, and I appreciate receiving constructive criticism from students and peers. That criticism helps shape my approach to assignments in future semesters.
None of these techniques were either implicitly or explicitly taught to me in graduate school. As a teaching assistant, I watched instructors craft the delivery of their course content, but think very little about how individual assignments fit into the broad goals of their courses. Lectures, textbooks, exams and papers were all components of the course, but how they meshed together was often not clear -- to me or to the students. This methodology may not make sense at colleges with exceptionally well prepared undergraduates. But at community colleges like mine, institutions that reach many students who either didn’t have great high school preparation or for whom it was a long time ago, the training methodology I learned in the Army can be invaluable. My first department chair at Suffolk Community College used to tell me and my colleagues that our real focus should be on the middle third of the class. These Army practices help me do just that by showing capable but under-prepared students methods of achieving success using methodical guidelines. And what they learn in my class about studying and preparing assignments they can use in future classes.
Martha Kinney is an assistant professor of history at Suffolk County Community College and a lieutenant colonel in the Army Reserves.
After Sidonie Smith, president of the Modern Language Association, took on the herculean task of asking the profession to rethink the shape of the dissertation, Arnold Pan at Post Academic took up the MLA’s call to respond. Among his suggestions was “legitimating non-academic options for Ph.D. students, beyond the more practical advice offered by the campus job center.” Much attention has also been devoted lately to what Bethany Nowviskie calls “#alt-ac,” the alternative academic track for humanities scholars.
But humanities education needs to do more than change the shape of the dissertation, legitimate non-academic jobs, or validate academic jobs that are not tenure-track teaching posts. The crisis in academic humanities, brought on by years of focus on nothing but turning out professor-wannabes, has to be addressed long before the job-placement stage. Long before the dissertation stage. We need to train Ph.D. students differently from the first day of graduate school.
If we value the humanities enough to teach them at the undergraduate level, if we believe that humanities education produces thoughtful, critical, self-aware global citizens, then we need to recognize that advanced training in the humanities cannot be simply the province of aspiring tenure-track faculty members. If there’s no prospect of a tenure-track job in the humanities, and humanities graduate programs train students for nothing but tenure-track jobs, how long can these programs be sustainable?
The current job crisis may be just the impetus graduate humanities education needs in order to recognize that what it has to offer is essential to this democracy, and essential to training leaders in a whole range of fields, far beyond academics.
As Pan, Nowviskie, and others point out, if most graduate programs devote a thought to “non-academic” careers for their Ph.D.s, they make very clear that there are indeed only two categories -- academic and undesirable, i.e., everything else in the entire world. It’s that everything else we should be addressing, though.
Among my own friends I count a director of a state humanities council, a director of a university women's center, and a director of a state Center for the Book. Graduate work in the humanities has been absolutely essential to each of those professionals, in the sense that they learned writing, research, and analytical skills that they use every day. They discussed values, ethics, and aesthetics, and they applied abstract theory to concrete texts. They learned to develop complex arguments, to balance competing claims, to present clear positions. Yet in none of their graduate careers did any of them get any acknowledgment that such preparation might be of use in any number of professional contexts. And in none of their graduate careers were any of them offered any coursework or workshops that focused on anything other than their academic disciplines.
What would a humanities Ph.D. program look like if it saw itself as preparing professional humanists rather than simply humanities professors? Courses from outside our departments could complement our intensive training in a chosen area of specialization. Deep work in a specialized area is most valuable, teaching us organization, research, writing, and often collaboration skills that are necessary in any humanities field. But how many of us, even in academic positions, would have benefited from a graduate course in organizational structures? In grant writing? In state and federal government? In arts administration?
A doctoral program that allows such courses to count toward the degree would be the stronger for it, I believe. If programs allowed two or three of these pre-professional courses in three years of coursework, the loss of discipline-based courses would be more than made up for by the benefits of increased job prospects. Students who didn't want the courses needn't be force-marched into them, but the humanities departments would need to endorse the new approach. That will be the tough part -- getting faculty who might be unaware of these humanities-based professional careers to steer students in this new direction.
And it’s not just coursework that should change. Graduate student employment would need a shift in emphasis as well. Most grad student work in the humanities is teaching and research assistantships, of course. These jobs are not designed to prepare graduate students for careers as faculty members, though; they’re designed to teach the undergraduates at a very low rate of pay. But there are other jobs at the university, jobs that are equally designed to exploit graduate student labor but that offer training in a bigger variety of skills. When I was in grad school, I did survey research in the school of education and taught outside my department, in both the journalism school and the business school. I had friends who worked in administrative offices in women’s studies and African American studies. A guy in my department worked in the university’s foundation office and eventually went there full-time.
The one thing that all those jobs had in common, however, was that my home department neither placed us in the jobs nor recognized that the jobs offered anything of value to a humanities degree. Imagine a humanities department that assembled a list of jobs from all over campus and asked graduate students to consider what they might learn from each. Or, even better, a department that asked its graduate students to compile an electronic portfolio that collected work from both humanities courses and graduate employment. The portfolio could include an essay in which the student reflected on the skills and knowledge he or she was acquiring and the ways those things might be useful after the degree. It wouldn’t have to be a job portfolio, but it would have to ask the student to think about what he or she was learning, beyond the theory and content in the discipline.
A humanities department that really saw the value in placing thoughtful, well-trained humanists in government, nonprofit associations, and even business or the military, could shape a graduate experience around the idea of the humanist at large. Such a direction need not, and indeed should not, be a separate track. These wider opportunities and broader coursework should be available to all humanities graduate students. How much better would academics be in our committee work or as department chairs or in national organizations if we had been prepared in our graduate programs for those parts of our jobs that did not revolve around research or teaching?
We are beginning to acknowledge that the graduate training we offer in the humanities is simply not fair to our students, the vast majority of whom will never get tenure-track jobs in their disciplines. But the worth of humanities graduate education need not depend on the number of tenure-track humanists it produces. Graduate education in the humanities is an excellent preparation for many, many careers. But our students should not have to find those careers on their own, and they should not have to think of those careers as “non-academic” careers -- the jobs we take when we can’t get the jobs we’ve been trained for. Humanities education needs to take itself seriously. We believe that undergraduate humanities programs produce thoughtful, informed, global citizens. Now we need to decide what we really want graduate humanities programs to produce.
Paula Krebs is professor of English at Wheaton College, in Massachusetts. She serves on the board of the Rhode Island Council for the Humanities.