Criticism, Character and Tenure

The word “criticism” shares the same root as “crisis” -- a bit of fortuitous etymology that everyone in literary studies remembers from time to time, whether in the context of sublime theoretical arguments (interpretation at the edge of the abyss!) or while dealing with the bottom-line obstacles to publishing one more monograph. Not to mention all the “criticism/crisis” musing that goes on at this time of year as people finish their papers for MLA, sometimes with minutes to spare.

Once this season of crisis management is past, I hope readers will turn their attention to Geoffrey Galt Harpham’s new book The Character of Criticism (Routledge). Harpham, who is president and director of the National Humanities Center, offers a meditation on what happens (in the best case, anyway) when a literary scholar encounters literary text. Most of the book consists of close examination of the work of four major figures -- Elaine Scarry, Martha Nussbaum, Slavoj Žižek, and Edward Said -- who bring very different methods and mores to the table when performing the critic’s task. The contrast between Nussbaum and Žižek, in particular, seems potentially combustible.

But the book is not a study in the varieties of critical engagement possible now, given our capacious theoretical toolkits. Harpham’s argument is that literary criticism is a distinct type of act performed by (and embodying) a specific type of agent. We don’t read criticism just for information, or to see concepts refined or tested. Criticism is, at its best, a product of “cognitive freedom,” as Harpham puts it.

“Interpretation represents a moment at which cognition is not absolutely bound by necessity to produce a particular result,” he writes, “...and this moment serves as a portal through which character, an individual way of being in the world, enters the work.”

In the week just before the MLA convention, I interviewed Harpham by email about his book -- a discussion that led, in due course, to asking him for his thoughts on the MLA's recent report on scholarship and tenure. A transcript of the discussion runs below.

But first I want to quote some favorite lines in The Character of Criticism. They appear in a section drawing out, at some length, the parallel between literary criticism and the kinds of responsiveness and responsibility before “The Word” one finds in, say, Saint Augustine.

“The act of writing a critical text,” as Harpham puts it, “reaches deep into oneself, testing one’s acuity, responsiveness, erudition, and staying power. But critical writing also tests attributes normally considered as moral qualities, including the capacity to suspend one’s own interests and desires and to make of oneself a perfect instrument for registering the truth of The Word.”

Easier said than done, of course. Harpham goes on to describe the obligations thus imposed on the critic, who fashions a new identity in the process. Here’s a passage in a format suitable to be printed out, clipped, and posted near one’s computer monitor for sober contemplation:

“One must .... wish to be regarded as a person who can overcome insubordinate impulses, remove clutter and distractions from the field of vision, isolate the main issues, set aside conventional views, persevere through difficulties, set high standards, see beneath appearances, form general propositions from particulars, see particulars within the context of general propositions, make rigorous and valid inferences from concrete evidence, be responsive without being obsessive, take delight without becoming besotted, concentrate without obsession, be suspicious without being withholding, be fair without being equivocal, be responsive to the moment without being indiscriminate in one’s enthusiasms, and so forth.”    --Geoffrey Galt Harpham

That final clause -- “and so forth” -- is really something. Talk about criticism and crisis! The prospect of adding more to that list of demands is either inspiring or terrifying, I suppose, depending on the state of one’s character....

Here's the interview:

Q: We use the word "character" as a way of talking about a fictive person. We also use it, when talking about real people, to refer to a definitive pattern of behaviors and attitudes (something durable, if not inflexible, about how they deal with other people). And then, of course, there's the old-fashioned, moralistic sense -- as in referring to someone "having character" or "being of weak character." When you write about the role of character in academic literary criticism, which of these usages fits best? Any secret yearning to be William Bennett motivating your work?

A: Since I’m talking about the character of criticism, your second version, the “definitive pattern of behaviors and attitudes,” is the most pertinent for my purposes. But the first usage, referring to fictive people, is also relevant, because fictive characters have to exhibit more consistency than real people, just in order to be recognizable from one textual moment to the next. I’m willing to entertain the possibility that the two are linked, that personal consistency is a self-imposed constraint or “fiction” that makes us recognizable to ourselves and others.

To me, the most powerful instances of criticism are those in which the drama of perception and understanding, which is also a moral drama in the broadest sense, is somehow visible in a shadowy way, encoded or encrypted in the critical text. I’ve always been struck by the fact that the criticism that impressed me most deeply managed to suggest an intimate encounter, even a kind of wrestling, between a strong, committed, informed, and responsive mind and a cultural text that probed and tested that mind, revealing its powers, limitations, and dispositions -- in short, its character. Part of the character of criticism is its capacity to reveal the character of the critic, even in ways the critic has no knowledge of. In fact, I think that criticism is, or can be, one of the most interesting ways of manifesting character.

Any yearning I had to be William Bennett was more than satisfied when I became president of the National Humanities Center: He was one of my four predecessors, before he went to Washington to serve in the Reagan administration. He is, however, interesting in terms of all three of your definitions of character. Because he does not display consistent behaviors (scolding people about their lack of moral strength on the one hand, compulsive gambling with horrific results in Vegas on the other), he has come to be seen as a kind of “fictive person,” one that exists only in books -- his books. Some people, inspired perhaps by those very books, might draw old-fashioned moral conclusions.

Q: Your first chapter has a long section describing a sort of ideal-typical "critical character" (so to speak) through an account of the act or process of critical writing as testimony to the power of a definitive encounter with a text. It’s powerful. But it’s also utterly inapplicable to an awful lot of critical prose one comes across, whether in academic books or journals or at sessions of MLA. The tenure-driven critical encounter often seems like an effort to apply some exciting new theoretical gizmo to a problem that would otherwise be uninteresting except as an occasion for trying out said gizmo. Or is that completely wrong? Is criticism as vocation (the response to a call) actually surviving amidst all the so-called "professionalization"?

A: I agree that the optimal “critical character” is rare, and for good reason. First, one has to be not only a critic, with a certain kind of education and professional opportunities, but also an unusually interesting person, one whose responses to the world are consistent, valuable, and meaningful, significant in a larger sense because they seem to proceed from some set of commitments and convictions rooted in human experience. Then, one has to be willing and able to expose oneself to a text, to respond without defensiveness, to be alive to a challenge. And lastly, one has to be able to write in a way that both adheres to professional decorum and does something more by giving the reader some sense of the experience of coming to grips with an object of great significance and value.

In addition to the critics I discuss in my book (Scarry, Nussbaum, Žižek, Said), I can think of a number of others, but really, it’s a wonder anybody can do this. Much of what goes on in the world of literary studies (including gizmo R & D) supports the very best work by providing a professional context for it. Such work can be honorable without being heroic; it can, of course, also be neither. But the best work is done by those who are personally invested in it. I think if more people felt this way about criticism, their work and even their careers would profit and the whole field would be more interesting.

I have learned a great deal about the profession of literary studies from studies of professionalization, but I do not think that criticism benefits from a heightened awareness among critics of their status as professionals. It’s a difficult situation. As marginal and undervalued as literary scholars are at most colleges and universities, they need to develop their own credentialing structures just to keep their sense of dignity intact. But nothing kills the authentic spirit of criticism faster, or deader, than a consciousness of one’s own professional circumstances. Criticism is a professional discourse, but the sternest test of criticism is whether it can communicate even its most refined or challenging thinking in the vernacular.

I would not call criticism a vocation in the Weberian sense; nor would I call it a calling, as if it were a summons you could not refuse without disgracing yourself or violating your own deepest nature. But the greatest critics, the ones who animate and advance the discussion, do seem to have a certain need or urgency to communicate in this form that comes from within.

Q: Well, I want to challenge you a bit on part of that last answer. "Criticism is a professional discourse," you say. But that calls to mind R.P. Blackmur's statement to the contrary: his definition of criticism as "the formal discourse of an amateur." He meant, among other things, that the critic's role was connected pretty closely to the activity of the artist -- that it is a loving ("ama-teur") participation in the making and assimilation of literary form (even if at a certain, well, formal distance). Besides, the idea that there is anything particularly academic about literary criticism is a very recent development in cultural history. In 1920, an English professor who wrote criticism was doing something a little undignified and certainly "unprofessional." So how is it that all of this has changed? Or has it? If asked to name a recent critic whose work really manifested a strong sense of character as you've described it, I'd tend to think of James Wood, who's never been an academic at all.

A: I'll push back a bit on that one, even if it forces me to defend what I have just criticized. Blackmur began his career in poetry and editing in the 1920s; his critical career was finished over a half-century ago. And he was unusual even in the company of amateurs who dominated the literary scene at that time, in that he did not have a B.A. Moreover, at the same time as Blackmur was advocating critical amateurism, John Crowe Ransom was writing "Criticism, Inc." (1938), an early manifesto for professional academic criticism. So even in Blackmur's time, his position was not the only, or even the dominant, position being enunciated.

I doubt that most people today would find criticism written in 1920 particularly interesting unless it was written by T. S. Eliot. Come to think of it, with the exception of Eliot's The Sacred Wood, I don't know of one durable, much less memorable, piece of criticism that appeared in that year. Modern literature (post-Wordsworth) was not taught in universities, and criticism was necessarily confined to newspapers and journals like Hound and Horn, Blackmur's own journal. The total situation today is different, and I don't think that we get a purchase on the present by reminiscing about the old days. Nor is James Wood an argument on your side. He is comfortable outside the academy, as is Louis Menand. But today, they're both at Harvard, Wood in a non-tenure-track position. They are part of the reason that (I contend) Harvard has, right now, the greatest English department ever assembled.

Universities provide jobs and -- in the case of Harvard -- ask little in return. Of course, the university does determine, in large ways and small, what goes on in criticism. Still, precisely because so little is explicitly demanded, an individual critic should find it possible to cultivate that "ama-teur" orientation that -- as I gather you feel -- is the precondition of character in criticism. If it were impossible, I would expect and even hope that talented young people would leave the profession (as I'll call it) in droves.

Q: The question of what counts as scholarship, and how it gets counted, is very much in the air now, given the recent MLA task force report. The four figures whose work you examine in The Character of Criticism (Elaine Scarry, Martha Nussbaum, Slavoj Žižek, and Edward Said) have produced work in the usual venues and formats of scholarly publication. But all of them have been active in other ways -- through public-intellectual commentary, but also as activists, at least to some degree. Can you draw any lessons from their examples that might be useful now, as other critics try to figure out how to respond to the felt need to change the circumstances of academic work?

A: This question approaches some very swampy ground, and my response may not get us on dry land altogether.

One easy response to the general problem you describe would be to declare that "the circumstances of academic work" have already changed, and that blogging, chatting, intervening in online discussions, and "public intellectual commentary" conducted in non-academic forums should be recognized by promotion-and-tenure committees as valid academic work, to be considered alongside books and articles in scholarly journals.

Even though this, too, is an easy response, I disagree. Universities pay you to do university work and they are not obliged to accept just any view of what counts. And, as an abstract proposition, it is important, both to oneself and one's readers, that one has established one's scholarly credentials before one weighs in. I say "as an abstract proposition" because I'm all too aware that our credentialing procedures, even at the very best universities, are, shall we say, non-ideal. But in theory the discipline and skills acquired in the course of mastering a certain body of knowledge and finding one's voice in an established discourse serve one very well. None of the people I discuss in my book were public intellectuals at the beginning of their careers, with the exception of Žižek, who was operating in a very different environment. Nor, for that matter, were Noam Chomsky, Stanley Fish, Walter Benn Michaels, Skip Gates, Paul Krugman, or even Michael Bérubé.

One may think that it's stifling to insist that gifted young people hold their tongues until they prove themselves to their elders, but I don't see it that way. They aren't holding their tongues; they're doing what they were hired to do, and what they presumably love doing; and in the process they are preparing themselves so that if and when they do speak out on public matters in a public forum, they speak with an authority gained over years of reflection on the archive of human creative accomplishment. A distinguished professor, enraged, is a force to be reckoned with.

I know that the real effects of tenure, from an institutional point of view, are to depress faculty pay and encourage people to serve on committees. But among its side effects is a certain measure of protection for people who exercise their freedom of speech in oppositional ways. In fact, I think that tenure imposes a certain burden on one's conscience to do what one can when the situation calls for action.

Q: OK, but the new venues and potentials for digital publication represent only one part of the changing circumstances in academic work. The task force addressed the larger question of what kinds of scholarly activity count for tenure. Any thoughts on the rest of the report?

A: I've thought about tenure a good deal, especially in 2000-1, when I headed a university-wide committee on faculty evaluations and rewards at Tulane. Tulane was a perfect place for this debate to take shape because it was not an elite institution, but routinely compared itself to Brown, Northwestern, Emory, Rice, and Vanderbilt. In other words, faculty were encouraged to think of themselves as serious researchers, even though most of them were not -- if they had been, the comparisons would have been more realistic.

What I found over the course of that year and a half was that the contemporary debate on tenure was being driven by a variety of forces, including state legislatures hostile to academia in general, conservative academics hostile to elite institutions, high-powered researchers at those very elite institutions, and a great many ordinary academics who were doing lots of committee work and teaching and wanted to be recognized, with promotions and salary increases, just like those who were publishing regularly. "Flexibility" was the key phrase: universities were encouraged to reward flexibility, as individuals realized themselves in their various ways. Our committee found several problems associated with "flexibility," each one of which we considered insurmountable.

The first was that it granted extraordinary powers to department chairs to work out individualized agreements with faculty members, and that was a recipe for corruption and cynicism. Second, it eroded faculty governance by making department chairs into members of the administration, rather than volunteers arising within the faculty. Third, it meant that the rank of professor at an AAU, Carnegie I institution would not mean anything in particular, and that would lead to a loss in status for all.

In principle, I was not opposed to "flexible" rewards for faculty, but I thought that each institution had to decide what it wanted to be, and how its faculty should be expected to think of themselves. At the top research universities, flexibility is a very bad idea: All faculty should be seen as having jumped over the same bars. At flagship state institutions, it's still a bad idea. But from there on down -- and at Tulane, one of the questions we had to face was exactly where we stood -- the issue was not so clearcut. Many colleges and universities may wish to reward superb teaching or loyal service to the institution with rank and salary increases.

The MLA recommendation that speaks most clearly to this issue is the one about the "letter of understanding" that institutions should issue to their faculty, outlining the expectations. But such explicitness would cause as much grief as it alleviated. It's a buyer's market for faculty, so lower-down institutions have a realistic chance to staff their faculties with Ph.D.'s from top-tier universities, and many do. These young stars may arrive still thinking of themselves as eminent-scholars-in-the-making. If they were given an official document stating that they were not to think of themselves in that way, it would have a demoralizing effect on them, their colleagues, and their students; it would be seen as a way of capping aspiration and upward mobility, and that would be inconsistent with the very idea of higher education.

If the letter of understanding outlined strict requirements for tenure and promotion, it would encourage precisely the wrong state of mind (checking the boxes) for real scholarship or intellectual inquiry. And if it said that there are many excellent self-realizing things you can do to be rewarded, then it would in effect abandon the very concept of "standards," and that, too, would be destructive.

Q: Whatever its potentially morale-killing effect, the "letter of understanding" would at least be explicit. Do you have an alternative in mind?

A: In a sense I do.

Each institution has to come to a rough understanding of itself, leaving enough room for anomalous individuals to be judged on terms appropriate to their contribution. I'm afraid there is no substitute for the act of judgment exercised case by case by people who are presumed to be competent. Though that presumption can be challenged in individual instances, it must be maintained, because it and it alone ensures faculty governance.

I speak from experience here. I -- like Martha Nussbaum, Louis Menand, M.H. Abrams, and many others -- was denied tenure (many years ago, at Penn), so I know how difficult it can be to maintain one's faith in the competence and judgment of one's betters. But the experience builds and tests character. Which is where we began, isn't it?

(A number of Harpham’s recent papers -- several of them overlapping with the themes of his new book -- are available here.)

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

A New Form of Academic Engagement

In her president’s column in the spring 2006 Modern Language Association newsletter, Marjorie Perloff focuses on the expansion of Ph.D. programs in creative writing (including doctorates in English that allow for a creative dissertation). Perloff argues that the growth of creative-writing doctorates was a reaction to politicization and specialization within the English discipline: “An examination of the catalogues of recently established Ph.D. programs in creative writing suggests that, in our moment, creative writing is perhaps best understood as the revenge of literature on the increasingly sociological, political, and anthropological emphasis of English studies.”  

She also cites recent job advertisements in English calling for candidates specializing in a range of theoretical approaches, which relegate the teaching of literature to “a kind of afterthought, a footnote to the fashionable methodologies of the day.”

Perloff is right on both counts: These are central factors that have led to the growth of creative writing Ph.D.s. But she also misses an important element, one that grows out of, but also underlies, the others. It is that people want what they think and write to matter, not just to their colleagues, but also to the world at large. Creative work, and the doctorate in creative writing, holds out this hope.

The doctorate in creative writing comes in various forms, but most are very similar to literary studies doctorates. I myself am a doctoral candidate in English at the University of Denver, writing a creative dissertation -- a book of poetry, accompanied by a critical introduction. As a graduate student, I’ve fulfilled the same coursework requirements as my literary studies peers, with the addition of four writing workshops. I’ve taken comprehensive exams in the same format as my literary studies peers. I’ve taught more or less the same courses as my literary studies peers. The only significant difference between my doctoral work and that of my literary studies peers is in the dissertation.

Sometimes, in fact, it strikes me as a bit comic to be doing the creative dissertation, but then I think about the fate of my work. I want my work to find its audience, though I realize that poetry has lost more readers, perhaps, than scholarship has over the last 50 years. Yet I believe that creative writing holds out more hope of finding readers, and of gaining readers back, than scholarship does. Hundreds or thousands of poetry books are published each year, and are more likely to find their way onto the shelves of bookstores than are scholarly studies. For fiction writers, the prospects are even better -- after all, there’s still a market for novels and short fiction.

However, it’s not just for readerly recognition that I want to do this creative work. It is because literature matters to how people live their lives, not just emotionally but intellectually. I speak here specifically of literature, but I think the principle holds true for any kind of creative work, even works we wouldn’t ordinarily think of as artistic, such as historical or psychological or anthropological studies.

Just a few days ago I was talking with a good friend of mine, a fellow graduate student working on her dissertation. My friend’s enthusiasm for the work and the discoveries that she is making, her eloquence on her subject, and her physical animation in talking about it were obvious, even if some of the nuance of her project was lost on me. But then she stopped herself and said, “Of course, nobody really cares about this.”

She described the frustration of talking about her project with non-academic friends and family members, how it takes too long to explain the work she is doing to people outside her specialty area, how their faces fall blank as she goes on too long in explaining the foundations of the debate in which she is involved. She laughed and said archly, “It’s not so bad once you get used to the idea that no one is ever going to read your dissertation except your committee.”  

I have had similar conversations with other friends working on dissertations, not just in English, but across the humanities, though the sense of writing into the void is particularly marked among those in my discipline. Let me say here that I don’t want to challenge the value of discipline-specific, specialized scholarship -- after all, it would be foolish to say that the intellectual work of teaching and writing does not require specialist knowledge, or that the ideas formulated in scholarly work don’t find their way to non-specialists through good teaching or through popularizers or public intellectuals, though we could stand a few more of them. Those academics who write for an extra-disciplinary audience, as Mark Oppenheimer pointed out in a recent essay in The Chronicle of Higher Education, play an important part in connecting the academy with the non-academic world, and shaping common conceptions of disciplines such as history. He wrote: “They have the influence that comes with writing for journals at the intersection of academe and the culture at large. They interpret scholarship for people who prefer to read journalism, and their opinions reverberate and multiply, if in ways that we cannot measure.”  

This is not a plea for greater “accessibility” or for a return to a “generalist” approach to English. Nor will I rehearse the yearly mocking that the titles of papers at the MLA convention get in major newspapers across the country.  But I do think that the sense that nobody’s listening or reading the work of scholars outside their specialized communities points to a real problem of the contemporary humanities department: the loss of audience, and with it, the loss of a sense that the work should matter to a larger, educated, non-academic audience.  

There’s no doubt that scholars have produced critical work in humanities subjects that does matter. I think of Raymond Williams, never the easiest of writers, but one who rewards the effort made in engaging with his work and who, perhaps because of his quasi-academic status, writes in such a way that his ideas could be understood outside the academy. I also think of John Berger, Susan Sontag and Fredric Jameson.  These are writers who can be exciting for readers coming from outside the academy, and who can influence the way readers experience texts, and even life.  

However, with the increasing professionalization of the university, the potential audience for scholarly work has diminished as scholarly writing has become more specialized and jargon-ridden. None of what I say is news, I know. But the creative doctorate as an approach to making scholarly research and thinking matter in the world is news, and very good news.  

I think it is important that what we do, literary scholars and creative writers both, makes a difference to how people outside academy walls think. In the history of rhetorical theory, there is a recurring, commonplace idea that the person trained in rhetoric will, through the ethical training of that discipline, be constitutionally able to contribute only to good actions or ideas that improve the state or the community. Quintilian, borrowing from Cato the Elder, put the idea most succinctly, and most famously, in his definition of the ideal orator/citizen as “the good man speaking well.” Learning to “speak well,” however, required years of intense training in the minutiae of the discipline, a close study of the history of oratory.

While this ideal resolutely -- and somewhat courageously -- ignores what we know about human behavior, I do think that as an ideal it offers an important model to live up to. I see the Ph.D. in creative writing as an opportunity to undertake the same kind of close study of literature and writing as ancient rhetoricians would have undergone in their study of oratory, and as a way to position myself to bring that knowledge and experience into both my writing and the classroom without having to give up, or shelve for a long period, my creative work. In fact, it was in support of my creative work that I took up doctoral study.  

This formulation of the ideal citizen leads me back to my own ideals about the creative dissertation. The creative writer makes a difference not by telling people how to vote, or by engaging in the public sphere with anti-government screeds. Rather, the way literature can matter is by offering a model of the world as it is, in the hope that readers will be moved to action in the real world. Literature is a form of epideictic rhetoric, perhaps the form par excellence of the epideictic: a poem or a novel or a film argues for the values that its authors believe are important to the way we live our lives.

For example, Bertolt Brecht, in his essay “The Modern Theater is the Epic Theater,” makes a list of what it is that epic theater does.  According to Brecht’s list, the epic theater:

turns the spectator into an observer, but
arouses his capacity for action
forces him to take decisions
[provides him with] a picture of the world
he is made to face something…   
brought to the point of recognition

I read Brecht’s description of epic theater’s functions as a modernist reworking of the ideal orator tradition, the tradition of the artist offering his readers more than polemic -- offering his readers an experience from which they can learn about their own lives.    

The creative Ph.D. is vital to making this possible, if it is possible, because literature (any art, in fact) does not come from nowhere. Or, more importantly, it should not come from nowhere.  Good writing comes from intense study and reading, the kind of reading that people don’t typically have time for in the frenetic world of contemporary business or the professions. Moreover, what I would call good writing, the kind of writing that, regardless of genre, has something in common with Brecht’s epic theater, requires its author to have a sense of its location between the past and the present.  

The Ph.D. in creative writing gives writers the time and training to explore their fields that they may not get in M.F.A. programs, no longer get as undergraduates, and certainly do not get in high school. At the very least, doctoral work exposes writers and artists to a liberal education that prepares them for analyzing, framing and being in the world in any number of different ways. Doctoral-level reading, doctoral-level thinking, doctoral-level writing will make possible the art that creative Ph.D.s will produce. I think here of Flaubert’s quip that, in preparation for Bouvard and Pécuchet, he had to read 300 books to write one (though the reference might cut both ways, as Julian Barnes has described that book as challenging in being “a vomitorium of pre-digested book learning”). I could call on Matthew Arnold and T.S. Eliot as well, were I eager to lay myself open to misguided charges of cultural conservatism.

But the human need for learning through art goes beyond liberal or conservative approaches to writing and teaching. The experience of literary study at the highest level gives writers the cognizance of literary history they need to produce the epic theater, the epideictic, of our time -- to be good men and women speaking well, writing well, leading and teaching.  

The issue for creative writing is that of quality. The value of the creative doctorate is in the opportunity it offers to unite the best elements of the scholarly study of literature or art with the best elements of the study of craft. The writers and artists who come out of creative Ph.D. programs will not only be better guardians of our various and multiform cultural heritage, but they will be better teachers, better thinkers, better innovators. Their research and learning, in the form of creative and critical work, will matter both in the academy and beyond.

In her column, Perloff poses the rhetorical question of where the doctorate in creative writing leaves the idea of the doctorate as such. “Hasn’t the doctorate always been a research degree?” her concerned professor asks in the face of invading creative writers. Yes, it has been, and for creative writers, it remains vitally so.

David Gruber

David Gruber is assistant to the director of the University Writing Program and a graduate teaching assistant in English at the University of Denver.

Double Haiku on MLA Interviewing

Drinking half moon night
Career candidates awake
Morning thwap, thwap, thwap

Twinkling here and there
Helicopters fly like stars
Of English commerce

Will Hochman

Will Hochman is associate professor of English at Southern Connecticut State University.

Critical Condition

One of the sadder comic novels I’ve ever read (and the qualities of humor and melancholia do tend to go together) is Wilfrid Sheed’s Max Jamison, which appeared in Britain in 1970 as The Critic. The title character is a prominent cultural journalist, and sometime university lecturer, who is at the peak of his career -- meaning it’s all downhill from there. And he knows it. He’s becoming a parody of himself. In fact, the process is more or less complete. He imagines one of his old professors saying, “Jamison has this rigid quality, sometimes known as integrity, sometimes known simply as ‘this rigid quality.’ ”

The novel is set during the high tide of the 1960s counterculture. But Jamison is much too cerebral to go hippy, even for a little while. He’s read too much, seen too many plays, made too much a religion of the Higher Seriousness to tune in, turn on, or drop out.

“He was in love with the way his mind worked,” the narrator says, “and he was sick of the way his mind worked. The first thing that struck you about it, wasn’t it, was the blinding clarity, like a Spanish town at high noon. No shade anywhere. Yet not altogether lacking in subtlety. Very fine filigree work in the church. This was the mind they were asking him to blow.”

By the end of the novel, Jamison carves out a niche in academe and gets bogged down while writing a book called The Fallacy of the Post-Modern. (That would have been pretty avant garde in 1970, not like now.) I’m told that it was once common knowledge, in certain circles anyway, that the book is based on the career of Richard Gilman, a professor of drama at Yale who died last year.

If Sheed's novel holds up remarkably well after almost four decades, though, it's not for any roman à clef revelations about a specific person. Max Jamison is the intimate portrait of a mind at the end of its tether -- a mind not quite willing (or able) to cut that tether, and so condemned to circle around and around, at whatever limit it can reach, thereby digging itself into a rut. This is not an uncommon situation.

Rereading the novel this week, I winced at one line in particular about Jamison’s routine as a cultural journalist: “He doggedly went on reviewing, getting better, he thought, in a field where improvement is seldom noticed.”

Sheed himself has written eloquent and sharp-eyed commentary on books. His instinct as a satirist is not savage; he feels compassion, and even some indulgence for Jamison’s self-pity. But the man does know how to land a dart beneath the skin.

Well, perhaps criticism is “a field where improvement is seldom noticed” -- but seldom doesn’t mean never.

Earlier this month, a party was held at a bookstore in downtown New York to announce the finalists for the National Book Critics Circle awards. The winners in each category will be honored when the final decisions are revealed at the awards ceremony on March 3. The event a couple of weeks ago was kind of a warm-up -- part press conference, part excuse for New York literary folk to drink and mingle.

The selection of finalists by the NBCC board -- narrowed down from a list of titles nominated by the organization’s 500 or so members -- sounds like a grueling process, so there was a lot of steam to blow off.

The evening was also the occasion for announcing this year’s winner of the Nona Balakian Citation for Excellence in Reviewing, named after a longtime editor at The New York Times Book Review. I received the award a few years ago (one of those rare moments when the tether seems to stretch a little bit). And so, for continuity’s sake, they asked if I would make the announcement. A few minutes before going to the microphone, I was handed a folded piece of paper that identified the winner as Steven G. Kellman. The name seemed vaguely familiar, but it took me a minute to place it.

“He’s in San Antonio?” I asked an NBCC board member. “A professor of literature?”

Yes, and yes. Small world! In the late 1980s, Kellman had been a contributor to a little magazine in Texas with which a mutual friend was involved. (I think I still owe them a manuscript.) Kellman has also co-edited Magill’s Literary Annual -- a useful work found in the reference section of any good university library. He recently published Redemption: The Life of Henry Roth (Norton, 2005), an acclaimed biography of the author of Call It Sleep, one of the classic American novels of immigrant life.

The full bibliography of Kellman’s work runs to appalling length. The list of his scholarly works alone would be impressive. Once you count his pieces for newspapers and magazines, the question of whether he can somehow write in his sleep does come up. As another Balakian winner who saw the list told me, “He’s reviewed more books than I’ve ever read.”

Since that announcement, I’ve spoken with Kellman by phone. We’ve also exchanged some e-mail. Perhaps I was expecting to interview the critic in Sheed’s novel -- someone glad to be honored, yet also a little tired of playing the game. But Kellman doesn’t sound that way. The man's energy level is alarming.

“Pauline Kael, who was honored by the NBCC with a lifetime achievement award," he said, "once said that she considered herself a writer whose subject happened to be movies.... What hooked me on bookery was the exhilaration of slinging words on the page and making them prance. The impulse is the same whether I am writing an academic monograph whose print run is in the low four figures or a guest column for Newsweek.”

He calls it “dispiriting” that “so many of those who profess literature, who have dedicated their lives to discovering and sharing the delicacies and intricacies of verbal art, display dull indifference to their own use of language.” The result, he says, is usually prose as succulent as a bowl of mashed turnips.

I asked him what models he followed in creating a style distinct from the monographic monotone.

One was Edmund Wilson, “the patron saint of public intellectuals, before that portentous term needed to be coined.” Others included Irving Howe, Susan Sontag, and George Orwell.

All of them being the usual suspects, of course. But one figure he mentioned as an inspiration did stand out: Leslie Fiedler, who was professor emeritus of English at the State University of New York at Buffalo when he died, four years ago this month. Fiedler won the lifetime achievement award of the National Book Critics Circle in 1998. (Again, small world.)

In his prime, Fiedler was an intellectual wild-man -- a critic who began writing about gay multicultural subtexts in classic American literature as early as 1948, when most of his readers thought he must be joking; who focused on the images and archetypes found in both serious literature and pop culture, as if both were necessary to understanding the human condition; and who started publishing work in the field known as disability studies before there was a field known as disability studies.

Kellman co-edited a festschrift called Leslie Fiedler and American Culture, published by the University of Delaware Press in 1999. As it happens, when that book appeared, I attempted to persuade the culture editor of an American magazine to let me review it. “Oh no,” she said, “we don’t want that. I mean, isn’t Leslie Fiedler crazy?”

Sure he was -- like a fox. Fiedler is one of those authors you can turn to for energy when your own is flagging. Kellman told me he was “inspired and humbled by the stunning example” of Fiedler’s criticism. He wrote “with zest, and not just about love and death in the American novel, as if that were not enough, but also about Dante, Shakespeare, science fiction, Siamese twins, and much else.” 

He also seems to have absorbed Fiedler’s willingness to address a large audience, whenever the chance presented itself. Aside from his literary journalism, Kellman has written a newspaper column (for which he received the H.L. Mencken Award in 1986) and reviewed more than a thousand films. I asked if such extracurricular efforts had caused him any trouble -- disapproving noises from colleagues, worries for the condition of his soul, etc.

“It doesn’t really come up,” he said. “Sometimes it’s as if each side doesn’t know about the other, since people in journalism don’t seem to pay attention to my academic work, either. I’d like to think that my journalism is made sharper and stronger by the discipline I come from. It seems to me like I benefit from being in both worlds. But they don’t really meet.”

Maybe that is for the best. It means nobody is asking him to make a choice between academe and the public sphere.

“I think living exclusively in either one,” he said, “would start to feel claustrophobic.”

Scott McLemee

A Dean's View of the MLA Report

In one of his Meditations, Marcus Aurelius, the Roman emperor and Stoic philosopher, wrote, "Time is like a river made up of events which happen, and a violent stream; for as soon as a thing has been seen, it is carried away, and another comes in its place, and this will be carried away too." This sense of being a part of a time of incessant change animates the 2006 report of the MLA Task Force on Evaluating Scholarship for Tenure and Promotion. The task force began its work in 2004, and its report is a rich, important document for anyone who wishes to reflect upon the contemporary rivers and streams of change in the academy.

I come to the report as a dean, specifically a graduate dean of arts and science in a large research university. Unlike Marcus Aurelius, I am no emperor. I find it a privilege to be a dean, even though the job has tempered my habitual optimism with stoicism. To oversimplify, the report treats the theme of change in the profession of modern languages and literature in three ways: the structural changes in United States higher education since World War II and their consequences for the humanities, especially for humanities faculty members; changes in the granting of tenure and promotion that people feared might happen but that seem not to have happened, at least not yet; and changes that ought to happen if the profession is to be wise, academically and socially useful, and robust.

Among the most important changes that the report explores is the well-documented rise of positions, full-time and part-time, that are off the tenure ladder. Tenure is increasingly limited to research universities and more affluent liberal arts colleges. Yet again, the rich are getting richer. As a dean, I miss in the report a passionate yet logical definition and defense of tenure that I might use for several audiences -- the tuition-paying students who quickly turn to instant messaging in a class taught by a member of the Dead Wood Society, the trustees who wonder why academics should have job security when almost no one else does. I can make such a defense, and have, but if tenure matters -- and an implicit conviction of the MLA task force is that it does -- then the defense must emanate from all of us who believe in it.

One pervasive anxiety explored in the report concerns a student’s life after the doctorate. The MLA report estimates that of every 100 English and foreign language doctoral recipients, 60 will be hired to tenure-track positions within 5 years. Of them, 38 will be considered for tenure at the institution where they were hired. Of them, 34 will be awarded tenure. The report, unfortunately, cannot say what happens to the 22 who leave the institution where they were hired before the tenure ordeal. In my experience, some get recruited to another institution. Some drop out because they believe they will not get tenure. Some take administrative jobs within higher education, and are judged as administrators, but still do vital scholarship and teaching. Some go on to non-academic careers, for which graduate school in the humanities still insufficiently prepares them.

As a graduate dean, even as I wonder about the 22 doctoral recipients who leave the institution that first offered them a tenure-track job and even as I celebrate the 34 who are awarded tenure, I feel that now-well-honed guilt, anger, and concern about the 40 who are not hired to tenure-track positions within 5 years. To be sure, some deliberately and happily choose not to go on in academic life, but others would prefer to become academics. Despite all the national studies, including this report, about the oversupply of doctorates in the humanities, self-interested, faculty-controlled graduate programs are still too reluctant to limit admissions, still suspicious about doing regional coordination of graduate curricula and courses, and still petitioning for more financial aid and more students to teach. It is vulgar to call this a case of “Bring in the clones,” but the phenomenon yet again reveals, I have sadly concluded, how much easier it is to act on behalf of one’s self and one’s family, here the department or program, than on behalf of more abstract and psychologically distant goods, here the well-being of potential graduate students and of the profession as a whole.

The MLA report’s signal contribution is the call, by an impeccable committee of leading humanists, for a serious rethinking of scholarship and scholarly inquiry, which would then have ramifications for the conduct of academic institutions. I can see nothing but good coming out of such a rethinking, to be undertaken both nationally and locally, faculty member by faculty member, department by department, and institution by institution, as each articulates its particular role in the academic and social landscape. These roles will and should differ. Each will be important. The royal road to national prominence can take a number of routes and be paved with a variety of materials -- from yellow bricks to high-tech composites.

More specifically, the MLA report urges us to ask why the monograph has become the pinnacle of scholarly achievement, “the gold standard.” Why not the essay, or a series of linked essays? Why not other forms of scholarly achievement? And why must the dissertation be a “proto-book”? Why indeed? Is there any other form that the dissertation might take? I once had a conversation with a leading Renaissance scholar shortly after I became a graduate dean. “What is the most important reform in graduate education?” I asked. “Change the dissertation,” she said. Surely what matters about the dissertation is less the exact format than a form that displays what this capstone activity must display: respect for past work coupled with originality, independence of thought, and the capacity for sustained inquiry. Rhetorical flair would be nice, too. I have also argued for some years that the humanities graduate curriculum needs a vigorous overhaul, offering more common courses that programs share, including some introductory courses that would comprise a general education for graduate education. Among them could be, at long last, a required course in the ethics and history of scholarship.

Moreover, because of new communications technologies, much scholarly inquiry is now being done digitally. Some of the most important work about and in digital scholarship is appearing from university presses, an invaluable resource that the task force correctly praises and for which it seeks more institutional resources. Yet many departments are clueless, all thumbs in the old-fashioned sense of the phrase, in doing evaluations of digital scholarship that respect peer review. Of the departments in doctorate-granting institutions that responded to the MLA’s survey, 40.8 percent report no experience evaluating refereed articles in electronic format, and 65.7 percent have no experience evaluating monographs in electronic format. This finding is similar to that of another useful study, here of five departments, including English-language literature, at the University of California at Berkeley. It concludes that what matters most in judging scholarship is peer review, but e-publishing is still tainted because peer review does not seem to have touched it sufficiently. Scholars are willing to experiment with digital communications. However, for nearly all, the “final, archival publication” must still appear in a traditional format. Only if faculty values change, the Berkeley report correctly suggests, will scholarly communications change. Deans may propose, but faculty actually dispose in questions of academic and curricular values.

The MLA report rightly argues that the academy tightly couples the canons of scholarly accomplishment with the awarding of tenure and promotion. In brief, a faculty member gets the latter if s/he respects the former. Even as the report asks for a re-evaluation of these canons, it offers a series of recommendations for administering a transparent, fair tenure and promotion process. For the most part, these are sensible, and indeed, I was surprised that they are not already installed as best practices at most institutions. Of course, where possible, institutions should give junior faculty start-up packages if they are to require research and publication. Of course, “collegiality” should not be an explicit criterion for tenure, because it might reward the good child and punish the upstart. However, a dean cautions, because tenure is forever, at least on the part of the institution, it is legitimate to ask how a candidate will contribute to the institution’s long-term well-being.

From this admonitory dean’s perspective, the report strays into boggy ground in its brief analysis of appropriate relations between someone up for tenure and the external letters that a tenure dossier now requires. “Candidates,” it states, “should have the privilege and the responsibility of naming some of their potential reviewers (we recommend half).” Candidates, the report further argues, should be able to exclude one or two figures whom they believe might be prejudicial. This is a really bad idea. If tenure candidates were to have this power, the dispassionate and collective objectivity that is the putative value of peer review would be lost, and self-interest would fill the vacuum. Moreover, the temptations of cronyism, which external letters were meant to squash but which still flourishes among tenured faculty, might appear in a junior guise, accompanied by various modes of ingratiation with the powerful in a field who might then write a sweetly affirming letter.

Strangely, sensitive though the MLA report is to the growth in the number of non-tenure track jobs, and to the meaning of this growth, it is less radical than it might be in imagining the role of full-time, non-tenured scholars within an institution. The report argues, “The dramatic increase in the number of part-time non-tenure-track faculty members puts increased demands and pressure on all full-time tenure-track and tenured faculty members in many areas for which the casualized work force is not -- and should not be -- responsible: service on department committees and in departmental governance; student advising; teaching upper-level undergraduate and graduate courses; directing dissertations; and, less concretely but no less importantly, contributing to intellectual community building in the department and outside it, in the college and university….” But surely a qualified non-tenured faculty member should be able to be a significant academic citizen. Surely the report does not mean to construct such a hierarchy of faculty members with the tenure-track faculty as the philosopher kings and queens and the non-tenure-track professors as credentialed drones. If the report had more fully defined and defended tenure, it might have explored more adequately the distinctions and the overlap between not having and having tenure.

Let me not end with caviling and quibbling, but instead reiterate my respect for the conviction expressed by the task force about the profession’s relation to change.  It concludes, “It is up to us, then, the teacher-scholars of the MLA, to become agents in our academic systems and effect changes that reflect and instantiate appropriate standards of scholarly production and equity and transparency for our colleagues, our institutions, and our society.” Or, if a mere dean might revise the language of both a strong committee and an emperor, we neither helplessly observe nor flaccidly drift in the rivers of time. We shape their banks. We dam them or divert them or find new springs with which to refresh them. We build our rafts of thought and boats of words and navigate them. Bon voyage to us all.

Catharine R. Stimpson

Catharine R. Stimpson is dean of the Graduate School of Arts and Science at New York University and a past president of the Modern Language Association.

Remember Baudrillard

A few days ago, I tried the thought experiment of pretending never to have read anything by Jean Baudrillard -- instead trying to form an impression based only on media coverage following his death last week. And there was a lot more of it than I might have expected. The gist being that, to begin with, he was a major postmodernist thinker. Everyone agrees about that much, usually without attempting to define the term, which is probably for the best. It also seems that he invented virtual reality, or at least predicted it. He may have had something to do with YouTube as well, though his role in that regard is more ambiguous. But the really important thing is that he inspired the "Matrix" movie franchise.

A segment on National Public Radio included a short clip from the soundtrack in which Laurence Fishburne’s character Morpheus intones the Baudrillard catchphrase, “Welcome to the desert of the real.” The cover of Simulacra and Simulation -- in some ways his quintessential theoretical text, first published in a complete English translation by the University of Michigan Press in 1994 -- is shown in the first film. Furthermore, the Wachowski brothers, who wrote and directed the trilogy, made the book required reading for all the actors, including Keanu Reeves. (It is tempting to make a joke at this point, but we will all be better people for it if I don’t.)

There was more to Baudrillard than his role as the Marshall McLuhan of cyberculture. And yet I can’t really blame harried reporters for emphasizing the most blockbuster-ish dimensions of his influence. "The Matrix" was entertainment, not an educational filmstrip, and Baudrillard himself said that its take on his work “stemmed mostly from misunderstandings.” But its computer-generated imagery and narrative convolutions actually did a pretty decent job of conveying the feel, if not the argument, of Baudrillard’s work.

As he put it in an essay included in The Illusion of the End (Stanford University Press, 1994): “The acceleration of modernity, of technology, events and media, of all exchanges – economic, political, sexual – has propelled us to ‘escape velocity,’ with the result that we have flown free of the referential sphere of the real and of history.” You used to need digital special effects to project that notion. But I get the feeling of being “flown free of the referential sphere of the real and of history” a lot nowadays, especially while watching certain cable news programs.

Some of the coverage of Baudrillard’s death was baffled but vaguely respectful. Other commentary has been more hostile – though not always that much more deeply informed. A case in point would be an article by Canadian pundit Robert Fulford that appeared in The National Post on Saturday. A lazy diatribe, it feels like something kept in a drawer for the occasion of any French thinker’s death – with a few spots left blank, for details to be filled in per Google.

A tip-off to the generic nature of the piece is the line: “Strange as it seems, in the 1970s much of the Western world was ready to embrace him.” Here, Fulford can count on the prefab implication of a reference to that decade as a time of New Left-over radicalism and countercultural indulgence. In fact Baudrillard was little known outside France until the 1980s, and even then he had a very small audience until late in the decade. The strong mood coming from most of Baudrillard’s work is that of bitter disappointment that oppositional social movements of earlier years had been neutralized – absorbed into academic bureaucracy and consumer society, with no reason to think that they would revive.

And if we are going to play the game of periodization-by-decade, well, it is perhaps worth mentioning that “much of the Western world was ready to embrace him" only after several years of watching Ronald Reagan -- a man whose anecdotes routinely confused his roles in motion pictures with actual experiences from his own life -- in a position of great power. The distinction between reality and simulation had been worn away quite a bit, by that point. Some of Baudrillard’s crazier flights of rhetoric were starting to sound more and more like apt descriptions of the actual.

Even then, it was by no means a matter of his work persuading university professors “that novels and poems had become irrelevant as subject matter for teaching and research,” as the macro setting for culture-war boilerplate on Fulford’s computer puts it.

Enthusiasm for Baudrillard’s work initially came from artists, writers, and sundry ne’er-do-wells in the cultural underground. The post-apocalyptic tone of his sentences, the science-fictionish quality of his concepts, resonated in ways that at least some people found creatively stimulating, whether or not they grasped his theories. (True confession: While still in my teens, I started writing a novel that opened with an epigraph from one of his books, simply because it sounded cool.)

Baudrillard’s work played no role whatever in the debates over “the canon” to which Fulford alludes. But he was, in a different sense, the most literary of theorists. He translated Bertolt Brecht, among other German authors, into French. Some of his earliest writings were critical articles on the fiction of William Styron and Italo Calvino. In 1978, he published a volume of poems. And a large portion of his output clearly belongs to the literary tradition of the aphorism and the “fragment” (not an unfinished work, but a very dense and compact form of essay). These are things you notice if you actually read Baudrillard, rather than striking po-faced postures of concern about how literature should be “subject matter for teaching and research.”

Besides, it is simply untrue to say that Baudrillard’s reception among American academics was one of uncritical adulation. If there was a protracted lag between the appearance of his first books in the 1960s and the dawn of interest in his work among scholars here in the 1980s, that was not simply a matter of the delay in translation. For one thing, it was hard to know what to make of Baudrillard, and a lot of the initial reception was quite skeptical.

In the mid-1960s, he became a professor of sociology at the University of Paris at Nanterre, but the relationship of his work to the canon of social theory (let alone empirical research) is quite oblique. It’s also difficult to fit him into the history of philosophy as a discipline. Some of his work sounds like Marxist cultural theory, such as the material recently translated in Utopia Deferred: Writings for ‘Utopie’ 1967-1978 -- a collection distributed by MIT Press, a publisher known, not so coincidentally, for its books on avant-garde art. Still, there is plenty in Baudrillard’s work to irritate any Marxist (he grew profoundly cynical about the idea of social change, let alone socialism). And he delighted in baiting feminists with statements equating femininity with appearance, falsehood, and seduction.

Baudrillard was, in short, a provocateur. After a while that was all he was – or so it seemed to me, anyway. The rage of indignant editorialists notwithstanding, a lot of the response to Baudrillardisme amounted to treating him as a stimulating but dubious thinker: not so much a theorist as a prose-poet. A balanced and well-informed critical assessment of his work comes from Douglas Kellner, a professor of philosophy at UCLA, who wrote Jean Baudrillard: From Marxism to Postmodernism and Beyond (Stanford University Press, 1989), the first critical book on him in English. Kellner has provided me with the manuscript of a forthcoming essay on Baudrillard, which I quote here with permission.

“So far,” he writes, “no Baudrillardian school has emerged. His influence has been largely at the margins of a diverse number of disciplines ranging from social theory to philosophy to art history, thus it is difficult to gauge his impact on the mainstream of philosophy, or any specific academic discipline.”

At this point I’d interject that his questionable position within the disciplinary matrix (so to speak) tends to reinforce Baudrillard’s status as a minor literary figure, rather than an academic superstar. Kellner goes on to note that Baudrillard “ultimately goes beyond conventional philosophy and theory altogether into a new sphere and mode of writing that provides occasionally biting insights into contemporary social phenomena and provocative critiques of contemporary and classical thought. Yet he now appears in retrospect as a completely idiosyncratic thinker who went his own way....”

Not that Baudrillard exactly suffered for going his own way, however. A self-portrait of the postmodern intellectual as global jet-setter emerges in the five volumes of his notebook jottings published under the title “Cool Memories.” You get the sense that he spent a lot of time catching planes to far-flung speaking engagements – not to mention seeing various unnamed women out the door, once they had been given a practicum in the theories worked out in his book De la Séduction.

Many of the writings that appeared during the last two decades of his life simply recycled ideas from his early work. But celebrity is a full-time job.

One offer he did turn down was the chance to do a cameo in one of the Matrix sequels. (Instead, it was Cornel West who did his star turn onscreen as a gnomic philosophical figure.) Still, the appearance of "Simulacra and Simulation" in the first film greatly increased the book’s distribution, if not comprehension of its themes.

According to Mike Kehoe, the sales manager for the University of Michigan Press, which published the English translation, sales doubled in the year following “The Matrix.” The book had often been assigned in university courses. But those sales, too, jumped following the release of the film.

Rather than indulging my own half-baked quasi-Baudrillardian speculations about how his theories of media metaphysics were reabsorbed by the culture industry, I decided to bring the week’s musings to a close by finding out more about how the book itself ended up on screen.

“It wasn’t the usual sort of product placement,” LeAnn Fields, a senior executive editor for the press, told me by phone. “That is, we didn’t pay them. It was the other way around. The movie makers contacted us for permission. But they reserved the right to redesign the cover for it when it appeared onscreen.”

The familiar Michigan edition is a paperback with burgundy letters on a mostly white cover. “But in the film,” said Fields, “it becomes a dark green hardcover book. We were quite surprised by that, but I guess it’s understandable since it serves as a prop and a plot device, as much as anything.” (If memory serves, some kind of cyber-gizmo is concealed in it by Keanu Reeves.)

I asked Fields if the press had considered bringing out a special version of the book, simulating its simulation in a deluxe hardback edition. “No,” she said with a laugh, “I don’t think we ever considered that. Maybe we should have, though.”

Recommended Reading: Mark Poster's edition of Baudrillard's Selected Writings, originally published by Stanford University Press in 1988, is now available as a PDF document. The single best short overview of Baudrillard's work is Douglas Kellner's entry on him for the Stanford Encyclopedia of Philosophy. There is an International Journal of Baudrillard Studies that publishes both commentary on his work and translations of some of his shorter recent writings.

Scott McLemee

Beatles vs. Stones

This morning I received an e-mail from a new colleague of mine about some workshop topics on writing. I met her last week at the massive Conference on College Composition and Communication (CCCC), in New York City. We’re both new members of the executive board of the Assembly for Expanded Perspectives on Learning (AEPL), a National Council of Teachers of English (NCTE) affiliate organization that is interested in promoting teaching and learning beyond traditional disciplines and methodologies.

She’s the recently elected associate chair trying to brainstorm some ideas for upcoming workshops and conferences. I’m the new treasurer trying to get my Excel columns to add up right.

At our meeting in the escalatored bowels of the Manhattan Hilton, the board agreed that the 2008 workshop would be titled “The Rhetorical Art of Reflection,” but in her e-mail today to me and the other board members, she suggested that the 2009 workshop might be on a topic related to the connections between music and writing.

This e-mail popped up as I was sitting here at my laptop in my university office listening to Van Morrison's album “Tupelo Honey” and writing copy for a Web site for our recently approved general education program and curriculum.

I wrote back to her wondering if anyone else like me had this kind of continual digital soundtrack running through their media players while tapping along on their keyboards and wristing red laser mouse pods. I thought it would be interesting to find out what other folks listen to when they write, headphoned or not. I also recommended a new income-generating idea for our little AEPL assemblage: a CD collection of greatest hits for writing, recommended by the usual galaxy of comp/rhet stars. Hey, Peter Elbow! What are you listening to? Cheryl Glenn? Raul Sanchez?

My preferences for writing, of course, are situational, just as they should be for any good rhetorician. As I’m writing this essay, I’m listening to “Ethiopiques, Vol. 4: Ethio Jazz & Musique Instrumentale, 1969-1974” by musician-arranger Mulatu Astatqe. My daughter sent it to me last year, and I ripped it immediately into my playlists. Other writing favorites in jazz include “Consummation” by the Thad Jones & Mel Lewis Orchestra, passed on to me by my neighbor Bill, Lionel Hampton’s “Mostly Ballads” and “Mostly Blues,” and some other favorites from the early ’70s: Keith Jarrett’s “The Köln Concert,” and “The Colours of Chlöe” by Eberhard Weber.

Here at my desk with the tangle of wires running from the scanner, printer, PDA cradle, and leftover Gateway 2000 speakers, I start off the day usually with something to get the blood moving, like Los Pregoneros Del Puerto and their traditional music of Veracruz, Paco de Lucia’s “Anthologia Vol. 1,” or that dobro-infused live double play by Alison Krauss and Union Station.

Or if I’m particularly stressed out and need to write and relax, I click on “Union” or “Devotion” by Rasa, R. Carlos Nakai’s “Cycles. Vol. 2,” or Clannad’s “Landmarks.”

But if I’m just chugging along during the day, I go to the old faithfuls: the soundtrack from Ken Burns’ “Lewis and Clark: The Journey of the Corps of Discovery,” Mary Chapin Carpenter’s “Stones in the Road,” Dylan’s “Blood on the Tracks,” some Puccini or Neil Young’s “Comes a Time.”

Given the slice-and-dice, randomized nature of iTunes and Napster, I realize that speaking of music in terms of albums is very old school, but the extended play of 50 to 60 minutes of tune after tune fits my writing rhythm pretty well. Once a playlist is over, I know it’s time to take a break, push away from my desk, stand up and lean back to stretch out my stiff back, wander out into the hallway of that other world, or walk downstairs and check my campus mailbox to see what junk I can toss into the recycling bins nearby.

When I was a longhaired college kid, I had Crosby, Stills, and Nash, Marvin Gaye, Cat Stevens, and Joni Mitchell in pretty much constant rotation on my scratchy stereo, one skewered vinyl dropping down on the next until it was time to flip the stack over again. In those days, I was listening for lyrics and rhyme as much as anything, thinking I was a writer in the company of writers who also happen to play music. These days I’m listening for melody and rhythm as much as anything, thinking I’m a writer in the company of musicians who also happen to keep me writing.

I guess I don’t know if a workshop on music and writing is such a good idea after all. Right now I’m thinking it would be just about as useful as any other workshop on the preferences folks have about writing: pencil vs. pen, medium vs. fine tip, black vs. blue, laptop vs. desktop, blank pad vs. college-ruled vs. yellow legal pad, at the desk vs. in bed, PC vs. Mac, Bach vs. Mozart. Seems all too personal, finicky, and idiosyncratic to me. Kind of like writing, if you know what I mean.

Laurence Musgrove

Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.

Hard Wordes in Plaine English

Longtime readers of Intellectual Affairs may recall that this column occasionally indulges in reference-book nerdery. So it was a pleasant but appropriate surprise when the Bodleian Library of the University of Oxford provided a copy of its new edition of the very first dictionary of the English language. It has been out of print for almost 400 years, and the Bodleian is now home to the one known copy of it to have survived.

Available now as The First English Dictionary, 1604 (distributed by the University of Chicago Press), the work was originally published under the title A Table Alphabeticall. It was compiled in the late 16th century by one Robert Cawdrey. The book did not bring him fame or fortune, but it went through at least two revised editions within a decade. That suggests there must have been a market for Cawdrey’s guide to what the title page called the “hard usuall English wordes” that readers sometimes encountered “in Scripture, Sermons, or elswhere.”

Cawdrey had the misfortune, unlike fellow lexicographer Samuel Johnson, of never meeting his Boswell. Yet he had an eventful career – enough to allow for a small field of Cawdrey studies. An interesting introduction by John Simpson, the chief editor of the Oxford English Dictionary, sums up what is known about Cawdrey and suggests ways in which his dictionary may contain echoes of his life and times.

At the risk of being overly present-minded, there’s a sense in which Cawdrey was a pioneer in dealing with the effects of his era’s information explosion. Thanks to the printing press, the English language was undergoing a kind of mutation in the 16th century.

New words began to circulate in the uncharted zone between common usage and the cosmopolitan lingo of sophisticated urbanites who traveled widely. Learned gentlemen were traveling to France and Italy and coming back “to powder their talk with over-sea language,” as Cawdrey noted. Some kinds of “academicke” language (glossed by Cawdrey as “of the sect of wise and learned men”) were gaining wider usage. And readers were encountering unfamiliar words like “crocodile” and “akekorn.” Cawdrey’s terse definitions of them as “beast” and “fruit,” respectively, suggest he probably had seen neither.

Booksellers had offered lexicons of ancient and foreign languages. And there were handbooks explaining the meaning of specialized jargon, such as that used by lawyers. But it was Cawdrey’s bright idea that you might need to be able to translate new-fangled English into a more familiar set of “plaine English words.”

Cawdrey also found himself in the position of needing to explain his operating system. “To profit by this Table,” as he informed the “gentle Reader” in a note, “thou must learn the Alphabet, to wit, the order of the Letters as they stand....and where every Letter standeth.” Furthermore, you really needed to have it down cold. A word beginning with the letters “ca,” he noted, would appear earlier than one starting with “cu.” After using the “Table” for a while, you probably got the hang of it.

Who was this orderly innovator? Cawdrey, born in the middle of England sometime in the final years of Henry VIII, seems not to have attended Oxford or Cambridge. But he was learned enough to teach and to preach, and came to enjoy the patronage of a minister to Queen Elizabeth. He married, and raised a brood of eight children. In a preface to the dictionary, Cawdrey acknowledges the assistance of “my sonne Thomas, who now is Schoolmaister in London.”

Cawdrey published volumes on religious instruction and on the proper way to run a household so that each person knew his or her proper place. He also compiled “A Treasurie or store-house of similies both pleasant, delightfull, and profitable, for all estates of men in generall.” (Such verbosity was quite typical of book titles at the time. The full title page for his dictionary runs to about two paragraphs.)

His chances for mobility and modest renown within the Elizabethan intelligentsia were severely limited, however, given his strong religious convictions. For Cawdrey was a Puritan – that is, someone convinced that too many of the old Roman Catholic ways still clung to the Church of England.

Curious whether "Puritan" (a neologism with controversial overtones) appeared in the dictionary, I looked it up. It isn’t there. But Cawdrey does have “purifie,” meaning “purge, scoure, or make cleane” -- which is soon followed by “putrifie, to waxe rotten, or corrupted as a sore.” By the 1580s, Cawdrey had both words very much in mind when he spoke from the pulpit. When he was called before church authorities, one of the complaints was that he had given a sermon in which he had “depraved the Book of Common Prayer, saying, That the same was a Vile Book and Fy upon it.” He was stripped of his position as minister.

But Cawdrey did not give up without a fight. He appealed the sentence, making almost two dozen trips to London to argue that it was invalid under church law. All to no avail. He ignored hints from well-placed friends that he might get his job back by at least seeming to go along with the authorities on some points. For that matter, he continued to sign his letters as if he were the legitimate pastor of his town.

No doubt Cawdrey retained a following within the Puritan underground, but he presumably had to go back to teaching to earn a living. Details about his final years are few. It isn’t even clear when Cawdrey died. He would have been approaching 70 when his dictionary appeared, and references in reprints of his books a few years later imply that they were revised posthumously.

In his introductory essay, John Simpson points out that the OED now lists 60,000 words that are known to have been in use in English around the year 1600. Cawdrey defines about 2,500 of them. “We should probably assume that he was unable to include as many words as he would have liked,” writes Simpson, “in order to keep his book within bounds. It was, after all, an exploratory venture.”

But that makes the selection all the more interesting. It gives you a notion of what counted as a “hard word” at the time. Most of them are familiar now from ordinary usage, though not always in quite the sense that Cawdrey indicates. He gives the meaning of “decision” as “cutting away,” for example. Tones of the preacher can be heard in his slightly puzzling definition of “curiositie” as “picked diligence, greater carefulnes, then is seemly or necessarie.”

Given his Puritan leanings, it is interesting to see that the word “libertine” has no specifically erotic overtones for Cawdrey. He defines it as applying to those “loose in religion, one that thinks he may doe as he listeth.” One of the longest entries is for “incest,” explained as “unlawfull copulation of man and woman within the degrees of kinred, or alliance, forbidden by Gods law, whether it be in marriage or otherwise.”

It is a commonplace of much recent scholarship that, prior to the mania for categorizing varieties of sexual desire that emerged in the 19th century, the word “sodomy” covered a wide range of non-procreative acts, heterosexual as well as homosexual. Cawdrey, it seems, didn’t get the memo. He defines “sodomitrie” as “when one man lyeth filthylie with another man.” Conversely, and rather more puzzlingly, he defines “buggerie” (which one might assume to be a slang term for a rather specific act) as “conjunction with one of the same kinde, or of men with beasts.”

In a few entries, one detects references to Cawdrey’s drawn-out legal struggle of the 1580s and '90s. He explains that a "rejoinder" is “a thing added afterwards, or is when the defendant maketh answere to the replication of the plaintife.” So a rejoinder is a response, perhaps, to “sophistikation” which Cawdrey defines as “a cavilling, deceitful speech.”

Especially pointed and poignant is the entry for “temporise,” meaning “to serve the time, or to follow the fashions and behaviour of the time.” Say what you will about Puritan crankiness, but Robert Cawdrey did not “temporise.”

Particularly interesting to note are entries hinting at how the “new information infrastructure” (circa 1600) was affecting language. The expense of producing and distributing literature was going down. “Literature,” by the way, is defined by Cawdrey here as “learning.” Cawdrey includes a bit of scholarly jargon, “abstract,” which he explains means “drawne away from another: a litle booke or volume prepared out of a greater.”

Some of the words starting to drift into the ken of ordinary readers were derived from Greek, such as “democracie, a common-wealth gouerned by the people” and “monopolie, a license that none shall buy and sell a thing, but one alone.” Likewise with terms from the learned art of rhetoric such as “metaphor,” defined as "similitude, or the putting over of a word from his proper and naturall signification, to a forraine or unproper signification.”

Cawdrey’s opening address “To the Reader” is a manifesto for the Puritan plain style. Anyone seeking “to speak publiquely before the ignorant people,” he insists, should “bee admonished that they never affect any strange inkhorne termes, but labour to speake so as is commonly received, and so as the most ignorant may well understand them.”

At the same time, some of the fancier words were catching on. The purpose of the dictionary was to fill in the gap between language that “Ladies, Gentlewomen, or any other unskilfull persons” might encounter in their reading and what they could readily understand. (At this point, one would certainly like to know whether Cawdrey taught his own three daughters how to read.) Apart from its importance to the history of lexicography, this pioneering reference work remains interesting as an early effort to strike a balance between innovation and accessibility in language use.

“Some men seek so far for outlandish English,” the old Puritan divine complains, “that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say.” Oh Robert Cawdrey, that thou shouldst be alive at this hour!

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

In Praise of Small Conferences

This last fall I attended the 2006 TYCA-West conference. It was held in beautiful Park City, Utah, in October. About 60 two-year college English faculty, graduate students, and even some university professors gathered to discuss the study and practice of teaching English. It’s hard to imagine a more beautiful setting than Park City in the fall. The crisp mountain air and the burnt orange and red scrub oak painting the surrounding mountains lend a….

What? You mean to tell me that you’ve never heard of TYCA-West? TYCA is the Two-Year College English Association, which is a group of the National Council of Teachers of English. It comprises seven regions, and each region holds an annual conference. TYCA-West is the regional organization that includes Utah, Idaho, Nevada, Arizona, Wyoming, and (who would have guessed) Hawaii.

Let’s be honest, as far as conferences go, it’s difficult to imagine a less prestigious conference than a regional two-year college English conference. You aren’t likely to rub shoulders with star scholars in the field. Nor will you encounter presentations that will help you sort through the talked-about new book or intellectual movement of the year. For that, go to MLA. I’m not against the big conference. But I’ve come to appreciate the strengths of the small conference, and for professors dedicated to teaching, regional conferences may in fact be more valuable and more rewarding than higher-profile conferences.

At the TYCA-West conference, we tend to focus on practical issues associated with teaching English. This last year, our keynote address was by Sharon Mitchler, the past TYCA-National chair. She addressed the larger economic and demographic trends associated with teaching English in the two-year college. I learned, for instance, that two-year colleges “teach an estimated 50 percent of all college-level composition and 70 percent of all developmental composition courses,” and I learned that “college participation rates among low-income students peaked in 1998 and have been falling since then.” Mitchler’s presentation had a refreshingly empirical cast, something I’m not accustomed to at humanities conferences. But she effectively embedded those facts within a larger argument about how these trends will ultimately determine what we do in the classroom, whether we realize it or not.

At the 2005 TYCA-West conference, we were treated to an excellent presentation by Kathleen Blake Yancey on the changing nature of literacy. It was followed by an engaging and pleasingly cant-free discussion about what we’re currently experiencing in the classroom. Many of the challenges associated with teaching writing persist. Instructors shared stories about how difficult it is to get students to become critical readers and writers. Many instructors, however, pointed to newer trends in writing instruction, like service learning, which offer students more authentic scenarios of composition.

I’ve also formed lasting friendships at TYCA-West. Since becoming involved in TYCA-West, I now know and correspond with faculty members from each of the states within my region, from places like Yavapai College of Arizona, Community College of Southern Nevada, Western Wyoming Community College, and Dixie State College. We share an identity as two-year college English faculty, joined in a common enterprise. As faculty members who share similar economic and demographic challenges, we have also formed a regional identity, something not typically encouraged by the larger conferences. I feel like I have developed an authentic network through my experience at TYCA-West. From Jeff Andelora, who teaches at Mesa Community College, I’ve learned about the history of community college English. From Bradley Waltman at the Community College of Southern Nevada, I’ve learned about the challenges associated with placing students in writing courses.

Here’s what you won’t find at TYCA-West or most other smaller, regional conferences. You won’t be subjected to the name-badge-glance-and-turn, a move I’ve always for some reason viewed as akin to a basketball player’s expert pivot. (If only the Utah Jazz center could pivot like that.) Instead, you will encounter colleagues at peer institutions genuinely interested to meet you and hear what you have to say.

Neither will you attend presentations obviously constructed for the sole purpose of CV fodder. No counterintuitive readings of canonical texts that strain credulity. No impotent counter-hegemonic posturing. Presentations tilt toward the practical rather than the theoretical. Believe it or not, two-year college English professors are interested in theory, but we typically put it in the service of practical considerations. In my experience, you are more likely to hear what Joseph Williams called the “So what?” question at smaller conferences. Taken together, the presentations at our TYCA conferences soberly address the perennial challenge of how we get our students to become more effective writers and readers.

Finally, regional conferences are cheap. I briefly considered attending this year’s Conference on College Composition and Communication in New York City. But rooms at the conference hotel are $300 a night and the flight would have cost me about $500 round trip. The total cost of the conference would have easily exceeded $1,500 and, though I am lucky enough to get support from my college to attend conferences, I decided that it just wasn’t worth it. For those faculty members who receive little or no support from their institutions, this year’s 4Cs conference is probably out of reach.

In contrast, let me present, Thoreau-like, the costs of my 2005 TYCA-West conference in Prescott, Arizona:

  • Travel $230 (round trip to Phoenix plus a shuttle to and from Prescott).
  • Hotel (shared room with a colleague) $75.
  • Conference Registration $140 (included breakfast and lunch on both days of the conference).
  • Food $75 (including a beer and scrumptious burger at The Saloon, which has a wall-sized painting of Steve McQueen worth seeing).

For around $500 I enjoyed a conference where I connected with professors from the region, went to Prescott for the first time -- a beautiful little college town in the mountains northwest of Phoenix -- and learned a little more about how to become a more effective English teacher. The 2006 TYCA-West conference in Park City was a 30-minute drive from my house.

Regional organizations can languish, though. Anyone who has been involved in the organization and promotion of a regional conference can tell you that it’s sometimes difficult to generate interest and attendance. Because the large, national conferences exert such a big influence over the discipline, it is often a challenge to persuade professors that small conferences are worth their time. After all, what will a presentation at TYCA-West do for your CV? But I am excited about next year’s TYCA-West conference in Las Vegas. (I suggested we adopt the line, “What happens at TYCA-West stays at TYCA-West,” in order to generate greater participation.)

Large conferences will always be important, and I still plan on attending them. But the academic work done by many college professors happens primarily in the classroom. The small conference provides an ideal forum for them to share this important work.

Jason Pickavance

Jason Pickavance is an instructor in the English department at Salt Lake Community College, where he teaches courses in writing and American literature.

The Eternal Sophomore

“It has been my experience with literary critics and academics in this country,” wrote Kurt Vonnegut in an essay published in 1981, “that clarity looks a lot like laziness and childishness and cheapness to them. Any idea which can be grasped immediately is for them, by definition, something they knew all the time. So it is with literary experimentation, too. If a literary experiment works like a dream, is easy to read and enjoy, the experimenter is a hack. The only way to get full credit as a fearless experimenter is to fail and fail.”

The anger in that statement had been building up for at least a couple of decades. Much of Vonnegut’s early work was classified as science fiction – a filing-cabinet drawer that, as he once put it, academics tended to confuse with a urinal. He was later discovered by people who didn't read science fiction, and most of his books stayed in print. But that just meant he had failed to fail, so the charge of being a hack was still in the air.

In some respects, though, his complaints were already out of date when he made them; for by the early 1980s, there was already a scholarly industry in Vonnegut criticism. It now runs to some three dozen books, not to mention more journal articles than anyone would want to count.

During the original wave of speculation on postmodernism during the 1960s and early ‘70s – when that notion was relatively untheoretical, a label applied to emergent literary tendencies more than the name for some vast cultural problematic – it was very often the work of Kurt Vonnegut that people had in mind as an exemplary instance. Parataxis, metafiction, blurring of the distinction between mass-culture genres and modernistic formal experimentation -- all of this, you found in Vonnegut. His novels were chemically pure samples of the postmodern condition.

And then came the definitive moment documenting Vonnegut’s place in the literary curriculum: the film "Back to School" (1986), in which the author had a cameo role.

In that landmark work, as you may recall, Rodney Dangerfield played Thornton Melon, a millionaire who returns to college for the educational opportunities involved in partying with coeds in bikinis. When an English professor assigns a paper on Vonnegut’s fiction, Dangerfield hires the novelist himself to write the analysis. The paper receives a failing grade. (Someone in Hollywood must be a fan of Northrop Frye, who once said that whatever else one might say about Wordsworth’s preface to the Lyrical Ballads, as a piece of Wordsworth criticism it only merited a B plus.)

Given such clear evidence of canonization, it was a surprise to notice that a couple of friends responded to the news of Vonnegut’s death last week with slightly embarrassed sadness. Both are graduate students in the humanities. One called his novels a “guilty pleasure.” Another mentioned how much Vonnegut’s work had meant to him “even if he’s not considered that great or serious a writer.”

I suspect that such feelings about Vonnegut are pretty widespread -- that the shelves of secondary literature don’t really quell a certain ambivalence among readers who feel both deep affection for his work and a keen nervousness about his cultural status. Unfortunately, Vonnegut did not make things any easier by publishing so many novels that devolved into self-parody. If he had quit after Cat’s Cradle and Slaughterhouse Five, the ratio of wheat to chaff in his fiction would be much more favorable.

But the ambivalence itself is not, I think, a response to the uneven quality of his work -- nor even the product of some misguided notion that a funny author can’t be taken seriously. Rather, the problem may be that Vonnegut is an author one tends to discover in adolescence. Defensiveness about the attachment one feels to his work is, in part, a matter of wanting to protect the part of oneself that seemed to come into being upon first reading him. “I deal with sophomoric questions that full adults regard as settled,” he told an interviewer once.

He had, for example, a large capacity for facing brute contingency as part of human existence. A great deal of life is chance. (The fact that you were born, for example. Think how arbitrary that is.) And much of the rest of life consists of learning to evade that truth – walling it off, away from consciousness, because otherwise the reality of it would be too hard to fathom. Instead, we throw ourselves into fictions of power and belonging: nationalism, militarism, religion, the acquisition of cool stuff. These are ways to contain both the vulnerability before chance and the terrors of loneliness. In Vonnegut’s understanding of the world, loneliness is a fundamental part of human experience that became much, much worse in the United States, somehow, during the second half of the twentieth century – with no particular reason to think it will get better anytime soon.

As contributions to the cultural history of mankind, such thoughts are pretty small beer. On the other hand, just try to escape their implications. To call a point simple is the cheapest and least effective means of gainsaying it.

On Monday, at about the time I sat writing that paragraph about chance and terror and helplessness, someone was walking around a university campus shooting people at random. This was a coincidence. It was chance. That thought is no comfort. As one of the Tralfamadorians says in Slaughterhouse Five: “Well, here we are, Mr. Pilgrim, trapped in the amber of this moment. There is no why.” So it goes.

Vonnegut (who once called himself “a Christ-worshiping agnostic”) drew from the ground truth of existential terror a moral conclusion that it made sense to try to love your neighbor as yourself – or at least to treat other people with radical decency. This sounds simplistic until you actually try doing it.

He was a socialist in the old Midwestern tradition best expressed in a famous statement by Eugene Debs that went: "Years ago I recognized my kinship with all living beings, and I made up my mind that I was not one bit better than the meanest on earth. I said then, and I say now, that while there is a lower class, I am in it, and while there is a criminal element I am of it, and while there is a soul in prison, I am not free." Quoting that was about as close to a theoretical statement as Vonnegut ever got. The rest of his outlook he regarded as common sense.

“Everything I believe,” he said, “I was taught in junior civics during the Great Depression – at School 43 in Indianapolis, with full approval of the school board. School 43 wasn’t a radical school. America was an idealistic, pacifistic nation at that time. I was taught in the sixth grade to be proud that we had a standing Army of just over a hundred thousand men and that the generals had nothing to say about what was done in Washington. I was taught to be proud of that and to pity Europe for having more than a million men under arms and spending all their money on airplanes and tanks. I simply never unlearned junior civics. I still believe in it. I got a very good grade.”

Someone with such attitudes must necessarily be an anachronism, of course, and anachronisms tend to be either funny or sad. His books, at their best, were both. A few of them will survive because they hold those qualities in such beautiful proportion. “Laughter,” as Vonnegut once put it, “is a response to frustration, just as tears are, and it solves nothing, just as tears solve nothing. Laughter or crying is what a human being does when there’s nothing else he can do.”

Scott McLemee

