The other day, I received an e-mail from a professor at another university who had read a recent article I had written on student writing. He asked me if I could provide some advice to a committee he was serving on that was, as he put it, “looking into the problem of student writing.” He listed a number of questions that he asked me to respond to, including:
What is your perception of student writing?
Does it involve poor understanding of style, grammar, usage, and syntax?
Does it involve the lack of understanding of the form and procedures for producing research papers?
Does it involve the inability to put down coherent and clear thoughts throughout a paper?
Does it involve a lack of understanding of the concentrated thought process required to produce good writing?
Is the problem simply that they are not readers and have little grammar and writing training?
What kinds of programs do you know about that have dealt successfully with the problem?
In response, I wrote,
“I think there is a larger issue that you may not have noted in your series of questions, and that is the notion that student writing in the university is primarily prompted and evaluated by faculty. By that I mean, if all of our questions point to what students can and can't do, we forget to focus on what faculty can and can't do. That is, if student writing is always a response to faculty assignments and faculty evaluation, to examine student writing alone misses quite a bit of the overall problem that exists at most universities. The answer to most of your questions is, ‘Yes, many students at the college level do struggle with form and content.’”
“But I would also say that it's important for us to understand that we get the students we get, and they come with a wide range of skills and attitudes and experiences in writing. It's our job to help them continue to develop as writers and provide the right kinds of assignments and assessments that help them on that path.”
“I believe that most professors aren't trained to design effective writing assignments or to evaluate their students’ writing fairly. In other words, most student writing problems identified by faculty are caused by faculty. Sloppy assignments and grading policies lead to sloppy student writing.”
“So I would say the long and short of it is that the most effective way to improve student writing is to improve faculty performance.”
I realize this sort of response may fall on deaf ears or anger professors who believe their students are not adequately prepared. Nevertheless, I find it curious that those who pose questions about “the problem of student writing” fail to consider that their students might serve as a source of knowledge about how to solve the problem.
Several years ago, in an effort to get an alternative view, I asked students enrolled in my “Teaching Writing” course, a mix of both graduates and undergraduates, to provide their perspectives on writing assignments and to make a list of rights they believed they should be granted when it came to writing assigned by their instructors.
Many of these students were eager to discuss the issue of rights in this context because they felt they had often received poorly designed assignments and had been graded unfairly. And they didn’t want to treat their future students in the same way. They wanted to be better English teachers than those they had encountered. In short, they wanted to have better relationships with their students, and they understood the power writing assignments had to promote or inhibit those relationships.
After a bit of class discussion and informal writing, I stood ready to record their ideas on the board. They created a list of two dozen student writer rights, and as we reviewed and organized the list, we discovered that it fell into four main categories: rights related to assignments, the writing process, evaluation, and ownership.
Of these four types of student writer rights, the clear majority -- one half of the overall total -- were “assignment rights.” My students believed they had the right:
1. To know the writing workload for the term.
2. To assignments relevant to the course.
3. To understand how writing assignments would meet course objectives.
4. To assignments in writing.
5. To ask questions about assignments.
6. To clear explanations of assignments.
7. To evaluation criteria with assignments.
8. To models of effective response.
9. To adequate time to complete assignments.
10. To clearly outlined assignments.
11. To assignments that would not be modified by the instructor after students had already begun writing.
12. To write for real audiences.
In regard to the writing process, they wanted the right to individual conferences with their instructors on works-in-progress, the right to revise the first paper to better understand the instructor’s expectations, and the right to revise without penalty.
“Evaluation rights” included the right to clear policies on grading and late work, the right to objective evaluation, the right to evaluation not based upon the best performance in the class, the right to evaluation and response based upon each writer’s individual needs, the right to question the instructor about the grade received, the right to appeal a grade, and the right to have their work graded and returned promptly.
As for ownership, they believed that they had intellectual property rights over their work and the right to have it kept private.
Writing this long list of rights on the board, I was surprised at their sophisticated understanding of writing assignments. They knew just how dependent they were on good teachers for good writing experiences. The tone of our discussion also revealed how much resentment they felt about being subject to the whims of professors and their lack of knowledge about what students needed to succeed as writers.
Because most college students understand just how powerless they are in the classroom, it is unlikely that they will rise up and demand these rights. Nevertheless, I believe this list offers us powerful information on how we can better respond to the problem of student writing. We should take these rights seriously, try them on for size, and consider how our assignments foster or interfere with our students’ chances of success. I know that doing so has dramatically improved the quality of my assignments and the work I get from my students.
In all of this talk about student writer rights, one might wonder if there should also be some attention paid to a writer’s responsibilities. What responsibilities could we point to that aren’t already in this list of rights? Well, those usually show up as admonitions in most assignments anyway, such as don’t plagiarize, proofread carefully, and provide support for your claims.
Still, I believe that when my students developed this list of rights, they were also asking for the right to know what their responsibilities were. They were asking for a kind of liberation, to be free to understand what is necessary and possible. They wanted to have better writing relationships with their professors, honorable relationships characterized by clear expectations, a fair reading, and respect for the complicated and time-consuming work of the individual writer.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
Working as an archival assistant at the Library of Congress about a dozen years ago, I had the memorable and never-to-be-repeated experience of discovering a letter by Thomas Pynchon. It was written early in his career, when his aversion to the public spotlight was known only to friends -- rather than being, as it is now, a somewhat paradoxical claim to fame.
Pynchon has never given an interview. The most widely used portrait of him is taken from a school yearbook. (He was a member of the Class of 1953 at Oyster Bay High, on Long Island.) The biographical note accompanying Against the Day -- his sixth novel, to be published next week by Penguin -- lists only the titles of his earlier books and the fact that one of them, Gravity's Rainbow, won the National Book Award in 1974.
The new novel itself is long (not quite 1,100 pages) and dense, sometimes brilliant and sometimes tiresome, and occasionally very silly (the cameo appearance, for example, by Elmer Fudd). It is also remarkably resistant to capsule summary. Oh, what the hell. Here goes anyway: Against the Day is a historical novel about the secret relationship among dynamite, photography, and multidimensional vector spaces that treats the emergence of the 20th-century Zeitgeist from a clash between revolutionary anarchism and the plutocratic Establishment. See?
To discuss the book adequately would demand a seminar lasting four months, which is also the ideal period required for reading the book -- instead of the four days it took one reviewer, who then promptly had a mild nervous breakdown. Something about Pynchon's work incites academic commentary. At least four scholarly books have already been devoted to his last novel, Mason & Dixon (1997). Even someone who enjoys him without feeling the itch to exegesis will probably feel driven, at some point, to do supplemental reading. Partway through “Against the Day,” for example, I found it urgent to go read an encyclopedia article on the history of theories regarding ether, the substance once thought to permeate even "empty" space.
Pynchon doesn't simply drop references to (now-discredited) scientific concepts. Rather, he builds them into the imaginative architecture of his work. While reading his novels, some part of one's attention is inevitably kept busy drawing up a list of remedial reading assignments.
With Pynchon, then, we have a unique combination. He writes maximalist fiction -- each page covered in the stuff of his supersaturated brain -- while maintaining a minimalist public profile. That is no small trick, given a culture industry constantly driven to manufacture celebrities out of practically nothing. (Pynchon himself has pushed the paradox a little further by lending his voice to "The Simpsons," in a bit available here.)
The situation has had some curious effects, even among academics interested in his work. Maybe especially among them.
In the early 1990s, the story began circulating that Pynchon had published a large number of letters in a small-town newspaper in northern California under the pseudonym of Wanda Tinasky, a homeless and perhaps mildly deranged old woman with a strange sense of humor. Learned people argued about Pynchon's possible authorship with great passion and total seriousness. Covering the debate for Lingua Franca, I spoke with one estimable literary scholar who did not so much answer my questions as deliver a formal and exactly worded declaration, as carefully prepared as an official diplomatic statement about nuclear testing in North Korea.
It was all plenty strange. The debate over authorship turned out to be a lot more interesting than the letters themselves, which were eventually published as a book. While there must be a few die-hard Tinasky-ites still around, the matter is now largely forgotten. (For an update, check out this interesting Wikipedia article.)
But one small detail from the debate, mentioned almost in passing by someone I interviewed, has stuck in my mind over the years. It seems that Pynchon, while living in California in the 1980s, had a driver's license. Not such a big deal, in itself, of course. But researchers knew this because one of them had acquired a copy of it (through what sounded like rather dubious means) from the database of the Department of Motor Vehicles.
As one of the "Proverbs for Paranoids" in Gravity's Rainbow says, "You hide, they seek."
As it happens, I was not actually seeking Pynchon when I came across an actual letter by him, sometime around 1993. It seems to have gone undiscussed in the secondary literature in the meantime. Consider the following, then, a modest contribution to the collective enterprise of Pynchon scholarship and/or stalking.
The discovery occurred while I was processing a collection for the Manuscript Division of the Library of Congress. Archival processing is an activity often best described as sorting dead people's mail. In this case, that was quite literally true. The deceased was Stanley Edgar Hyman, a contributor to The New Yorker and a professor of English at Bennington College, who died in 1970.
He had been married to Shirley Jackson, of "The Lottery" fame. Chances are, the library accepted his papers mainly to get hers. A senior archivist concentrated on organizing Jackson's manuscripts and scrapbooks. But I was more than content to get the job of going through the boxes of her husband's literary remains.
While not particularly well-remembered now, Hyman occupied an interesting place in American cultural history. His book The Armed Vision: A Study in the Methods of Modern Literary Criticism (1948) provided one of the standard postwar surveys of critical theory. (Irving Howe once compared reading it to taking the elevator at Macy's department store: "First floor, symbols. Second floor, myths (rituals to the rear on your right). Third floor, ambiguities and paradoxes....")
Hyman had been a student of the sui generis cultural theorist Kenneth Burke. And both were, in turn, friends of Ralph Ellison, well before the novelist published Invisible Man in 1952. All of them had spent time at Bennington, drinking hard. Then they went home and wrote him fantastic letters. My job was to sort them, of course, not read them. But, well....
Then one day, in a mass of miscellaneous items, there turned up a short letter of two paragraphs, typed on a piece of graph paper. It was dated 8 December 1965, and was signed "yours truly, Thomas Pynchon." At that point, he had published a handful of short fiction and one novel, V., which Hyman had reviewed, favorably, when it appeared in 1963.
Eyes wide, I read as Pynchon turned down the "flattering and attractive" offer to come teach at Bennington. He did so, he said, "with much pain, don't ask where" -- explaining that he had resolved, two or three years earlier, to write three novels all at the same time. Pynchon hinted that it was not going well, and called the decision "a moment of temporary insanity." But he also said he was "too stubborn to let any of them go, let alone all of them," and thought that teaching would distract him, "given the personal limitations involved." He thanked Hyman for the invitation, and also praised Hyman's analysis of V. as "criticism at its best."
It was modest. It was polite. A few months later, he published The Crying of Lot 49 -- maybe one of the novels driving him to distraction, maybe not. Unfortunately there were no more letters from Pynchon in the collection -- nor did this one provide any indication where he was when he wrote it. (The return address he gave was that of his literary agent in New York.)
As a clue to the mystery of Thomas Pynchon, then, it seems like a pretty small thing. I hadn't thought about it at all in a long time, in fact -- until a stray reference in his new book brought it back to mind.
As readers will soon be able to see for themselves, Against the Day certainly feels like the work of a man writing two or three novels at the same time. Whole dissertations will be written about how the different parts and layers create a consciousness-bending structure in four-dimensional spacetime. But it was the passing mention of a two-dimensional surface that gave me a slightly déjà vu-like feeling. At one point, a character reaches for "a block of paper quadrilled into quarter-inch squares."
Graph paper, that is, exactly like the kind Pynchon used for his letter. More than a coincidence, but less than meaningful? Like Oedipa Maas at the end of Lot 49, I'm really not sure.
The guy featured on the poster had a long white beard and dark black sunglasses, the kind worn by people too cool for any room they might ever enter. At first it looked like he might be the guitar player for ZZ Top. But on closer examination you saw that the event being advertised was not a rock concert but, rather, a "transdisciplinary celebration" called "Why Melville Matters Now.” The man behind those shades was the creator of tortured souls like Ishmael and Bartleby. And the homage to him in Albany this past weekend was designed to make him a local celebrity.
A transdisciplinary celebration is like an academic conference, only different. This one was organized with the support of the State University of New York at Albany -- especially its Center for Humanities, Arts, and TechnoSciences. But the gathering itself took place elsewhere, at the Albany Academy, a private school attended by Melville himself in the 1830s; and the event was open to the general public. Parents and alumni attended, as well as scholars giving papers. It soon became clear that the entire Albany Academy had undergone a recent bout of systematic Melville mania. A bulletin board outside the kindergarten and first grade classrooms showed a school of colorful construction-paper fish, beneath the words “Where’s Moby?” (I must admit that I could not find Moby.)
Herman Melville: great American novelist or great American hipster? Well, it isn’t an either/or kind of situation. Rereading Moby Dick for the first time in ages (now minus the English major’s mental tic of obsessing over how each little part fit into a vast symbolic architecture), I recently underwent the astonishing revelation that Melville (1) definitely has a sense of humor, (2) pretty much invented the postmodern “maximalist” novel of the sort we now associate with Thomas Pynchon, and (3) is so overtly gay and so stridently multiculturalist that Fox News should probably look into how he ever got into the canon.
You don’t have to interpret Melville to make him seem contemporary. He is way ahead of you on that score -- even in ways that can prove somewhat troubling to consider. (Since 9/11, we have gone through a serious bout of Ahab-ism, finding purpose and meaning in the prospect of vengeance, which is the sort of thing that tends not to end well.)
But getting people together to discuss him is no small trick. One of the chief organizers of the gathering, Mary Valentis, an associate professor of English at SUNY-Albany, told me that she gave equal weight to two major phases of its preparation. The first was sending out the call for papers, then sorting through the responses to create a program. The other was doing as much as possible to make local people aware of the gathering -- a matter of getting publicity on radio, television, and in the newspapers.
For that, it helped to have things on the schedule other than papers on Melville and cognitive science, or the theological subtext of Benito Cereno. There were art installations, a dance recital, and a 24-hour marathon reading of Moby Dick. The latter event would feature Andy Rooney reading the novel's final chapter and epilogue.
Rooney had a certain amount of drawing power, of course. Apart from being on “60 Minutes,” he is a member of the Albany Academy’s class of ‘39. But it appeared that the single biggest turnout was for the keynote address by Andrew Delbanco, a professor of humanities at Columbia University and the author of a recent biography of Melville.
Any time a scholar can successfully compete with a TV curmudgeon, it seems like a good thing.
Delbanco provided an overview of Melville’s life and work, and discussed his posthumous emergence as an iconic figure in American literature. He also noted that every town seems to have at least one restaurant or bar named after Moby Dick. (In my neighborhood, it’s the Moby Dick House of Kabob.) It was a good example of a talk that could serve as an introduction to Melville for a complete novice while also holding the attention of someone who had read around in the secondary literature. Not the sort of thing you hear very often, alas. The 150 or so listeners, ranging from high school students to full professors, seemed to appreciate the effort.
But it left me wondering why there weren’t more people in the audience. Despite Moby-themed eateries, it seemed as if there might still be some barrier to wider interest in Melville. After all, an event devoted to Edgar Allan Poe would probably have drawn a larger turnout.
“I guess I would say that Poe might draw better,” Delbanco responded, “because he is a writer whom one encounters in childhood and, perhaps, because of the melodrama of his life -- I heard that Sly Stallone was thinking of making a movie about him -- while Melville is up against the general decline in serious reading.... Is there any demanding writer from the past who would bring out a bigger, more various audience? Henry James has a certain currency because of the Merchant-Ivory movies, but I don't think he'd pack 'em in even if Isabel Archer was from Albany.”
Fair enough. And in any case, I did get a glimpse of a potential way of building up non-academic interest in Melville later, while talking with Patricia Spence Rudden and Jane Mushabac, both of them professors of English at the New York City College of Technology, which is part of the City University of New York.
We had been having an entertaining ramble of a conversation at dinner, covering teaching loads, the scholarship on women in rock music (the subject of a book Rudden is editing), and Melville’s sense of humor (which it was good to learn was not just my imagination, since Mushabac has written a monograph on the subject). At some point, one of them said: “They keep trying to make him a New Englander, but they can’t have him!”
Huh? They filled me in on the argument over whether New York City or New Bedford, Mass., gets to lay claim to Melville. Neither side, it seems, is much impressed by the Albany claim. (Still, it’s worth noting that a letter from Melville’s father described him as being, at age 7, “of the true Albany stamp.”) The dispute is now confined to scholarly circles, for the most part. But it seems like the kind of thing that could be transformed into a full-scale rivalry among the cities, complete with local reading clubs, public lectures and debates, and a certain amount of trash-talking.
Well, it’s an idea anyway. And the flow of benefits between scholars and the public might be a two-way transaction.
One paper, “‘Hideous Progeny’: The Monstrous, Monomaniacal, and Gothic Themes of Mary Shelley’s Frankenstein as Echoed in Herman Melville’s Moby Dick,” stirred up some interesting discussion afterward -- including a question by an audience member from Salem who preferred to emphasize the influence of Nathaniel Hawthorne. (A little of that territorial imperative going on, maybe.)
The author of the paper, Phil Purser, is a graduate student in English at the University of West Georgia, and he fielded the question well. The influence of Hawthorne on Melville is a standard topic in the scholarship. But for that very reason, it’s not the sort of thing one expects to have to sum up right after presenting an analysis of the relationship between the White Whale and Victor Frankenstein’s Creature.
Afterwards, Purser told me that he had just driven 22 hours from Carrollton, Georgia to attend the event. He expected it to be like other conferences he had attended -- the usual mixture of professors and graduate students. (Which also means, often enough, “questions” for which any possible answer is a minor distraction from the questioner’s performance of a professionalized identity.) The mixed nature of the event took him by surprise. He had to interact simultaneously with other scholars and non-specialist members of the public. “I was asked difficult questions,” he told me, “and I was on some level surprised at how I answered them.... This was definitely not a conference of one-upsmanship in which scholars vie for the spotlight; it was encouraging and intellectually invigorating.”
Take that, New Bedford! (For a full list of the panels and non-scholarly sessions that made up “Why Melville Matters Now,” check out its Web site.)
Keeping a commonplace book -- a notebook for copying out the striking passages you’ve come across while reading -- was once a fairly standard practice, not just among professional scholars but for anyone who (as the expression went) “had humane letters.” Some people still do it, though the very idea of creating your own customized, hand-written anthology does seem almost self-consciously old-fashioned now. Then again, that may be looking at things the wrong way. When John Locke circulated his New Method of a Common-Place Book in the late 17th century, he wasn’t offering Martha Stewart-like tips on how to be genteel. He had come up with a system of streamlined indexing and text-retrieval -- a way to convert the commonplace book into a piece of low-tech software for smart people on the go.
There is a fairly direct line running from Locke’s efficiency-enhancing techniques to The Yale Book of Quotations, a handsome and well-indexed compilation just issued by Yale University Press. That line runs through the work of John Bartlett, the prodigious American bookworm whose recall of passages from literature made him semi-famous in Cambridge, Mass., even before he published a small collection of Familiar Quotations in 1855. He included more and more material from his own commonplace book in later editions, so that the book grew to doorstop size. I don’t know whether Bartlett had read Locke’s essay. But he did index the book in a manner the philosopher would have found agreeable.
Since his death in 1905, “Bartlett’s” has become almost synonymous with the genre of the quotation collection itself -- a degree of modest immortality that might have surprised him. (Chances are, he expected to be remembered for the fact that his friend James Russell Lowell once published a poem about Bartlett’s skill as a fisherman.)
The new Yale collection follows Bartlett’s example, both in its indexing and in sheer heft. It is not just a compilation but a work of scholarship. The editor, Fred R. Shapiro, is an associate librarian and lecturer in legal research at the Yale Law School; and his edition of The Oxford Dictionary of American Legal Quotations is well-regarded by both lawyers and reference librarians. In The Yale Book of Quotations, he proves even more diligent than Bartlett was about finding the exact origins and wording of familiar quotations.
The classic line from Voltaire that runs “I disapprove of what you say, but I will defend to the death your right to say it” does not appear among the selections from Voltaire, for the simple reason that he never actually said it. (According to an article appearing in the November 1943 issue of Modern Language Notes, it was actually coined by one of Voltaire's biographers, S. G. Tallentyre.) Shapiro finds that the principle later known as “Murphy’s Law” was actually formulated by George Orwell in 1941. (“If there is a wrong thing to do,” wrote Orwell, “it will be done, infallibly. One has come to believe in that as if it were a law of nature.”)
In his posthumously published autobiography, Mark Twain attributed the phrase “lies, damned lies, and statistics” to Benjamin Disraeli. But the saying has long been credited to Twain himself, in the absence of any evidence that Disraeli actually said it. Thanks to the digitized editions of old newspapers, however, Shapiro finds it attributed to the former British prime minister in 1895, almost 30 years before Twain’s book was published.
It turns out that Clare Boothe Luce’s most famous quip, “No good deed goes unpunished,” first recorded in 1957, was actually attributed to Walter Winchell 15 years earlier. And as Shapiro notes, there is evidence to suggest that it had been a proverb even before that. Likewise, it was not Liberace who coined the phrase “crying all the way to the bank” but rather, again, Winchell. (Oddly enough, the gossip columnist -- a writer as colorful as he was callous -- does not get his own entry.)
The historical notes in small type -- elaborating on sources and parallels, and sometimes cross-referencing other quotations within the volume -- make this a really useful reference work. It is also a profitable (or at least entertaining) way to procrastinate.
At the same time, it is a book that would have bewildered John Bartlett -- and not simply because it places less emphasis on classic literature than commonplace-keepers once did. The editor has drawn on a much wider range of sources than any other volume of quotations I’ve come across, including film, television, popular songs, common sayings, and promotional catchphrases. Many of the choices are smart, or at least understandable. The mass media, after all, serve as the shared culture of our contemporary Global Village, as Marshall McLuhan used to say.
But many of the entries are inexplicable -- and some of them are just junk. What possible value is there to a selection of 140 advertising slogans (“There’s something about an Aqua Velva man”) or 90 television catchphrases (“This is CNN”)? The entry for Pedro Almodóvar, the Spanish director, consists entirely of the title of one of his films, Women on the Verge of a Nervous Breakdown. Why bother?
A case might be made for including the “Space, the final frontier...” soliloquy from the opening of Star Trek, as Shapiro does in the entry for Gene Roddenberry. He also cross-references it to a quotation from 1958 by the late James R. Killian, then-president of MIT, who defined space exploration as a matter of “the thrust of curiosity that leads me to try to go where no man has gone before.” So far, so good. But why also include the slightly different wordings used in the openings to The Wrath of Khan and Star Trek: The Next Generation?
The fact that quotations from Mae West run to more than one and a half pages is not a problem. They are genuinely witty and memorable. (e.g., “Between two evils, I always pick the one I’ve never tried before.”) But how is it that the juvenile lyrics of Alanis Morissette merit nearly as much space as the entry for Homer? (The one from Greece, I mean, not from Springfield.)
It is hard to know what to make of some of these editorial decisions. It’s as if Shapiro had included, on principle, a certain amount of the static and babble that fills the head of anyone tuned into the contemporary culture -- “quotations” just slightly more meaningful than the prevailing media noise (and perhaps not even that).
But another sense of culture prevailed in Bartlett’s day -- one that Matthew Arnold summed up as a matter of “getting to know, on all the matters that concern us, the best which has been thought and said in the world.” That doesn’t mean excluding popular culture. The lines here from Billie Holiday, Bob Dylan, and "The Simpsons" are all worth the space they fill. But the same is not true of “Plop plop, fizz fizz, oh what a relief it is.”
All such griping aside, The Yale Book of Quotations is an absorbing reminder that all one’s best observations were originally made by someone else. And it includes a passage from Dorothy Sayers explaining how to benefit from this: “I always have a quotation for everything,” she wrote. “It saves original thinking.”
I had considered suggesting that it might make a good present for Christmas, Hanukkah, Festivus, etc. According to the publisher’s Web site, the first printing is already sold out. It is available in bookstores, however, and also from some online booksellers. Here’s hoping it goes through many editions -- so that Shapiro will get a chance to recognize that Eminem’s considerable verbal skills do not translate well into cold type.
In May 2002, Stephen Greenblatt, then president of the Modern Language Association, wrote a letter on behalf of his colleagues on the Executive Council that reverberated throughout departments of English and foreign languages. Drawing on conversations with university press editors and the members of the MLA Ad Hoc Committee on the Future of Scholarly Publishing (whose report was released later that year), Greenblatt noted that “university presses, which in the past brought out the vast majority of scholarly books, are cutting back on the publication of works in some areas of language and literature” and that “certain presses have eliminated editorial positions in our disciplines.” As a result, Greenblatt warned, junior faculty members whose departments require a book for tenure and promotion might be at risk, due not to any shortcoming in their scholarship but to a “systemic” crisis. “Their careers are in jeopardy, and higher education stands to lose, or at least severely to damage, a generation of young scholars.”
Greenblatt’s letter circulated widely in the profession. Within the year, the Committee on Institutional Cooperation, an association that includes Big Ten universities, decided that there was, in fact, no crisis in scholarly publishing. But university press directors continued to insist that their budgets were being trimmed, that university library purchases were down, and that they were compelled to publish cookbooks, or books about regional flora and fauna, to absorb the losses associated with your average scholarly monograph. Meanwhile, junior faculty members became even more worried about their prospects for tenure, while a few opportunistic departments took the occasion of the Greenblatt letter to raise the quantitative standards for scholarly production, on the grounds that if the monograph was the “gold standard” for tenure and promotion at major research universities, then clearly the way to clamber up the rankings was to demand more books from young faculty members.
For the next few years, debate spun off in a variety of directions. Greenblatt had mentioned the possibility that universities might provide “a first-book subvention, comparable to (though vastly less expensive than) the start-up subvention for scientists.” My own institution, Penn State, had a mixed reaction: When, as a newly elected member of the MLA Executive Council, I discussed the letter with my dean and with my colleagues, I was told that Penn State would not consider reverting to the bad old days in which assistant professors without single-authored books were considered for tenure -- but that the College of Liberal Arts would provide $10,000 in start-up costs to every newly hired junior faculty member, to be used for (among other things) book subventions. Across the country, however, the subvention suggestion drew a good deal of criticism. For some observers, it smacked too much of vanity publishing: If we are now in the position of paying presses to publish our work, critics cried, then surely this is a sign that our work is worthless and that the once-high scholarly standards of the discipline had been eroded by feminism and postmodernism and cultural studies and queer theory and Whatever Else Came to Mind Studies.
Remarkably, these critics did not stop to reflect on the fact that scholarly monographs have never sold very well and were kept alive only by the indirect subsidies thanks to which university libraries were able to purchase large numbers of new books. Since new monographs were no longer subsidized by academic library purchases, the MLA argued, it only made sense to support the production of monographs some other way -- particularly since many of the least “popular” monographs are produced not in the fields of queer theory and cultural studies but in medieval studies and foreign languages, fields whose precarious place in the system of academic publishing can hardly be blamed on their trendiness.
Likewise, many departments balked at the idea of “lowering” their tenure standards by relying on modes of scholarly production other than monographs -- things like journal essays, scholarly editions, translations, and online publications. Any move away from the monograph, these critics argued, would necessarily involve a decline in scholarly quality. This argument, it seems, is quite common among professors in the modern languages. It is also quite strange. Less than 30 years ago, the monograph was generally not part of the tenure-and-promotion apparatus: The book-for-tenure criterion is a recent blip in our history. And most academic disciplines, from sociology to linguistics to anthropology to philosophy, do not require books for tenure; yet tenure committees in those disciplines somehow remain capable of distinguishing excellent from mediocre scholarship.
The anecdotal information was piling up, and so were the critiques and countercritiques. The MLA wanted to figure out what was really happening, so the Executive Council created a Task Force on Evaluating Scholarship for Tenure and Promotion in 2004. We spent two years sifting through evidence, statistical and anecdotal; we commissioned an unprecedented study of the tenuring practices of 1,339 departments in 734 different institutions over the past 10 years; we read studies and reports on tenure and the production of scholarship over the past 40 or 50 years; and almost to our own amazement, we completed our report on schedule earlier this year.
The survey contains good news and bad news: The good news is that there is to date no “lost generation” of young scholars whose careers have been thwarted or blighted by the system of scholarly publishing. Tenure rates since 1994 have not changed appreciably, even as many institutions have demanded more published work for tenure and promotion. But there are other factors at work, long before the tenure review. MLA studies of Ph.D. placement show that no more than half, and often fewer, of any given year's Ph.D.’s are hired to tenure-track positions in the year they receive their degrees. Information is sketchy for career paths beyond the first year, but what information is available suggests that, on average, something on the order of 60 to 65 percent of all English and foreign language Ph.D.’s are hired to tenure-track positions within five years of receiving their doctorates and an estimated 38 percent are considered for tenure at the institution where they were hired. Of those 38 percent, 90 percent -- or 34 of every 100 doctoral recipients -- are awarded tenure. In other words, for a variety of reasons, many scholars simply drop off the tenure track long before they are reviewed for tenure and promotion; most of the people who stick it out do so in the belief that they have met the requirements. One might say that the tenure and promotion glass is 90 percent full -- or 66 percent empty, thanks to all the attrition along the way. But it seems clear that the people who are considered for tenure today have become so accomplished at meeting expectations that by the time they are reviewed, they are ready to clear almost any bar, no matter how high it is set. Thus, even as the system of scholarly publishing remains distressed, the scholars themselves seem to be finding ways to cope.
On the other hand, and this is the bad news, their coping mechanisms -- or, rather, the disciplinary practices that produce them -- seem to be rendering the system dysfunctional in important ways. For one thing, the press directors and librarians are not wrong: Even though the campuses are not strewn with the bodies of young scholars turned down for tenure, the system of scholarly publishing is under severe financial pressure, and no one imagines that library and press budgets will be increasing significantly anytime soon. New monographs in the humanities now face print runs in the low hundreds and prohibitive unit costs. At the same time, over 60 percent of all departments report that publication has increased in importance in tenure decisions over the last 10 years, and the percentage of departments ranking scholarship of primary importance (that is, more important than teaching) has more than doubled since the last comparable survey was conducted in 1968: from 35.4 percent to 75.8 percent. Almost half -- 49.8 percent -- of doctoral institutions (which, because of their size, employ proportionally more faculty members than any other kind of institution) now require progress on a second book of their candidates for tenure.
So expectations are indeed rising, and most scholars are rising to the challenge. What’s the problem?
The problem is not simple. For one thing, departments are increasingly asking for books from junior professors without providing them the time to write books. It’s no surprise that 88.9 percent of doctoral institutions rate the publication of a monograph as “important” or “very important” for tenure, but it might be something of a shock to learn -- it certainly was a shock to us -- that 44.4 percent of master’s institutions and 48 percent of baccalaureate institutions now consider monographs “important” or “very important” as well. At the same time, 20 percent to 30 percent of departments -- at all levels -- consider translations, textbooks, scholarly editions, and bibliographic scholarship to be “not important.” When it comes to the digital age, most doctoral departments are largely clueless: 40.8 percent report no experience evaluating journal articles in electronic format, and almost two-thirds (65.7 percent) report no experience evaluating monographs in electronic format. This despite the fact that the journal Postmodern Culture, which exists only in electronic form, has just celebrated its 15th birthday. Online journals have been around for some time now, and online scholarship is of the same quality as print media, but referees’ and tenure committees’ expectations for the medium have lagged far behind the developments in the digital scholarly world. As Sean Latham, one of the members of the Task Force, said at the 2005 MLA convention in Washington, “If we read something through Project Muse, are we supposed to feel better because somewhere there is a print copy?” For too many scholars, the answer is yes: The scholarly quality of the .PDF on your screen is guaranteed by the existence of the print version, just as your paper money is secured by the gold of Fort Knox.
The Task Force report recommends that departments and colleges evaluate scholarly work in all its forms, instead of placing almost exclusive emphasis on the monograph. We have nothing against monographs; in fact, a few of us have written monographs ourselves. But our survey suggests that an increasing number of institutions expect more publications for tenure and promotion -- and substitute measures of quantity for judgments about quality. Most important, we believe there is a real and unnecessary disjunction between the wide range of scholarly work actually produced by scholars in the modern languages and the narrow way in which it is commonly evaluated.
We hope it will surprise some people that our recommendations go well beyond this. We attempted to review every aspect of the tenure process, from the question of how many external letters are too many (in most cases, more than six) to the question of how to do justice to new hires who change jobs at some point during their time on the tenure track or who are hired to joint appointments (with explicit, written letters of expectation stating whether and how each candidate’s work at other institutions or departments will be considered). We have recommendations for how departments can conduct internal reviews, so that they are not quite so dependent on the determinations of referees for journals and university presses; recommendations for how to evaluate scholarship produced in new media; and -- though we acknowledge that it’s just beyond our reach -- a recommendation that graduate programs in the modern languages begin deliberating about whether it is a good idea to continue to demand of our doctoral students a document that is, in effect, a protomonograph waiting for a couple of good readers and a cloth binding.
And though our report is complete and (we like to think) comprehensive, we know there is plenty of work left to do. The Task Force believes that the tenure system needs careful scrutiny at every level. Perhaps most important, we need to recognize the fact that two-thirds of college professors in the United States now teach without tenure (or hope of tenure) -- that may well be the “lost generation” on our campuses today -- and that there are few avenues available for the evaluation of their scholarly contributions to the profession. We wrote the report, finally, with multiple audiences in mind -- younger scholars, department chairs, and tenure committees, of course, but also upper-level administrators, graduate students, and the higher education press as well. We hope that all these audiences will find something of value in the report -- and will try, in whatever ways possible, to work with the MLA to implement the Task Force’s recommendations.
Many recent denunciations of Edward Said’s Orientalism are probably best ignored. After all, a stone-throwing incident hardly provides adequate grounds for criticizing one of the most influential books in the humanities published in recent decades. Said, who at the time of his death in 2003 was a professor of comparative literature at Columbia University, was an extremely tenacious and vocal supporter of the Palestinian nationalist cause. It gave even his scholarly work a degree of fame beyond the academic world. Looking over references to Orientalism, it is often clear that many of those ostensibly discussing Orientalism are actually much more concerned with that famous picture of Said at the Lebanese border in 2000, hurling a protest at the Israeli army.
First published by Pantheon in 1978 and eventually translated into some three dozen languages, Said’s book was an ambitious effort to use concepts from 20th century cultural theory to scrutinize the way Western academics and writers understood “the East” during the era of European imperial expansion. Said treated Western literature and scholarship as an integral part of the process of absorbing, assimilating, and policing the colonial Other. That interpretation is now often taken more or less for granted in some parts of the humanities.
Not that it has been immune to serious criticism -- including the very sharp take-down in Aijaz Ahmad’s book In Theory: Classes, Nations, Literatures (Verso, 1994), which accused Said of helping to foster “postcolonial studies” as a form of pseudo-political academic politics. Another critique, of a different sort, appears in a new book called Dangerous Knowledge: Orientalism and Its Discontents (Overlook Press), by Robert Irwin.
A novelist and translator, Irwin has also taught medieval history at the University of St. Andrews. Unlike the neocons for whom Said-bashing is something of a sport, Irwin is sympathetic to Said’s political commitment, and praises his effort to defend the Palestinian cause in a hostile environment. “American coverage of the Middle East and especially of Palestinian matters,” writes Irwin, “has mostly been disgraceful -- biased, ignorant, and abusive.”
But Irwin regards Said’s interpretation of the history of Orientalism as unfair and, at times, lightly informed. Irwin writes less like a polemicist than a don. He quotes a passage in which Said -- commenting on the state of Middle Eastern studies in the 12th century -- stretches his erudition a little thin by referring to “Peter the Venerable and other Cluniac Orientalists.” You can almost see Irwin’s eyebrow arch. “Which other Cluniac Orientalists?” he asks. “It would be interesting to know their names. But, of course, the idea that there is a whole school of Cluniac Orientalists is absurd. Peter the Venerable was on his own.”
Beyond catching Said in various misstatements, Irwin’s argument is that the field of European research into Middle Eastern language, culture, and history was by no means so tightly linked to Western imperial ambitions as Orientalism suggests. He is also very skeptical of the value of analyzing Orientalist scholarship alongside Western literary texts devoted to the East -- evading the distinctions between kinds of texts by treating them all as manifestations of a colonialist discourse.
Said, as literary theorist, was prone to the sweeping generalization. Irwin, as historian, is the partisan of noisome little facts. I suppose his criticisms of Said will be well-received by some people in neoconservative circles -- though only if they ignore Irwin’s palpable disdain for their policies. (Neocons also have reason to be wary of someone so insistent on factual accuracy, of course.) He agreed to answer a few questions by e-mail. Here’s a transcript of the discussion.
Q: Some criticisms of Orientalism use it as a pretext for denouncing Said's politics. In other cases, the complaints are more methodological -- pointing out that Gramsci and Foucault can't really be fused the way he tries to do, for example. Your approach is different. You share Said’s perspective on the Palestinian cause, and you don't seem all that interested in nuances of theoretical pedigree. So would you explain the source or motivation of Dangerous Knowledge? What made a criticism of Orientalism seem worthwhile or necessary enough for a book of your own?
A: Several things are relevant here. First I thought that Said had libeled several generations of Orientalist scholars, mostly good and honourable men, in some cases even saintly, if often a bit cranky. Since Silvestre de Sacy and Edward William Lane could not reply -- or, better yet, sue for libel -- I thought I should take on the job. I also got irritated with people who thought that my researches in the Mamluk Sultanate in late medieval Egypt and Syria necessarily had some sort of sinister agenda. Even my wife seemed to favour this notion.
Secondly, there was no history of Arabic studies except Johann Fück's Die arabischen Studien in Europa bis in den Anfang des 20. Jahrhunderts, but that was out of date, a bit dull and in a language most Anglophone students do not read.
Thirdly, I used to be an academic. I am interested then in how knowledge and intellectual skills are and were transmitted. I am very conscious of being part of a chain of historians. I did history at Oxford and I was taught by brilliant dons with strongly contrasting personalities and methodologies. So it is not that I think Orientalists never have agendas. Most of them do, but it is precisely the variety of those clashing agendas that is so interesting.
Finally I got irritated by the way some people in Eng Lit departments seemed to regard themselves as adversarial saints, robed in white and “speaking truth to power” because they read Conrad, Austen and Flaubert in strange ways. Whereas academics who read Masudi, Tabari and Ibn Khaldun were necessarily robed in black.
Q: Said's notion of Orientalism treats it as both a source and a consequence of Western imperialism. It's a more or less massive, homogenous, cohesive body of "power/knowledge" that both constituted "the East" as something that could be understood and colonized AND functioned as a well-entrenched part of the Western imperial machinery. Your account is quite different: the Orientalists you portray are, often enough, scholars who were quite minor figures within the academic and political establishments of their day. Some seem marginal to any institutional "power/knowledge" formation one could care to identify. So what's Said doing? Is it just anachronism -- seeing them as if they were part of a contemporary think tank?
A: To point out the obvious, Said was not a historian. He had no idea then how very few universities there were in the 17th, 18th and 19th centuries -- and most of those that did exist were in Germany. Secondly, he obviously thought that these universities had flourishing Arabic or Middle Eastern departments, so that, for example, British academic orientalists were the primary audience for Lane's Manners and Customs of the Modern Egyptians. This flatly is not so. If there were as many as half a dozen academic Arabists in Britain in the early 19th century I would be astonished. Without having sat down to work it out, I would be mildly surprised if there were as many as two. So yes, Said anachronistically foisted 20th-century university politics on earlier centuries.
Until, say, the 1940s, Orientalists were overwhelmingly interested in religious, mostly Biblical, issues and philology. Even when Britain and the US did begin to think of Orientalism as a potential handmaiden of Empire, the academic Orientalist temperament was such that they obstinately continued to study and teach cuneiform, pre-Islamic poetry and Abbasid history. And of course, Britain's empire in the Middle East was a short-lived thing. It was falling to pieces in the 1960s.
Imperialists did not read books by Wellhausen or Muir. They read rattling good yarns about soldiers of fortune, tiger-hunting, pig-sticking or polo playing -- unless they were seriously cultured, in which case they read Homer, Tacitus, and Gibbon. And reading the last of these must have given them premonitions of the decline and fall of the British Empire.
Q: Does that mean you see no connection at all between Orientalism of the old school (so to speak) and the later "area studies" work funded by Western governments? Do you see Said's book as having any critical value, at least for raising questions about the vested (geopolitical) interests sometimes at work in scholarship?
A: It is good to be alert to geopolitical interests behind university programmes, but the individuals Said attacked were not in area studies, conflict studies, development studies, or terrorism studies. That was after their time. The sort of Orientalism Said was attacking was already passing away (sadly).
And, in Britain at least, there is not much enthusiasm for funding Oriental studies. Departments are being closed down. It is even possible that Said's book may have contributed to their being discredited and then closed down. No one in British government is prepared to confess that we are making a real mess of what we are doing in Iraq and Afghanistan and therefore we should train up some proper experts. Politics is too short-term for that.
Plenty of people had raised questions about the agendas of Orientalism before Said ever got to work. Some of them were Marxists, some were Islamicists, one was Bernard Lewis. The annoying thing about Said was that he wanted a debate based on false factual premises. Of course, there are vested interests in scholarship, but, for God's sake, if one is looking at vested interests in Arabic and Islamic studies, most of the 'vesting' comes from Saudi Arabia, the Gulf States and Brunei with the establishment of chairs and lectureships which are implicitly circumscribed in what kinds of research they can initiate and publish.
Above all, it is a great waste of time attacking British, French, American, and Israeli scholars of Arab and Islamic culture. The people who should be attacked are Senators, MPs, Israeli generals, arms merchants, media hacks, etc. The academic dog fight is a fantastic diversion from the real horrors of what is happening in the Gaza Strip, the West Bank and Lebanon. If one is serious about politics, the Orientalism debate is an intellectual substitute for engaging with real, non-academic issues.
Q: You note that the initial scholarly reception for Orientalism was quite critical. But the book went on to have considerable influence. How do you account for that?
A: The earliest reviewers were mostly people who knew a lot about the actual state of the field. The enthusiasts who came later did not know the field and were mostly too lazy to check Said's assertions. The book, by “speaking truth to power,” appeals to the adversarial mentality so common among students and radical lecturers.
Bashing Orientalism has seemed to be a natural intellectual accessory to opposing Israeli policies on the West Bank and Gaza Strip, American imperialism and British racism. It is much easier to deliver patronizing lectures or essays about old-fashioned Orientalists than it is to actually do anything useful for Palestine (or, come to that, actually learn Arabic and become the “right” kind of Orientalist, whatever that would be). Also, for many students, Said's book, with its references to Foucault, Gramsci and Althusser, must have provided them with their first brush with critical theory. Exciting stuff.
Q: Much of the recent criticism of Orientalism -- at least in the United States -- treats Said as having been only too successful at severing any connection between scholarship on the Middle East and the needs of the American foreign policy establishment. You clearly aren't a supporter of the latter. But what do you make of that criticism? Is it fair? And do you have any concern about your critique of Said proving useful to, say, the neoconservatives?
A: Martin Kramer's book Ivory Towers on Sand has attacked U.S. Middle East and Islam specialists for having been useless as advisors of government and bad too at predicting what was going to happen next in the Middle East. While I enjoyed some of his criticisms, I am unsympathetic to this point of view. I do not think academics should serve as handmaidens to the State Department. It distorts research agendas and findings. Journalists, politicians and astrologers can try their hand at predicting the future. There is nothing in the history of historiography to suggest that historians (still less specialists in Arabic philology or Islamic prayer ritual) would be particularly good at that game. Research should be guided more by intellectual curiosity than by government funding.
As far as the current disaster in Iraq is concerned, the overwhelming majority of Orientalists -- more than 90 percent I guess -- were wholly opposed to the war. In Britain the expert advice that Blair asked for was entirely negative. Blair thanked them for their advice and then went his own infernal way. But, in general, it is quite rare for politicians even to go through the pretence of consulting Orientalists. And, in terms of the history of the field, this did not even begin to happen until the twentieth century and then a little more often after the Second World War.
Academics who suppose that the neocons in Washington were ever bothered by Said's book are living in fantasy land. (I feel ever so slightly tempted to join them in that territory as I fantasize about Dick Cheney and Donald Rumsfeld cracking open a bottle of champagne after they have read and enjoyed Dangerous Knowledge.)
Said wrote lots of valid political polemic, but Orientalism was a rotten book and it was inevitable that someone should blow a whistle on it. It has been a delusive distraction. It converted real political and social issues into a campus dog-fight. Soothing displacement activity indeed.
As to whom my book may be useful, Bishop Joseph Butler in the 18th century made the following observation: “Things and their actions are what they are and the consequences of them will be what they will be: why then should we wish to be deceived?”
Submitted by Amy Wink on December 21, 2006 - 4:00am
In the waning days of autumn, as the light moves further from us, color rises in the trees if the nights are cool and the days clear. One autumn, when I lived in Kansas, the silver maple outside turned such a brilliant shade of golden yellow that my whole living room glowed in its radiance. Every afternoon for three weeks, I was awash in indescribable color and the memory of yellow warmed me through winter. Though we don’t get that kind of autumn display in Austin, I can still revel in the colors we do have: the waving hillsides of pale mauve grasses, the jade of Austin’s river-lake, the deep bright blue of the late November sky. Perhaps these colors become more vibrant in the days of retreating light, as we move toward the solstice, toward the moment when it seems the sun may not return, and we celebrate to draw it back.
I have built a life around words, teaching literature and writing, to support my own writing. I can lose hours in the thesaurus or dictionary, distracted by nuanced and intricate meanings. My students know I can be equally engaged, and distracted, as I teach a poem or any example of language inventively expressed. I recently burst out with delighted laughter when a colleague described the process of “sales” as “buying facilitation," happily pondering the tiny shift of meaning in those words. But I often turn wordless during the dreary days of our winter, as a field lies fallow, resting before the spring. In my head, the poetry of the writers’ saint, John the Apostle: “and the light shineth in darkness, and the darkness did not comprehend it.” Waiting, I play in color, the play of light.
As our trees’ show of autumn color ends and the cedars rust with pollen, I look for natural color where it does not fade. As my black cat stretches under errant sunbeams, his coat glitters like a prism, becoming all colors other than black. In my horse’s plush auburn coat glint copper, gold, and brass. The subtleties of more intricate design appear in the feathers of the chickens my friends keep. Red Joe, the named rooster, bears a wondrous palette of russet and chestnut, with green-black tail feathers. Looking closely, I see the tiny dart of green centered in each feather draping his neck, a detail lost on less observant eyes. His flock of hens is no less brilliant -- amber, wheaten, and russet, and the solitary Barred Rock a simple black and white, until the light hits her and she shimmers with serpentine green. Even while shopping, abundant color dazes my distractable eye, from the vegetable stacks of crook-neck squash, and bell pepper, to the subtle shadings of apples in reds, golds, and greens. I find myself standing in a stupor of color.
Eventually, I will be saturated and begin comprehending the light in new ways, moving from observing color to creating with color. Instead of writing, I start to conceive visually, exploring color and light, and exercising my creativity by expanding the territory in which I create. This play is essential to my writing precisely because it is play. There are no goals or desires other than creating visual beauty. Creating visual art employs a whole new portion of my brain and while my writer’s mind rests, my creativity is strength-training. Because I am not a visual artist, my play can produce whatever work I like, as simple as watercolor sweeps on paper. If I open a box of Crayola Crayons®, the scent returns me to a primordial state of exploration and creative chaos. With crayons, there is no grammar.
In this play, I achieve the “flow” Mihaly Csikszentmihalyi notes is an essential aspect of creativity, just as I also do in my writing. “Flow” is particularly powerful when I work with shards of cathedral glass, designing an ornamental piece like a puzzle of color and shape. Concentrating intently on how the pieces fit, the shapes I must cut to place in the design, how the colors balance and work together, I can lose myself for two or three good hours in each piece, and when I finally look up, my play has produced a piece to capture the light shining in darkness.
Like light refracted in a prism, separated into the different lengths of colored beams, delving into the visual is a way I can refract creativity. The long blue beam of my writing is complemented by the array of other colors, other expressions of creativity that balance and enhance my work by allowing me to explore new ways of seeing and re-creating the world in which I live. To enhance their art, painters might dance, musicians might paint, writers might sculpt, and then bring all those shades of creativity back to the art of their choosing. After my winter play, my words are strong and vibrant, rested and basking in the return of the strengthening sun, ready for the work of writing, but my crayons also stand ready.
Amy Wink teaches at Austin Community College. She is completing her second book, Their Hearts’ Confidante: The Diaries of Henrietta Baker Embree and Tennessee Keys Embree, 1853-1884, for University of Tennessee Press.
The expression "Internet year" refers to a period of about two or three months -- an index of the pace of life online, in what the sociologist Manuel Castells has called the "space without a place" created by new media.
That means a decade has passed since Inside Higher Ed made its first appearance at the Modern Language Association, during the 2004 convention held in Philadelphia. So next week is a kind of homecoming. I'll be in Philadelphia starting on Tuesday and will not return home until sometime late on Saturday -- and hope to meet as many readers of Intellectual Affairs as possible along the marathon route in between.
The whole "space without a place" quality of online experience can, at times, prove more anomic than utopian. So here’s a thought: Inside Higher Ed will have a booth (#326) in the exhibit hall. I'll be there each afternoon between 2 and 4. Please consider this an invitation to stop by and say hello.
Tell me what you’re reading lately.... What sessions have blown your mind, or left you cursing under your breath.... Whether you think the report on tenure is going to make any difference or not.... What magazines or journals or blogs you read that I have probably never heard of....
And, by the way, if I ask you if you’ve heard any really interesting papers during the week, please don’t then go, “OK, what’s hot nowadays?” If I want to know what’s hot, I’ll go ask Paris Hilton. This peculiar insistence on mimicking the ethos of Hollywood (talking about “academostars,” “buzz,” hunting for the “hot new trend,” etc.) sometimes makes it seem as if Adorno was an optimist.
To put it another way: I’d much rather know what you’ve found interesting at MLA (and why) than hear you try to guess at what other people now think is exciting. Please come by the booth. But if you use the word “hot,” I hope it is only in the context of recommending someplace to get a burrito.
That sort of ersatz fashion-mongering is less a problem than a symptom. Lindsay Waters, the executive editor for the humanities at Harvard University Press, has been complaining for some time about the structural imperative for overproduction in some parts of the humanities -- a situation in which people are obliged to publish books, whether they have anything to say or not. And when scholarly substance declines as a definitive criterion for what counts as important, then hipness, hotness, and happeningness take up the slack.
“Few libraries will buy many of the books published now by university presses with booths at the MLA convention,” wrote Waters in an essay appearing in the May 2000 issue of PMLA. “Why should tenure be connected to the publication of books that most of the profession do not feel are essential holdings for their local libraries?”
He brooded over that question at somewhat more length in Enemies of Promise: Publishing, Perishing, and the Eclipse of Scholarship, a pamphlet issued by Prickly Paradigm Press a couple of years ago. You hear quite a few echoes of the booklet in the recommendations of the MLA task force on tenure. “Scholarship,” as the final report puts it, “should not be equated with publication, which is, at bottom, a means to make scholarship public, just as teaching, service, and other activities are directed toward different audiences. Publication is not the raison d’être of scholarship; scholarship should be the raison d’être of publication.”
Well, yes. But you’ve got the whole problem of the optative, right there -- the complex and uncertain relationship between “ought” and “is.” (Sorry, had a neo-Kantian flashback for a second there.) The real problem is: How do you get them to line up?
The task force makes numerous recommendations – some discussed here. I thought it would be interesting to find out what Waters thought of the report. “It does talk about a lot of the problems honestly,” he told me, “including the shift to part-time labor.” But his reservations seem a lot more emphatic.
“My fear for the MLA report,” he wrote by e-mail, “is that it will be shelved like the report of the Iraq Study Group. And there may be another similarity: The ISG made a mistake with Bush. They gave him 79 recommendations, not one. This report runs that risk, too. Like my Enemies book, the report offers up ideas that it will suit many to ignore.... Churchill said it so well -- the Americans will do the right thing only after they have exhausted all the other possibilities. The problem is that this relatively frail creature, the university, has survived so well for so long in the US because for the most part it was located in a place where, like poetry (to cite the immortal Auden) executives would never want to tamper. But they are tampering now. And they are using the same management techniques on the university that they used on General Motors, and they may have the same deadly effect.”
Worrying about the long-term future of the life of the mind is demanding. Still, you’ve got to pack your luggage eventually, and make plans for how to spend time at the conference. MLA is like a city within a city. No accident that the program always looks a little like a phone directory.
It contains a great deal of information – and it’s well-organized, in its way. But it can also be kind of bewildering to browse through. It seems like a salutary development that people have, over the past couple of years, started posting online lists of the sessions they want to attend. It’s the next best thing to having a friend or trusted colleague make recommendations. Here is an example.
If you’ve already posted something about your conference-going itinerary, please consider using the comments section here to link to it. For that matter, if you’ve noticed one or two sessions that you consider not-to-be-missed, why not say so? Consider the space below a kind of bulletin board.
One tip I hope you’ll consider (despite the beastly hour of it) is the panel called “Meet the Bloggers.” It is scheduled for Saturday, December 30th, at 8:30 in the morning. The list of speakers includes Michael Bérubé, John Holbo, Scott Kaufman, and the professor known as Bitch, Ph.D.
For abstracts, go here. I will also be on the panel, commenting on the papers afterwards. That is, assuming I can get an intravenous caffeine drip.
There is a nice bit of synchronicity about the date that the program committee scheduled “Meet the Bloggers.” For it will be the anniversary (second or tenth, depending on how you count it) of “Bloggers in the Flesh” -- an article that appeared well before anyone in MLA thought of organizing a panel on the topic.
A lot has happened in the meantime -- including a sort of miniature equivalent (confined entirely to academe) of what sociologists call a “moral panic.” For a while there, blogging became a suspicious activity that threatened to weaken your scholarly reputation, ruin your job prospects, and cause thick, coarse hair to grow upon your palms.
It all seems kind of silly in retrospect. No doubt the level of discussion will be much higher at the panel. I hope some of you will make it. But even if not, please consider stopping by to say hello at the IHE booth, any afternoon between 2 and 4.
I was in a white clapboard building recently, near one of the many railroad tracks that crisscross central Illinois. The building was part of one of several church properties in Champaign-Urbana and its neighboring towns. I was there to teach a class on modern American poetry at a 12-month Christ-centered substance abuse rehabilitation program. The table dominating the room was being cleared of lunch when I arrived. Most of the men introduced themselves with their full names as I walked around and greeted them, but once at the table they were Brother Jones or Brother Green. Then we sat down, 10 African-American men and me around a wooden seminar table with photocopies of the poems I had assigned. The coordinator of the class -- or reading group -- is a tenured faculty member at a nearby university. Next semester the project will be supported by a grant from the Illinois Humanities Council, but my time and that of the other teachers were volunteered and will remain so.
I was invited to teach one of the two-hour sessions by a colleague. Some months ago I explained that I would focus on African-American poems about religion, some deeply grounded in religious faith, others critical of organized religion. This debate about religion among African American poets has a long history, as I explained to the participants. It is deeply felt and surely one of the impressive legacies of the last hundred years of our literary history.
One of the men soon volunteered that some of the poems made him angry. I said that was exactly right. Some black American writers felt sustained by the church, others felt betrayed, but none were writing merely to reassure us. They wanted us to respond powerfully. We certainly did not have to agree with them. We could take up our place in the debate. I explained that many people assumed poetry was a much milder art form. Not so, I argued, and these poems proved the point. They compressed the writers' views and made them available to us in telling language. The group had read Langston Hughes' "Christ in Alabama" and "Goodbye Christ," Amiri Baraka's "When We Worship Jesus," and Carolyn Rodgers' powerfully pro-Christian poems "when the revolution comes" and "mama's god."
I pride myself in being able to enter these poets' worlds and embody their disparate convictions. But on this December day I did not have a chance. Fifteen minutes into the session the reverend arrived and pulled me aside:
Reverend: "I cannot have these men exposed to this language and these ideas."
CN: "I'm letting them enter into this long-running debate, and I'll be very positive about the pro-religious poems. Let me go through the poems for you and show you what I plan to say about them."
Reverend: "I don't care. These men cannot read things like this. They have to get grounded."
CN: "I'm sure they see much worse on television and saw much worse on the streets."
Reverend: "They only watch the programs I let them watch. They don't read newspapers. Tell me the role of faith in your life."
CN: "Well, I believe in the pursuit of justice and in human decency."
Reverend: "You're not really telling me about your faith."
CN: "I suppose not. Look, this is about academic freedom."
Reverend: "Not here."
CN: "I'm the president of the American Association of University Professors. We've defined academic freedom for nearly a hundred years."
Reverend: "Not here. I decide what gets taught. I approve what they read. I'm ordering you to leave the building."
Since it was a private facility I left as ordered. But the program is to be funded with public money, and the Illinois Humanities Council was assured free speech was guaranteed in the classes. It is not. Indeed others have suggested the students were under pressure not to disagree with church doctrine. This is precisely why the separation of church and state is established in the United States Constitution, though there is reason to doubt President Bush is comfortable with the concept.
Although it was humiliating to be ordered out of a class I was teaching, it was also instructive. Though this local minister was not quite a prince of the church, it was still my first experience of being silenced by church authority. I naively assumed that clearing my lesson plan with the course coordinator was all I needed to do to guarantee my freedom. I naively assumed, adapting Gertrude Stein, that a classroom is a classroom is a classroom. I've not been silenced before or had the experience of being thrown out of the classroom in nearly 40 years of teaching. Other faculty members are not so lucky. Many religiously oriented colleges and universities would never conduct business so crudely. But some do. Any doubters might begin by reading the AAUP's investigative report on Brigham Young University. That is why we remain vigilant.
The reverend made it clear -- though he didn't use the word -- that indoctrination had to precede exposure to the free market of ideas. Students had to have their responses preprogrammed before they could be allowed to encounter secular culture. My own view is that these men -- in their 20s, 30s and 40s -- could read Langston Hughes and still side with Carolyn Rodgers. She concludes one poem with the lines "when mama prayed, she knew who she / was praying to and who she was praying to / didn't and ain't got / no color." I wanted them to hear the lines read aloud and discussed, because they are lines every American churchgoer should hear. These are lines these men could use in their encounters thereafter. But academic freedom did not carry the day.
Cary Nelson is president of the American Association of University Professors and a professor of English at the University of Illinois at Urbana-Champaign.
The word “criticism” shares the same root as “crisis” -- a bit of fortuitous etymology that everyone in literary studies remembers from time to time, whether in the context of sublime theoretical arguments (interpretation at the edge of the abyss!) or while dealing with the bottom-line obstacles to publishing one more monograph. Not to mention all the “criticism/crisis” musing that goes on at this time of year as people finish their papers for MLA, sometimes with minutes to spare.
Once this season of crisis management is past, I hope readers will turn their attention to Geoffrey Galt Harpham’s new book The Character of Criticism (Routledge). Harpham, who is president and director of the National Humanities Center, offers a meditation on what happens (in the best case, anyway) when a literary scholar encounters a literary text. Most of the book consists of close examination of the work of four major figures -- Elaine Scarry, Martha Nussbaum, Slavoj Žižek, and Edward Said -- who bring very different methods and mores to the table when performing the critic’s task. The contrast between Nussbaum and Žižek, in particular, seems potentially combustible.
But the book is not a study in the varieties of critical engagement possible now, given our capacious theoretical toolkits. Harpham’s argument is that literary criticism is a distinct type of act performed by (and embodying) a specific type of agent. We don’t read criticism just for information, or to see concepts refined or tested. Criticism is, at its best, a product of “cognitive freedom,” as Harpham puts it.
“Interpretation represents a moment at which cognition is not absolutely bound by necessity to produce a particular result,” he writes, “...and this moment serves as a portal through which character, an individual way of being in the world, enters the work.”
In the week just before the MLA convention, I interviewed Harpham by email about his book -- a discussion that led, in due course, to asking him for his thoughts on the MLA's recent report on scholarship and tenure. A transcript of the discussion runs below.
But first I want to quote some favorite lines in The Character of Criticism. They appear in a section drawing out, at some length, the parallel between literary criticism and the kinds of responsiveness and responsibility before “The Word” one finds in, say, Saint Augustine.
“The act of writing a critical text,” as Harpham puts it, “reaches deep into oneself, testing one’s acuity, responsiveness, erudition, and staying power. But critical writing also tests attributes normally considered as moral qualities, including the capacity to suspend one’s own interests and desires and to make of oneself a perfect instrument for registering the truth of The Word.”
Easier said than done, of course. Harpham goes on to describe the obligations thus imposed on the critic, thereby fashioning a new identity in the process. Here’s a passage in a format suitable to be printed out, clipped, and posted near one’s computer monitor for sober contemplation:
“One must .... wish to be regarded as a person who can overcome insubordinate impulses, remove clutter and distractions from the field of vision, isolate the main issues, set aside conventional views, persevere through difficulties, set high standards, see beneath appearances, form general propositions from particulars, see particulars within the context of general propositions, make rigorous and valid inferences from concrete evidence, be responsive without being obsessive, take delight without becoming besotted, concentrate without obsession, be suspicious without being withholding, be fair without being equivocal, be responsive to the moment without being indiscriminate in one’s enthusiasms, and so forth.” --Geoffrey Galt Harpham
That final clause -- “and so forth” -- is really something. Talk about criticism and crisis! The prospect of adding more to that list of demands is either inspiring or terrifying, I suppose, depending on the state of one’s character....
Here's the interview:
Q: We use the word "character" as a way of talking about a fictive person. We also use it, when talking about real people, to refer to a definitive pattern of behaviors and attitudes (something durable, if not inflexible, about how they deal with other people). And then, of course, there's the old-fashioned, moralistic sense -- as in referring to someone "having character" or "being of weak character." When you write about the role of character in academic literary criticism, which of these usages fits best? Any secret yearning to be William Bennett motivating your work?
A: Since I’m talking about the character of criticism, your second version, the “definitive pattern of behaviors and attitudes,” is the most pertinent for my purposes. But the first usage, referring to fictive people, is also relevant, because fictive characters have to exhibit more consistency than real people, just in order to be recognizable from one textual moment to the next. I’m willing to entertain the possibility that the two are linked, that personal consistency is a self-imposed constraint or “fiction” that makes us recognizable to ourselves and others.
To me, the most powerful instances of criticism are those in which the drama of perception and understanding, which is also a moral drama in the broadest sense, is somehow visible in a shadowy way, encoded or encrypted in the critical text. I’ve always been struck by the fact that the criticism that impressed me most deeply managed to suggest an intimate encounter, even a kind of wrestling, between a strong, committed, informed, and responsive mind and a cultural text that probed and tested that mind, revealing its powers, limitations, and dispositions -- in short, its character. Part of the character of criticism is its capacity to reveal the character of the critic, even in ways the critic has no knowledge of. In fact, I think that criticism is, or can be, one of the most interesting ways of manifesting character.
Any yearning I had to be William Bennett was more than satisfied when I became president of the National Humanities Center: He was one of my four predecessors, before he went to Washington to serve in the Reagan administration. He is, however, interesting in terms of all three of your definitions of character. Because he does not display consistent behaviors (scolding people about their lack of moral strength on the one hand, compulsive gambling with horrific results in Vegas on the other), he has come to be seen as a kind of “fictive person,” one that exists only in books -- his books. Some people, inspired perhaps by those very books, might draw old-fashioned moral conclusions.
Q: Your first chapter has a long section describing a sort of ideal-typical "critical character" (so to speak) through an account of the act or process of critical writing as testimony to the power of a definitive encounter with a text. It’s powerful. But it’s also utterly inapplicable to an awful lot of critical prose one comes across, whether in academic books or journals or at sessions of MLA. The tenure-driven critical encounter often seems like an effort to apply some exciting new theoretical gizmo to a problem that would otherwise be uninteresting except as an occasion for trying out said gizmo. Or is that completely wrong? Is criticism as vocation (the response to a call) actually surviving amidst all the so-called "professionalization"?
A: I agree that the optimal “critical character” is rare, and for good reason. First, one has to be not only a critic, with a certain kind of education and professional opportunities, but also an unusually interesting person, one whose responses to the world are consistent, valuable, and meaningful, significant in a larger sense because they seem to proceed from some set of commitments and convictions rooted in human experience. Then, one has to be willing and able to expose oneself to a text, to respond without defensiveness, to be alive to a challenge. And lastly, one has to be able to write in such a way that both adheres to professional decorums and does something more by giving the reader some sense of the experience of coming to grips with an object of great significance and value.
In addition to the critics I discuss in my book (Scarry, Nussbaum, Žižek, Said), I can think of a number of others, but really, it’s a wonder anybody can do this. Much of what goes on in the world of literary studies (including gizmo R & D) supports the very best work by providing a professional context for it. Such work can be honorable without being heroic; it can, of course, also be neither. But the best work is done by those who are personally invested in it. I think if more people felt this way about criticism, their work and even their careers would profit and the whole field would be more interesting.
I have learned a great deal about the profession of literary studies from studies of professionalization, but I do not think that criticism benefits from a heightened awareness among critics of their status as professionals. It’s a difficult situation. As marginal and undervalued as literary scholars are at most colleges and universities, they need to develop their own credentialing structures just to keep their sense of dignity intact. But nothing kills the authentic spirit of criticism faster, or deader, than a consciousness of one’s own professional circumstances. Criticism is a professional discourse, but the sternest test of criticism is whether it can communicate even its most refined or challenging thinking in the vernacular.
I would not call criticism a vocation in the Weberian sense; nor would I call it a calling, as if it were a summons you could not refuse without disgracing yourself or violating your own deepest nature. But the greatest critics, the ones who animate and advance the discussion, do seem to have a certain need or urgency to communicate in this form that comes from within.
Q: Well, I want to challenge you a bit on part of that last answer. "Criticism is a professional discourse," you say. But that calls to mind R.P. Blackmur's statement to the contrary: his definition of criticism as "the formal discourse of an amateur." He meant, among other things, that the critic's role was connected pretty closely to the activity of the artist -- that it is a loving ("ama-teur") participation in the making and assimilation of literary form (even if at a certain, well, formal distance). Besides, the idea that there is anything particularly academic about literary criticism is a very recent development in cultural history. In 1920, an English professor who wrote criticism was doing something a little undignified and certainly "unprofessional." So how is it that all of this has changed? Or has it? If asked to name a recent critic whose work really manifested a strong sense of character as you've described it, I'd tend to think of James Wood, who's never been an academic at all.
A: I'll push back a bit on that one, even if it forces me to defend what I have just criticized. Blackmur began his career of poetry and editing in the 1920's; his critical career was finished over a half-century ago. And he was unusual even in the company of amateurs that dominated the literary scene at that time in that he did not have a B.A. Moreover, at the same time as Blackmur was advocating critical amateurism, John Crowe Ransom was writing "Criticism, Inc.," an early manifesto for professional academic criticism (1938). So even in Blackmur's time, his position was not the only, or even the dominant, position being enunciated.
I doubt that most people today would find criticism written in 1920 particularly interesting unless it was written by T. S. Eliot. Come to think of it, with the exception of Eliot's The Sacred Wood, I don't know of one durable, much less memorable piece of criticism that appeared in that year. Modern literature (post-Wordsworth) was not taught in universities, and criticism was necessarily confined to newspapers and journals like Hound and Horn, Blackmur's journal. The total situation today is different, and I don't think that we get a purchase on the present by reminiscing about the old days. Nor is James Wood an argument on your side. He is comfortable outside the academy, as is Louis Menand. But today, they're both at Harvard, Wood in a non-tenure-track position. They are part of the reason that (I contend) Harvard has, right now, the greatest English department ever assembled.
Universities provide jobs and -- in the case of Harvard -- ask little in return. Of course, the university does determine, in large ways and small, what goes on in criticism. Still, precisely because so little is explicitly demanded, an individual critic should find it possible to cultivate that "ama-teur" orientation that -- as I gather you feel -- is the precondition of character in criticism. If it were impossible, I would expect and even hope that talented young people would leave the profession (as I'll call it) in droves.
Q: The question of what counts as scholarship, and how it gets counted, is very much in the air now, given the recent MLA task force report. The four figures whose work you examine in The Character of Criticism (Elaine Scarry, Martha Nussbaum, Slavoj Žižek, and Edward Said) have produced work in the usual venues and formats of scholarly publication. But all of them have been active in other ways -- through public-intellectual commentary, but also as activists, at least to some degree. Can you draw any lessons from their examples that might be useful now, as other critics try to figure out how to respond to the felt need to change the circumstances of academic work?
A: This question approaches some very swampy ground, and my response may not get us on dry land altogether.
One easy response to the general problem you describe would be to declare that "the circumstances of academic work" have already changed, and that blogging, chatting, intervening in online discussions, and "public intellectual commentary" conducted in non-academic forums should be recognized by promotion-and-tenure committees as valid academic work, to be considered alongside books and articles in scholarly journals.
Even though this, too, is an easy response, I disagree. Universities pay you to do university work and they are not obliged to accept just any view of what counts. And, as an abstract proposition, it is important, both to oneself and one's readers, that one has established one's scholarly credentials before one weighs in. I say "as an abstract proposition" because I'm all too aware that our credentialing procedures, even at the very best universities, are, shall we say, non-ideal. But in theory the discipline and skills acquired in the course of mastering a certain body of knowledge and finding one's voice in an established discourse serve one very well. None of the people I discuss in my book were public intellectuals at the beginning of their careers, with the exception of Žižek, who was operating in a very different environment. Nor, for that matter, were Noam Chomsky, Stanley Fish, Walter Benn Michaels, Skip Gates, Paul Krugman, or even Michael Bérubé.
One may think that it's stifling to insist that gifted young people hold their tongues until they prove themselves to their elders, but I don't see it that way. They aren't holding their tongues; they're doing what they were hired to do, and what they presumably love doing; and in the process they are preparing themselves so that if and when they do speak out on public matters in a public forum, they speak with an authority gained over years of reflection on the archive of human creative accomplishment. A distinguished professor, enraged, is a force to be reckoned with.
I know that the real effects of tenure, from an institutional point of view, are to depress faculty pay and encourage people to serve on committees. But among its side effects is a certain measure of protection for people who exercise their freedom of speech in oppositional ways. In fact, I think that tenure imposes a certain burden on one's conscience to do what one can when the situation calls for action.
Q: OK, but the new venues and potentials for digital publication represent only one part of the changing circumstances in academic work. The task force addressed the larger question of what kinds of scholarly activity count for tenure. Any thoughts on the rest of the report?
A: I've thought about tenure a good deal, especially in 2000-1, when I headed a university-wide committee on faculty evaluations and rewards at Tulane. Tulane was a perfect place for this debate to take shape because it was not an elite institution, but routinely compared itself to Brown, Northwestern, Emory, Rice, and Vanderbilt. In other words, faculty were encouraged to think of themselves as serious researchers, even though most of them were not -- if they were, the comparisons would have been more realistic.
What I found over the course of that year and a half was that the contemporary debate on tenure was being driven by a variety of forces, including state legislatures hostile to academia in general, conservative academics hostile to elite institutions, high-powered researchers at those very elite institutions, and a great many ordinary academics who were doing lots of committee work and teaching and wanted to be recognized, with promotions and salary increases, just like those who were publishing regularly. "Flexibility" was the key phrase: universities were encouraged to reward flexibility, as individuals realized themselves in their various ways. Our committee found several problems associated with "flexibility," each one of which we considered insurmountable.
The first was that it granted extraordinary powers to department chairs to work out individualized agreements with faculty members, and that was a recipe for corruption and cynicism. Second, it eroded faculty governance by making department chairs into members of the administration, rather than volunteers arising within the faculty. Third, it meant that the rank of professor at an AAU, Carnegie I institution would not mean anything in particular, and that would lead to a loss in status for all.
In principle, I was not opposed to "flexible" rewards for faculty, but I thought that each institution had to decide what it wanted to be, and how its faculty should be expected to think of themselves. At the top research universities, flexibility is a very bad idea: All faculty should be seen as having jumped over the same bars. At flagship state institutions, it's still a bad idea. But from there on down -- and at Tulane, one of the questions we had to face was exactly where we stood -- the issue was not so clearcut. Many colleges and universities may wish to reward superb teaching or loyal service to the institution with rank and salary increases.
The MLA recommendation that speaks most clearly to this issue is the one about the "letter of understanding" that institutions should issue to their faculty, outlining the expectations. But such explicitness would cause as much grief as it alleviated. It's a buyer's market for faculty, so lower-down institutions have a realistic chance to staff their faculties with Ph.D.'s from top-tier universities, and many do. These young stars may arrive still thinking of themselves as eminent-scholars-in-the-making. If they were given an official document stating that they were not to think of themselves in that way, it would have a demoralizing effect on them, their colleagues, and their students; it would be seen as a way of capping aspiration and upward mobility, and that would be inconsistent with the very idea of higher education.
If the letter of understanding outlined strict requirements for tenure and promotion, it would encourage precisely the wrong state of mind (checking the boxes) for real scholarship or intellectual inquiry. And if it said that there are many excellent self-realizing things you can do to be rewarded, then it would in effect abandon the very concept of "standards," and that, too, would be destructive.
Q: Whatever its potentially morale-killing effect, the "letter of understanding" would at least be explicit. Do you have an alternative in mind?
A: In a sense I do.
Each institution has to come to a rough understanding of itself, leaving enough room for anomalous individuals to be judged on terms appropriate to their contribution. I'm afraid there is no substitute for the act of judgment exercised case by case by people who are presumed to be competent. Though that presumption can be challenged in individual instances, it must be maintained, because it and it alone ensures faculty governance.
I speak from experience here. I -- like Martha Nussbaum, Louis Menand, M.H. Abrams, and many others -- was denied tenure (many years ago, at Penn), so I know how difficult it can be to maintain one's faith in the competence and judgment of one's betters. But the experience builds and tests character. Which is where we began, isn't it?
(A number of Harpham’s recent papers -- several of them overlapping with the themes of his new book – are available here.)