Amelia, a university sophomore, scored a 60 on her first academic paper. On her second she scored a 60 again. On her third paper she pulled up to an 80 -- mostly due to extensive rewrites. Yet on her midterm and final, she received an astounding 90 and 85. Not only were her paragraph structure and use of quotations significantly better, but her ability to sequence ideas and support claims had taken a leap. Even her mechanics (grammar, sentence structure and punctuation) had improved.
I'd like to say that these two high scores came at the end of the semester; this would prove what an effective instructor I was. Instead, they came at odd times -- the first A came just after the second paper (which scored a D). The solid B paper did come at the end of the semester. The difference was in how the papers were produced. Both the 90 and 85 papers were handwritten in-class timed essays that constituted the midterm and final. The much lower scores were for computer-generated papers that she produced out of class. These, of course, could be rewritten over and over before the due dates.
I'd like to say that Amelia's experience is an anomaly. But I can't. In fact, this semester, 8 of my 20 sophomore English composition students scored significantly better on in-class essays written by hand in a timed situation. Some jumped more than a full grade level. In my three freshman composition classes, almost 20 of 60 students excelled when allowed to write in class rather than compose typed papers on their own time. In fact, at a large community college in California where I taught for six years, I frequently saw 10 to 25 percent of my developmental- and freshman-level writers do significantly better when asked to compose in-class with a topic given just before a two-hour writing period.
How can I make sense of this? Of course I immediately considered my grading rubric. Was I somehow more relaxed when grading handwritten essays? Possible. But that could not explain jumps from 75 to 90. Yes, I was somewhat easier on misspelled words when grading handwritten essays. Yes, I may have been swayed by a student's handwriting -- in fact, studies have shown that instructors are often influenced to grade slightly higher or lower depending on a student's handwriting. Still, there had to be something more to explain jumps of more than a full grade level.
Finally I typed up a student's handwritten midterm and compared it to two computer-generated essays. The handwritten midterm was so much smoother -- I was shocked. Transitions abounded. Other than a few run-ons, sentence structure was fluid. One idea followed another. Claims were supported. The writer seemed to have hit a stride that held up for the required three pages. The computer-generated essays were passable. The ideas were sound, but the writing seemed awkward in every sense. Setting aside the possibility that my grading was flawed, several explanations for this jump suggested themselves.
First, the process of writing in class in a timed situation seemed to discourage the kind of overwrought, constipated writing that some students produce with a typed paper. In my courses, I appeal to the high-context student. After wrangling syllabi for seven years, I've concluded that I like giving students the necessary information up front. After the first class, students walk away with a course outline that lists specific due dates for all papers -- along with general topics. Those who are worried about their ability to produce college-level work may start on a paper ahead of time and rewrite up until the due date.
Although my office hours are busier at the end of the semester, I notice an influx of students a week before each paper is due. The good news is that some of these students are producing better work -- their essay structure is sound, their now-approved thesis statements are well supported, and their conclusions don't sound tossed together. The bad news is that some of these well-intentioned students keep working, rethinking, and rewriting their papers. They rehash each sentence, tormenting themselves, revising until they can no longer see what works anymore. Suddenly their original draft has become stiff and mechanical -- and the due date is looming.
These students often equate hours invested with their final grade: every weekend poured into an eight-page study of the topic should, they reason, translate into a 10 percent jump in grade. Unfortunately, it is nearly impossible to infuse light and spontaneity into a paper that has been reworked several times. So the end product is dull and overworked -- and their grade less than what they expected.
In-class writing, on the other hand, is a completely different form of exercise. Instead of dumping hours and hours into a format that already feels old and overdone, students are given a topic at the top of the hour. True, some students choke. They deliver half a paper. What is on the page is poorly thought-out and incoherent. Yet some, relieved of the need to think and rethink the topic, find themselves rising to the challenge. After outlining for 15 minutes, they find themselves churning out coherent paragraphs that stand together as a unified essay. I've never been able to predict which way a student will perform. It is only when I've graded their midterm that I can make observations about which process seems to produce the best written work.
Next, handwriting encourages students to focus on the writing process; for those less experienced with computers, keyboarding encourages them to focus on the end product. When asked to type up a sample paragraph in a classroom computer lab, all 20 of my English composition students spent more than 15 minutes setting up a document in MS Word: setting margins, choosing a font, centering a title, and typing their names, instructor's name, and class name so that they sat flush right. This left a disappointing 30 minutes of actual composing -- and of that, another five to nine minutes were lost as students insisted on particular line breaks, tried to change the spacing between lines, and attempted to remove the forced underlining of URLs.
Students' questions were not about how to approach the topic but about the mechanics of the assignment: how many words they would have to provide, whether they could use grammar- and spell-check, whether the sample was to be single-, one-and-a-half-, or double-spaced, whether one-inch margins were acceptable, and the like. I started to feel like a software instructor instead of an English composition teacher. My frustration was compounded when students either couldn't print out their single paragraphs -- or attempted to e-mail them to me.
Finally, handwriting brings writers closer to their work -- which may encourage excellence in particular students. Daniel Chandler, a scholar at the University of Wales, has done extensive research on how students learn. His article, "The Phenomenology of Writing by Hand," comments on the conditions present when writers write by hand rather than by computer -- and the effect on the end product. In effect, the neurophysiological mechanism of each process is different. And although both handwriting and typing are under the control of the central nervous system, the dynamics are noticeably different.
With substantial practice at the keyboard, I do believe that students can become more "fluent" at writing and produce work as creative as what they produce by hand. In fact, studies often show that students do as well composing on a computer as they do handwriting compositions.
In the end, questions still remain for me. How does the time constraint affect the end product? Do some students simply do better under pressure? Is there something about timed in-class work that encourages a more focused end product? Does typing a work directly somehow encourage a piecemeal approach? If offered an in-class essay exam with computers, would students then do substantially better than those who chose handwriting? How do typing speed and familiarity with software and hardware affect a student's work?
What about the "power of print"? Isn't it true that students often view a typed paper as an "end product," whereas handwritten work feels like a step in a process? And, of course, how exactly can ideas be more "fluid" with a writer's preferred composition method -- whether it be writing by hand or word processing? With research, more will be revealed. Until then, I will give my students the benefit of both methods and continue to offer both in- and out-of-class writing. Those who flourish with additional time will produce more polished work; those who chafe under the weight of long-term deadlines will rise to the occasion on the midterm and final -- and ultimately both groups will find the process that produces their best work. Students who then hone their ability both to handwrite and to word-process may do better in all areas; the resulting degreed professionals may find that both processes serve them well.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
About 10 minutes into last week's now legendary episode of Oprah (the show that made it to the front page of newspapers; the one that left "memoirist" James Frey on the verge of confessing that he possibly made up his own name, but couldn’t be sure), one part of my mind was riveted to the tube while another part wandered off to conduct an intensive seminar about the whole thing, complete with PowerPoint slides containing extensive quotations from Foucault’s late writings on the "technologies of the self."
This happens a lot, actually. What Steve Martin once said about philosophy also applies to cultural theory: "When you study it in college, you learn just enough to screw you up for the rest of your life."
Well, it turns out that a certain amount of my seminar was just a repetition of work already done in the field that we might well call "Oprah studies." It has a substantial literature, including four academic books and numerous journal articles, most of which I have read over the past few days. Some of it is smart and insightful. Some of it consists of banalities gussied up with footnotes. In other words, it's like Shakespeare criticism, only there isn't as much of it.
Though there's plenty, to be sure. I've now spent more time reading the literature than I ever have watching the show. Some of it has been very instructive. There was, for example, a journal article from a few years ago complaining that other scholars had not grasped Oprah's postmodernity because they had failed to draw on Mikhail Bakhtin’s work on dialogism.
What important results follow from applying Bakhtin? Well, the concept of dialogism reveals that on talk shows, people talk to one another.
We may not have realized that before. But we do now. Scholarship is cumulative.
Indeed, by 2003, there were grounds to think that Oprah was not postmodern, but an alternative to postmodernity. So it was revealed when the first book-length study of the daytime diva appeared from Columbia University Press: Oprah Winfrey and the Glamour of Misery: An Essay on Popular Culture, by Eva Illouz, a professor of sociology and anthropology at Hebrew University of Jerusalem.
"Far from confirming Fredric Jameson's view that postmodern culture lacks emotionality or intensity because cultural products are disconnected from the people who produced them," writes Illouz, "Oprah Winfrey suggests that both the meaning and the emotional intensity of her products are closely intertwined with her narrative authority." Her programs, books, movies, magazine, and other cultural commodities all add up to "nothing less than a narrative work [able] to restore the coherence and unity of contemporary life."
For an example of this redemptive process in action, we might turn to the program from six years ago called "Men Whose Hair Is Too Long" -- during which, as Illouz describes it, "Oprah brought to the stage women who told the audience of their desire to have their sons, lovers, brothers, or husbands change a ‘hairy part’ of their body (mustache, hair, beard)." The menfolk are briefly “exposed to the public” and then “taken to a back room” – from which they later emerge with “a change supposed to effect a spectacular transformation.”
Such transformations are part of the Oprah metanarrative, as we might want to call it.
“The ‘hairy parts’ are exposed as a transactional object in a domestic, intimate relationship that is constructed as contentious,” as Illouz explains. “The haircut or moustache shave provides a double change, in the man’s physical appearance and in his intimate relationship with a close other. The show’s pleasure derives from the instantaneous transformation -- physical or psychic -- undergone by the guests and their relationships, which in turn promote closer bonds.”
This all sounds deeply transformative, to be sure. It made me want to go get a haircut.
But something about the whole argument -- Illouz’s reference to Oprah’s “narrative authority”; the framing of makeover as ritual of self-transfiguration; the blurring of the line between intimate relationship and televised spectacle -- is really frustrating to consider.
It is hard not to think of Richard Sennett’s argument in The Fall of Public Man: On the Social Psychology of Capitalism (Knopf, 1977), for example, that we have been on a long, steady march towards “the tyranny of intimacy,” in which every aspect of social conversation gets reduced to the level of the personal. “It is the measurement of society in psychological terms,” as Sennett put it. “And to the extent that this seductive tyranny succeeds, society itself is deformed.”
But no! Such worries are part of an “elite” cultural discourse, according to Sherryl Wilson’s book Oprah, Celebrity, and Formations of Self, published by Palgrave in 2003. A whole raft of theorists (the Frankfurt School, David Riesman in The Lonely Crowd (1950), the arguments about the rise of “psychological man” and “the culture of narcissism” in the writings of Philip Rieff and Christopher Lasch, and so on) have treated mass society as creating an almost inescapable culture of consumerism and privatized experience. The fascination with celebrities is part of this process. Their every quirk and mishap becomes news.
To the “elitist” eye, then, Oprah might look like just another symptom. But according to Wilson (who is a lecturer in media theory at Bournemouth University in the UK) the Oprah phenomenon belongs to an altogether different cultural logic. It is a mistake to regard her program as just another version of therapeutic discourse. It draws, rather, on feminist and African-American understandings of dialogue -- the public sharing of pain, survival, and mutual affirmation -- as a necessary means of transcending the experience of degradation.
The unusually intense relationship between Oprah and her audience would probably have impressed a stodgy old Marxist like Theodor Adorno as evidence of alienation under advanced capitalism. Wilson regards “the apparent closing of the gap between the star self and the personal self” as something quite different.
“Rather than the participants seeking to transcend their ‘ordinariness’ by emulating the persona of a celebrity,” writes Wilson, “it is the ‘ordinary’ and everyday experience of Oprah which works to validate the personal stories recounted by the guests. In other words, those who speak on the show, and who participate through viewing at home, do not position themselves within the aura of a persona anchored in a glamour that for the majority is unattainable; rather, empowerment is located within the realm of everyday life.”
While the star does possess an undeniable charisma, Oprah’s is the glamour of simple decency. “Irrespective of the topic of the day or the treatment through which the topic is handled,” as Wilson puts it, “Oprah’s performance is guaranteed to be inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.”
How that amiable persona then generated certain massive effects in the literary sphere is a matter addressed in the two scholarly volumes devoted to analyzing the Oprah Book Club: Kathleen Rooney's Reading with Oprah and Cecelia Konchar Farr's Reading Oprah.
Each book has a defensive quality; the authors seem to want to defend the book club nearly as much as to analyze it. “From its inception in September 1996,” notes Rooney, “OBC was commandeered as a rallying point around which both cultural commentators and common people positioned themselves in perpetuation of America’s ongoing struggle of highbrow versus lowbrow. Both sides made reductive use of the club to galvanize themselves either as populist champions of literature for the masses or as intellectual defenders of literature from the hands of the incompetent.”
But Rooney contends that a closer look at the club, and at the books themselves, suggests “that there exists a far greater fluidity among the traditional categories of artistic classification than may initially meet the eye; that we needn’t shove every text we encounter into a prefabricated box labeled ‘high,’ ‘low,’ or ‘middle.’”
Farr’s argument in Reading Oprah converges with Rooney’s -- finding in the conversational praxis of the book club something like a down-home version of Barbara Herrnstein Smith’s Contingencies of Value: Alternative Perspectives for Literary Theory (Harvard University Press, 1988).
The book club has embodied “contingent relativism,” writes Farr, “constructed not in the absence of truth, but in the context of many truths, negotiated truths, truths that people arrive at in conversation with others and with their own often contradictory values.” Hence the need to discuss the reading, to embed the books in a conversation. They need to “have a talking life” so that readers can “explore and work their way through the myriad of possible responses.”
Given their interest in giving Oprah’s aesthetic and ethical stances the benefit of the doubt, it is all the more striking when either author admits to feeling some reservations about the program. While doing her research, Farr recalls, she “tuned into a pre-Christmas program” that proved to be “an hour-long consumer frenzy.”
This was an “O List show,” which is evidently a major event among the Oprahites. The celebrity “gives away literally hundreds of dollars worth of free stuff to every guest in her audience,” writes Farr. “Pants, candles, shoes, electronics – you name it. If Oprah likes it, she’s giving it away on this show....I watched open-mouthed, both appalled and envious. Was this incredibly tacky or unbelievably generous? Did I want to run screaming from the room or do my best to get on the next show? Both/and. It was a moment of genuine American ambivalence.”
The protocols of the book club were also grounds for concern, at least for Rooney. “Once the tape started rolling,” she writes, “neither Winfrey nor her readers seemed permitted to remark critically on the selections, or to advance beyond any but the most immature, advertisement-like, unconditionally loving responses to every single novel they encountered.”
What made last week’s program with James Frey so fascinating was the sudden revelation of another side of the Oprah persona. Gone was the branded performance as “inclusive, (generally) nonjudgmental, (often) humorous, and (almost always) empathic.” Her manner had scarcely any trace left of its familiar “I’m OK, you’re OK” spirit.
Oprah was angry, and Frey was some very considerable distance from OK. She was also indignant to discover that the publishing industry makes no real effort to enforce the implicit contract between reader and writer that goes with a book being shelved as nonfiction. This seems terribly naive on her part. But no doubt most of her audience shared her surprise. (“She wants publishers to fact-check their books?” I thought. “Hell, they don’t even edit them.”)
Remarkable as the spectacle was, however, it did not come as a total surprise. Perhaps I will give myself away as an “elitist” here, in the terms that Sherryl Wilson uses in Oprah, Celebrity, and Formations of Self. But at the end of the day, the therapeutic ethos is not antithetical to a deep yearning for authority (a craving then met by the stentorian Dr. Phil, whom scholars have yet to analyze, oddly enough).
Nor is there any deep discontinuity between the conspicuous consumption of an “O List show” and the completely uncritical attitude towards whatever book Oprah has selected for the month. If anything, they seem like two sides of the same coin.
In search of a different perspective on the matter, I contacted Cecelia Konchar Farr – whose book Reading Oprah seems, on the whole, an endorsement of the “individual pluralism” of the show’s ethos. What did she make of l’affaire Frey?
“It seems apparent to me,” Farr told me by e-mail, “that Oprah started out with a viewpoint that most experienced readers would have in this situation, that the facts aren't as important as the more general truthfulness of the story in a novel or memoir. Most readers surely took some of Frey's aggrandizements and exaggerations with a grain of salt from the beginning, while still enjoying the character he was constructing, still enjoying the story, and still finding the book powerful and interesting.....
“My guess is that the righteous indignation we saw on last week's show comes from Oprah representing the less experienced readers who needed Frey's memoir to be true in a journalistic sense. Her chastisement of the publishing industry was the first real exertion of her authority I have seen beyond her selection of books. She's earned that authority, certainly, but it was surprising to see her use it. Still, I believe she used it on behalf of her readers.”
I was, to be honest, dumbfounded by this response. I printed it out, and read it a few times to make sure Farr had actually said what she seemed to be saying.
Her contention seemed to be that Oprah’s audience had become upset from mistakenly reading the book as “true in a journalistic sense” -- which was, somehow, a function of readerly inexperience, not of authorial dishonesty.
And from her account, it appeared that Frey’s memoir contained a "general truthfulness" -- one it would be naive to expect to be manifested at the level of occasional correspondence between the text's claims and ascertainable facts.
So I wrote her back, checking to see if I’d followed her.
“I think theorists and critics, especially, but also seasoned readers, read memoirs without an expectation of ‘correspondence between the text's claims and ascertainable facts,’” she responded. “Memoirists creatively construct characters and situations with a lot of license -- and readers and publishers have tacitly allowed that license. That's not to say Frey didn't take this license to its very limit. His constructions at times lose an even tenuous connection with ascertainable facts. When Frey pushed the limits, he drew intense attention to the slippage this connection has seen in recent years. But he wasn't the first to take such license, nor is he responsible for the larger changing perception of what ‘memoir’ (or ‘creative nonfiction’) means.”
Perhaps those terms now just mean “whatever you can get away with” -- though that seems vaguely insulting to honest writers working in those genres. (There is some difference, after all, between the tricks played by memory and the kind that a con man practices.)
Why the furor over Frey? “I think the vilification he has been subject to in the media is extreme,” writes Farr, “and probably stems from some larger discomfort about dishonesty from sources who are (and ought to be) culturally more responsible to the ‘ascertainable facts.’”
There may be something to that. And yet it raises any number of questions.
The man has made a small fortune off fabricating a life and selling it -- while loudly talking, in the very same book, about the personally transformative power of “the truth.” Oprah Winfrey endorsed it, and (at first anyway) insisted that mere factual details were subordinate to a larger truth... a personal truth... a truth that, it seems, is accountable to nothing and nobody.
Suppose this becomes an acceptable aspect of public life – so that it seems naive to be surprised or angered by it. Then in what sense can we expect there to be institutions that, in Farr’s words, “are (and ought to be) culturally more responsible to the ‘ascertainable facts’”?
Let’s leave that topic for the Oprah scholars to consider. In the meantime, remember that her next selection is Elie Wiesel’s Night, a memoir about surviving the Nazi death camps. It might be an interesting discussion. Especially if the book club takes up the idea that there are forms of truth that, in the final analysis, have exactly nothing to do with self-esteem.
Many now consider the humanities to be facing a relevancy “crisis.” Partly because of the culture wars, the humanities -- if not the whole university -- appear to have lost their reason to be. To choose just one compelling example, Bill Readings argues in The University in Ruins that the primary role of the university is no longer to inculcate national culture, so it now resorts to rhetorically convenient but substantively empty and ideologically suspect vagaries like the term “excellence” to justify its existence. As one result, faculty in English and composition also suffer from what some recent publications are casting as a labor “crisis.”
While the public grows increasingly skeptical of the nature and purposes of liberal arts education, academics generally, and we suspect English scholars particularly, have not been as effective as they could, should, and must be in representing the value of their work, especially teaching. In a colloquial nutshell, public criticism tends to follow some version of this reasoning: English departments aren’t teaching my kids to write and read well enough because they’re too busy trying to turn them into Marxists, feminists, homosexuals, or -- worse -- grad students. Meanwhile, our scholarship is derided as obtuse, cryptic, or absurd. It matters little that such descriptions are inaccurate, unfair, and often advanced in service of narrow-minded ideologies at odds with the democratic underpinnings of a liberal arts education. The fact remains that our work is perceived at turns as irrelevant or threatening, a perception that directly and indirectly contributes to the deplorable state of labor conditions in English.
Because the value of work in English studies is so poorly understood, even among ourselves, negative stereotypes become entrenched in the general cultural psyche in the form of common sense: e.g., literature is boring, difficult to understand, and best left to experts who talk about it in ways that are also boring and difficult to understand. And the value of writing is often reduced to its correctness, which, to many, is valuable only to the extent that it earns, as in earns good grades and jobs. This leads (or likely will lead) to further decreases in the number of English majors (currently about 4 in every 100) and this, in turn, will lead to fewer tenure-track lines and increased stratification of faculty, in the form of part-time and other non-tenurable lines. For example, a 1999 Modern Language Association survey found that only 37 percent of English faculty members were on tenure-track lines. While jobs in composition, tenure-track and otherwise, have proven more available than those in, say, 19th-century American literature, such jobs often consist of administrative positions, or what both critics and reformers are now calling the middle-management class of faculty, wherein one or two tenured faculty are charged with supervising a large and shifting class of part-time faculty.
As faculty continue to stratify, it will become increasingly difficult to represent the purpose, direction, and value of work in English studies beyond the rudiments of business writing and the cultural capital afforded by cocktail party knowledge of Shakespeare or Melville. The vicious cycle can be simplified as follows: A managed and stratified faculty often has difficulty representing itself effectively in the culture wars, which in turn exacerbates the level of stratification, which in turn leads to increasing difficulty with representation. The consequences of poor representation and increased stratification harm all faculty and students in nearly every imaginable category, including infringement on academic freedom, especially in matters of curriculum design and assessment, as well as decreasing job security, inequitable pay scales, little or no benefits, high teaching loads, large class sizes, and pitiful office conditions.
James Piereson, writing in a recent issue of the conservative periodical The Weekly Standard, reflects the views of many non-academics who haven’t been made to care, or to care enough, about our problems and who, in fact, resent academics for our seeming disengagement from their values. He writes: “When this year’s freshmen enter the academic world, they will encounter a bizarre universe in […] institutions that define themselves in terms of left-wing ideology. […] which is both anti-American and anticapitalist.” Piereson approvingly refers to university trustees who (in his words) contend that “if their institutions are to be rescued, they dare not rely on faculties to do it.” Piereson’s variety of culture-war mongering, and his apparent comfort with making outlandish claims on little more than scattershot anecdotal evidence, often lead to equally bombastic and antagonistic counter-statements, and so go the culture wars.
Citing findings from the National Center for Education Statistics, Louis Menand points out in his 2005 contribution to MLA’s Profession that between 1970 and 2001 the number of English majors dropped, roughly, by a third; however, “the system is producing the same number of doctorates in English that it was producing back in 1970. These Ph.D.'s have trouble getting tenure-track jobs because fewer students major in English, and therefore the demand for English literature specialists has declined.” There are many theories about the causes of this discrepancy (e.g., students who would previously have majored in English are now turning to interdisciplinary programs in, say, cultural studies, or students are driven by the rising costs of a college education to specialize in areas, such as computer science, that have a reputation for more immediate financial payoff than does a B.A. in English). Regardless, more and more conversations in English studies seem to be focusing on ways to reinvigorate the work of the field in the 21st century, so as to make it more relevant to the public, especially students.
The various strands of this already vast and quickly growing debate are difficult to summarize and properly attribute in the space that we have. For the moment, suffice it to say that the main idea is that work in the humanities, both critical and imaginative, seems increasingly alien and perhaps irrelevant to the public. It is often said that scholarship in the humanities has become too insular for its own good. One possible solution to the perceived problem of insularity is often described with the phrase “going public.” In 1995, Linda Ray Pratt used it in her contribution to the influential collection Higher Education Under Fire. In 1998, Peter Mortensen used the phrase as the title of his article in the journal College Composition and Communication. More recently, it has been invoked in a Duke University panel on academic publishing, and Harry Boyte made “going public” the focus of his 2005 occasional paper for the Kettering Foundation. If the catch phrase of the late ’90s was “critical thinking,” the phrase for the early years of the 21st century may just be “going public.”
While we believe it is important to go public with academic work in the humanities, this phrase, however catchy, raises more questions than it answers. Go public with what, exactly? And what venues qualify as appropriately public? Further, Louis Menand invites us to consider the possibility that going public may not be as easy or as desirable as it may at first sound: "The last premise academic humanists should be accepting is that the value of their views is measured by the correspondence of those views to common sense and the common culture. Being an intellectual and thinking theoretically are going outside the parameters of a common culture and common sense." (Menand’s emphasis)
This is to say that the duty of academics, be they physicists or humanists, is not to the public but to knowledge, dare we say truth. And the public is not necessarily concerned with either. Menand concludes: "Ignorance has almost become an entitlement. We are living in a country in which liberals would rather move to the right than offend the superstitions of the uneducated. As always, the invitation to academics is to assist in the construction of the intellectual armature of the status quo. This is an invitation we should decline without regrets."
Here, Menand raises some valuable points of caution. In his line of argument, going public may mean caving in, stripping our ideas of nuance, and abandoning precision or critical thinking for the sake of public acceptance. Of course, most of us agree that teachers who passively abide by commonsense notions and status quo values are not acting like responsible academics, and none of us would endorse this behavior. However, as noteworthy as such cautions may be, the distinction between the academic and the public seems overdrawn here. After all, there are nearly 5,000 college campuses in the United States, enrolling more than 14 million students, with enrollments projected to increase through the year 2014. This is to say that the question of “going public” has already, to a very large extent, been settled: academic work is quite thoroughly situated in the public realm, and if the public considers ignorance to be “almost an entitlement,” then we are at least partly to blame for this state of affairs. Gerald Graff goes so far as to claim that the “university is itself popular culture -- what else should we call an institution that serves millions if not an agent of mass popularization? But the university still behaves as if it were unpopular culture, and the anachronistic opposition of academia and journalism continues to provide academics with an ironclad excuse for communicative ineptitude.”
Going public, therefore, is a useful but not entirely adequate phrase, since it does not explain how more public exposure will improve the current state of the humanities or the public’s view of the work done within them. We would thus like to focus on improving the work that is, far and away, the most public and the most popular -- that is to say, our teaching. It will be necessary for educators in English studies to make the case for the work of English studies. Increased and accessible public discourse about teaching literature and writing may be a first step, but one that would require more questioning of what we mean by teaching, to whom it is valuable, and why. Rather than (re)fighting the culture wars with those like James Piereson, or resisting the public face of academic work, we might practice our discourse theories with the public, rather than merely attempt to report on them, even in jargon-free language. This assumes a dialogue that transforms not only the content of the humanities but also the participants in the conversation -- especially teachers and their students.
Taking up this point in his recent book, English Composition as a Happening, Geoffrey Sirc bemoans the dulling influence of academic routine, which has led many of us to (re)produce the sort of polemical prose and responses which have, thus far, not proven particularly effective tactics in the culture wars. Instead Sirc urges us, as educators and scholars, to define teaching and writing in ways that articulate the value of innovation and imaginative thinking. And we would like to see Sirc’s suggestion enacted both internally and externally, that is in forums such as this one and in public venues such as newspapers, periodicals, and community meetings, in short, any of a variety of venues that serve to establish dialogue among academics, students, administrators, parents, media members, and legislators. The better we are able to do this, the better we will be able to supplant negative and inaccurate representations of our work.
While critics such as Sirc and Menand are clearly influential here, we understand this task to be of particular importance to graduate students, not least of all because the future of work in the humanities is quite literally in our hands. Should we continue the tradition of predominantly insular and/or antagonistic discourse, our degree of leverage and relevance with the public will continue to decrease, as will our prospects for tenure-line work. It is incumbent upon us to open the lines of communication and to make known the good work that is already being done in our classrooms.
Scholarship on this issue is already underway. For example, at the 2005 MLA conference, Michael Bérubé and Cary Nelson spoke to issues of contingent labor, while others, such as Peter Mortensen and David Shumway, attended to matters of representation. We regard these two issues as linked; that is, the better we understand and represent our work (especially teaching), the better the chance that our working conditions will improve. To that end, we conclude with the following proposals, which draw from and build on the work of these and other scholars:
1. Cultivate existing trends toward interdisciplinarity, such as linked or clustered courses, in ways that effectively demonstrate the value of English studies, particularly in terms of accomplished reading and writing.
2. Realize that the Ph.D., as a credential for teaching, requires civic responsibility and ethical action. The better we collectively attend to this fact and make this work known, the better we will be able to build a platform from which to argue for improved working conditions.
3. Accept and embrace the possibility of working through cultural debates in ways and venues that are accessible to the general public. This is not to suggest necessary agreement with the public, but to encourage a variety of discourse that holds the public in vital partnership.
4. Encourage hiring, promotion, and tenure committees to value the above efforts or else they simply will not happen, or at least not to the extent that they should. In other words, in order to improve the representation of our work, it will be necessary to appeal effectively not only to the public but also to our senior colleagues.
Frank P. Gaughan and Peter H. Khost
Peter H. Khost is a lecturer in writing and rhetoric at the State University of New York at Stony Brook. Frank P. Gaughan is an instructor in English and first-year writing at Hofstra University. Frank and Peter are both doctoral candidates in English at the Graduate Center of the City University of New York. This article is adapted from a talk they gave at the annual meeting of the Modern Language Association.
On December 20, 2005, U.S. District Judge John Jones ordered the Dover Area School Board in Pennsylvania to put science back in its place -- protected from intelligent design and other religious ideas. In Kansas, where we have had no such luck, I participated last semester in a new interdisciplinary college course also designed to put science in its place -- separate not only from religion, but from the humanities in general.
DAS 333 (the numerological implications are coincidental but amusing) -- Human Life and the Universe -- was the work of faculty in physics, geology, the life sciences, philosophy, and English, affiliated with the new Center for the Understanding of Origins at Kansas State University. The course was explicitly developed in the context of the evolution controversy to educate students about the fundamental constitution of science as a discipline. As the sole representative of the non-sciences in the course (the philosopher was a hard-nosed philosopher of science, no fuzzy humanist), I did not expect my contribution to come off as particularly consequential; I was merely there as a reminder of the Other to science, providing a sketch of non-scientific disciplinary thinking. All the action would take place in science's bailiwick. But, by the end of the course, I realized that I had not anticipated the dramatic though inchoate demand for what science cannot deliver. In a twist on C.P. Snow's classic criticism of the emerging chasm between science and the humanities, DAS 333 demonstrated both the necessity of their distinction and the urgent need for both.
A rigorous and reflexive approach to science education is the best way to manage the evolution controversy. The recent Fordham Institute report, "The State of State Science Standards," indirectly but forcefully underscores the wisdom of such an approach; Paul Gross et al., in the introduction to the report, conclude that the state of science in public schools does not so much reflect the impact of religiously driven anti-science or intelligent design as demonstrate a correlation between the weak handling of evolution and a general weakness in disciplinary content for science across the board. It follows that the most powerful redress to the resurgence of creationism is a strengthening of disciplinary content in the sciences. Anticipating this connection, DAS 333 provided serious though introductory college-level academic content in its science disciplines. Students calculated luminosities, grappled with the data responsible for the emergence of plate tectonics, and managed some of the microbiology involved in gene suppression. The philosopher of science then used this science content as a source of examples for demonstrating the interaction among theories, their auxiliary hypotheses, and observations, both clarifying the boundary between science and non-science and making the definition of a scientific "theory" clear -- and distinct from mere "opinion."
However, it became increasingly evident to students that the constraints on science, while enabling progress in understanding nature, are disabling in other areas. Science cannot, for example, pronounce on the truth or falsehood of propositions like intelligent design. Essentially, the products of science are predictions of new observations consistent with the explanation of existing data. The product of science is not meaning. This set up my work: science, prediction; literature, meaning. I tried to counterpoint the science units with topical fictions -- for example, H.G. Wells' "The Star," Burroughs' The Land That Time Forgot, and Crichton's Jurassic Park -- in order to demonstrate how the use of language, including figures of speech and fabricated scenarios, elicits feelings and desires to construct meaning, in contrast to what scientific accounts do in response to the same world. This was, more or less, fine. But it was not enough. I discovered at the end that these forays into the literary formation of meaning sidestepped the real force of the humanities in a course like this.
It was during a final class on the implications of the limits of science in the Terri Schiavo case that the greatest challenge to the non-sciences emerged. All the students saw the controversy over Schiavo's care as a dispute in which the person whose wishes should have been paramount, Schiavo herself, had no reliable input. They had no idea how we might achieve greater consensus and resolution on this life-issue as a society. Nor do most of us. The problem with that case, for college students and the general public alike, is that, like the issue of abortion, it requires navigating waters murky with emergent technologies, religious tenets, strong feelings, and massive distrust.
Yet this is precisely the miasma into which students must plunge as citizens and decision-makers, and it is for these eventualities that they need better and more intelligent preparation. The humanities cannot be content with just developing and promoting the ethical imagination for private use; they must also do much more to connect minds so enriched (all minds, not just those of future lawyers and bioethicists) to the complex situations that demand such resources be put into action. The humanities must do more to offset the temptation toward either authoritarian or excessively personal answers to the complex problems that science often creates but whose solution is beyond its reach.
One of the most revealing features of the evolution controversy, as of so many of the controversies that, properly or improperly, connect religion to public life, is the degree to which it subtly relegates non-science to the personal and the private, to a space beyond the pale of public education. And so it is with the text of the now-infamous statement that Dover Area School District science teachers were required to read to students in ninth-grade biology. The statement portrays the difference between science and non-science as a distinction between publicly acknowledged fact and private opinion. Essentially, the creationist strategy is to eliminate the teaching of evolution by privatizing it. "Because Darwin's Theory is a theory," the statement reads, students should be encouraged to explore other views -- specifically, "Intelligent Design ... an explanation of the origin of life that differs from Darwin's view."
As a result of the mere theory-status of Darwinism, "students are encouraged to keep an open mind. The school leaves discussion of the Origins of Life to individual students and their families." In other words, given that evolution, as theory, does not have the form of other scientific propositions constituting or based on laws (gravitation, thermodynamics, and so forth), determination of the origin of species is left to the individual and family, that is, to those realms beyond the institution and beyond the state and its laws. In essence, according to this line of reasoning, scientific "fact" is law, in contrast to mere theory. In the absence of scientific law, the formulation of beliefs about the world is essentially beyond education. Ironically, perhaps, it is now against the law in Dover, Pennsylvania, to exempt evolution from the authority of science that the anti-evolution forces narrowly equate with scientific law.
In any case, for creationists and their teach-the-controversy fellow travelers, "theory" on the one hand and "authority," "fact," and "law" on the other exist at opposite ends of the spectrum. Furthermore, where authority, fact, and law end is precisely where "each individual must decide for him or herself" begins: there is no controversy, no public arena in which private individuals, informed by science, by traditions religious and secular, by rational ethics and embodied sentiment and compassion, can use their wisdom and knowledge to work out a course of action. Indeed, even the possibility of such deliberation appears to be an alien concept to our students -- as it is to the Dover Area School Board. It was clear to Judge Jones, moreover, that the appeal to individual opinion on the part of the Dover Area School Board was a thinly disguised effort to support religious authoritarianism; he opined that by reminding schoolchildren that they can maintain beliefs taught by their parents, critical thinking is stifled, not promoted.
While it is conceivable, even likely, that science will succeed in reasserting methodological naturalism as its fundamental feature, and so reduce the inroads of creationism into science curricula to occasional nuisances easily parried by existing institutions, this is not enough to restore the integrity of public education generally. Just as the weakness of science education accompanies the avoidance of evolution, the weakness of humanities education emerges as an avoidance of a host of issues that polarize American social and political life. It is hard to say which is the greater omission.
Linda Brigham is head of the English department and a member of the Center for the Understanding of Origins at Kansas State University.
Paula M. Krebs has been a professor of English at Wheaton College, a selective New England liberal arts college, for 15 years. Her sister Mary Krebs Flaherty teaches writing as an adjunct at the inner-city campus of Camden County College, a two-year institution. They are writing a series of articles about what it's like to teach English at their respective institutions.
Paula: I'm trying not to be annoyed at my students who have e-mailed me that they won't be in class today and tomorrow because their flights back to school were cancelled due to the snow. What business, I wonder, do they have flying out of town three weeks into the semester? And this snowstorm was predicted all week -- they knew there was a chance they'd not get back for classes. Then I remind myself that one said she'd left for a "family emergency" and another because his sister had just given birth. They have a right to set their own priorities -- it's up to me how to handle those decisions in terms of grading.
Mary: Very few of my students have a computer at home, let alone Internet access, so they can't e-mail me about problems that come up, such as not being able to attend class due to a snowstorm. None of my women students with children attended class during the snowstorm -- not because they couldn't make the commute, but because they didn't have babysitters and the elementary schools in Camden were closed. The priority for these students is exactly that -- their children first, class second. I am acutely aware of the time restrictions that my students face in their personal lives. Most, if not all, have part-time or full-time jobs, and as I said before, many of my female students have parenting duties when they get home. I find that I have to make homework assignment decisions based on what I think they can actually accomplish without overwhelming them.
Paula: Mine would love it if I took into account their part-time jobs and other obligations when I assigned homework, but I can’t do that. This is a residential college (more than 90 percent of our students live on campus), and I operate on the assumption that taking classes is their full-time job. So I assume that they’ll spend at least three hours outside of class for every hour they spend in class, and I assign reading and writing accordingly. They grumble, but most of them do it.
Mary: I would love half of that time commitment from my students! Instead, I have accepted doctors' notes for prenatal care appointments and family court documents from students who wanted "excused" absences from class. If students want to see me before or after class for additional help, I feel that I have to be generous with my schedule to accommodate them, given that, as an adjunct, I have no office or office hours. Since most students have part-time jobs and several even work full time, they have to balance outside work, family obligations, and homework. I admire their tenacity, but I also have to make sure that they are doing a fair amount of school work outside the classroom. This is especially difficult because for many of them their only access to a computer is on campus, and they have to alter work and family schedules to type their papers. To add to their schedules, I encourage them to participate in a campus book club called Mental Elevations, which is one of only three school-sponsored clubs on the Camden campus.
Paula: On my campus, most students have part-time jobs, but many also participate in activities on campus -- theater, singing groups, clubs, and, of course, sports. Scheduling events outside of class is always problematic. We have to work around rehearsals, practices, and working hours. I have never had a student with childcare responsibilities. For me the biggest problem is to make sure they see the relevance to their future careers of what I’m asking them to do. The value of a liberal arts education is clear to the faculty, but it isn’t necessarily self-evident to a 19-year-old how reading Elizabeth Gaskell will help in the world of high finance or state government or retail management.
Mary: It’s much easier for me to make clear to my students that effective writing carries over into their other academic courses as well as future careers. We read paragraphs and essays in different rhetorical patterns that directly correlate to specific career choices. We recently worked on the process essay (“how-to”), and I told them to think about being a human resource manager who had to write a training manual. Before that, we went over the narration paragraph, which corresponded with a nurse’s record of a patient. For me, translating the usefulness of effective writing is relatively easy -- getting the students to believe that writing is a skill that they can learn is the difficult part. They bring a "one and done" attitude into the class, and I need to help them come to think about writing as a process. By following certain steps, they can learn to be effective writers.
Paula: Your students must have pretty clear career goals or aspirations that bring them to a community college at a nontraditional age.
Mary: My class dynamic is definitely interesting because I do have some students directly out of high school (with children of their own), as well as a number of returning students who have now realized that, say, having a CNA (Certified Nursing Assistant) certificate is not as valuable or rewarding as an RN degree. In either case, the beginning students in Basic Skills classes seem to share a purely practical outlook: college equals money and better opportunities in their potential careers.
Paula: I think liberal arts colleges like mine want to have it both ways, really. The students and their parents are investing huge amounts of money in this bachelor’s degree, so they want a return on that investment in the form of a job. At the same time, they have chosen a liberal arts college and not a community college or a state college or university, so they also have a sense that they want an education that is more training in critical thinking, writing, and arts and sciences than it is job training or vocation-oriented, as in engineering or business school. So in our courses we treat knowledge and inquiry as valuable in and of themselves, but outside of class we stress internships, networking, and job and graduate school placement.
Mary: I find myself facing this exact duality in my roles as graduate student and teacher. There is a huge gap between critically discussing 19th-century novels like Bleak House at night with fellow graduate students and turning around to teach the concept of concrete supporting details in a basic skills class the next morning. What makes this even harder is that in between teaching and being a grad student is working 40 hours a week at a job that doesn't have any relevance to my academic life. But it's the job that pays the bills and allows for my education, so it has first priority. Maybe this is why I have so much empathy for my students....
Paula M. Krebs and Mary Krebs Flaherty
The previous column by Paula M. Krebs and Mary Krebs Flaherty explored grading and other measures of academic performance.
At December’s meeting of the Modern Language Association, the Committee on Information Technology sponsored two special sessions on electronic scholarship and publication in literary studies. Rather than the familiar panels consisting of three 15- to 20-minute papers, a pitcher of water, and a brief Q&A (time permitting), these meetings were structured as “electronic poster sessions.” Multiple presenters, stationed around the perimeter of the room in front of easel displays and laptop computers, demonstrated their projects and spoke with anyone who stopped by with questions.
Conversational and (literally) interactive, these presentations sometimes lasted a few minutes and sometimes more, depending on the size of the group standing at the station and the nature of the questions asked. In the room as a whole, the general hum and the constant movement of people from station to station produced a very different atmosphere from what might be expected in the usual MLA session. It was noisier, more chaotic and informal, more the result of many localized one-to-one encounters. All of this may be new at the MLA, with its culture of paper-readings as public performances, but poster sessions in general have been around for a long time in other disciplines, and sessions like these have long been the norm at science and technology conferences, and even, for example, at the annual meetings of the interdisciplinary Association for Computing in the Humanities. Instead of paper (or foamcore) posters containing labeled graphs, charts, or other visualizations, technology poster sessions usually involve a computer screen. More important, they actually provide for live, hands-on demonstrations of the tools and resources being presented.
This was the second year the MLA has included this kind of technology poster session in its schedule, and this year participation markedly increased. Co-organizer Michael Groden of the University of Western Ontario speculated that simply changing the name in the program from “poster sessions” (which may have puzzled some MLA members) to “digital demonstrations” probably helped. This year’s sessions saw a constant stream of people moving into the room, making their way slowly around the various stations, and then leaving, to be replaced by others. Outside the room, a sign containing graphical “thumbnail” images of the posters displayed inside -- a kind of poster of posters -- attracted curious passersby, who wandered in to see what was being demonstrated. Inside, almost every station sustained a rotating cluster of two or three, or six or seven, people, sometimes more, watching demonstrations, trying out resources on the computer, or talking to the presenters. At some stations two presenters worked side by side, in order to divide their attention between two different groups of questioners.
The advent of this new format, along with the increased interest at the sessions, coincides with increased attention by the MLA to the problem of how to acknowledge new forms of research in general -- and digital projects in particular. The crisis in book and journal publishing has been on the agenda for several years now, but this year a special panel was convened by the MLA to discuss strategies for changing expectations for tenure reviews and to encourage the profession to take account of “multiple pathways” to demonstrating research excellence, other than the traditional letterpress monograph, among them forms of scholarship produced and published online. These new forms include peer-reviewed online journals, which were the focus of the first poster session, featuring my own Web site’s Romantic Circles Praxis Series, as well as The Writing Instructor and Language Learning & Technology, among other projects.
Though peer reviewed in the traditional way, carefully edited, and indexed by the MLA Bibliography under its unique ISSN, our Praxis Series is also able to take advantage of the flexibilities of scale, inventive genres of scholarship, and more malleable production schedules made possible by online publication. Multimedia essays, illustrated or even incorporating audio recordings, are among the contributions to its 28 volumes to date. And of course they are searchable, both internally and via Google, and in the future may be linked to growing clusters of interoperable digital scholarship.
Besides online electronic journals, which extend traditional forms of scholarship, alternative “pathways” to scholarly research (and publication of the results of that research) will undoubtedly also lead through the kinds of archival, editorial, and analytical work represented in projects at the second poster session, “New Technologies of Literary Investigation: Digital Demonstrations.” These included the William Blake Archive (recently awarded the MLA’s Prize for a Distinguished Scholarly Edition and granted the approval of the MLA’s Committee on Scholarly Editions, both firsts for a digital edition), the Stolen Time Archive, the Mark Twain Digital Project, text-analysis tools such as TAPoR and Tamarind (the latter of which aims to process large collections of XML documents for text-mining and visualization by the larger NORA project), and research interface tools such as the Litgloss Collaboratory for collecting and sharing annotated foreign-language texts, or Turning the Pages (already in use at the British Library and elsewhere as an interface that allows for virtual, animated page-turning, magnification, and other digital manipulations).
These new forms of scholarship call for new forms of presentation beyond the traditional paper-reading panel. Though C.P. Snow surely exaggerated the divide between the two cultures (literary intellectuals are no longer, if they ever were, “natural Luddites”), these unfamiliar forms of presentation, associated for many with the sciences, may require a slight adjustment in expectations and conventional roles on the part of some MLA conventioneers. As the morning poster session was preparing to get underway and many of us were still booting laptops, arranging our tables, or setting out handouts, a group of several participants entered the hall carrying book bags and sat down in the handful of chairs left at the back of the room still arranged in rows. Facing the “front” of the room, they waited patiently for the panel to begin -- until it gradually dawned on them that there was no panel and no real front to the room, and that they were supposed to stand and circulate around the exhibits on their own.
Those attendees serve to remind us that the inertia of “conventional” culture must be overcome if these new forms of presentation are really to become accepted at MLA meetings. They also stand as a reminder of all the potential users out there who may be waiting passively for online scholarship to begin to speak to them where they “sit,” who may not yet know how to seek out and actively engage digital resources. A change in the culture of the discipline is needed if the new tools and resources are to have their desired impact among scholars. And some of those potential users still require basic education about digital scholarship, how to use it -- and how it might change what they do and how they think about their subject matter.
Kari Kraus of the William Blake Archive told me that “about 70 percent of the people were asking very basic questions about the material,” and that she found herself introducing the archive more often than she had expected. “But that’s fine,” she added, “I love talking about it.” At that point in our conversation we were interrupted by someone approaching the station: “Hi,” said Kraus, “Are you familiar with the William Blake Archive?” She was eventually able to demonstrate the visually striking (paper) print standing on her easel -- a pair of landscape-format plates from Blake’s "The Song of Los," digitally reconstructed by Blake scholar (and one of the archive’s editors) Joseph Viscomi from digital facsimiles downloaded from the archive. Using materials freely available at the archive and opening them in Photoshop on his own desktop, Viscomi was able to make and print out a new facsimile of Blake’s etched plates, reunified in order to demonstrate Blake’s own “virtual designs” according to his original intentions. This is only one example (though a particularly vivid one) of the kinds of individual acts of do-it-yourself scholarship now possible as a result of the many years of collaborative editorial work done to build and edit the massive image and text archive.
Despite the lack of live Internet connections (the hotel’s rates were exorbitant), and despite the newness of the format for some MLA members, most participants -- presenters and questioners -- I spoke with responded very positively to the sessions. In fact I learned that the line between presenter and questioner was not always perfectly clear. There was no dais, no microphone, no chairs in rows -- except for those few left in the morning session that confused the early attendees. Sometimes a passerby turned out to be working in a similar field, or was trying to get support for a new digital project at his or her own institution, or was a would-be contributor to an electronic journal on display, or even a long-distance collaborator at several removes -- such is the nature of the networked world of digital scholarship. Some people making their way around the poster-session rooms were conducting business, exchanging cards or URLs, offering informal proposals to journal editors, and others were engaged in detailed technical discussions about problems of textual encoding or user interface.
During the second session it suddenly dawned on me where I had seen this general kind of activity before: not just at humanities computing conferences, but (of course!) at the MLA’s own legendary and massive “poster session” -- the publishers’ book exhibit hall. At a moment when the MLA was meeting to discuss ways to encourage “multiple pathways” to demonstrating excellence in research, including a “commitment to treating electronic work with the same respect accorded to work published in print,” and when many in the profession were seeking alternatives to the monographs produced in ever-diminishing numbers by university presses, the center of gravity in this regard may have shifted ever so slightly away from the cavernous exhibit hall, with its familiar displays, and toward these buzzing, teeming poster sessions, with their multiple demonstrations of a variety of new digital research projects.
Steven E. Jones
Steven E. Jones is professor of English at Loyola University Chicago and is co-creator and co-editor of the Romantic Circles Web site.
At my university, I chair a faculty committee charged with reviewing and revising our general education curriculum. Over the past two and a half years, we have examined programs at similar colleges and studied best practices nationwide. In response, we have begun to propose a new curriculum that responds to some of the weaknesses in our current program (few shared courses and little curricular oversight), and adds what we believe will be some new strengths (first-year seminars and a junior-level multidisciplinary seminar).
In addition, we are proposing that we dispense with our standard second course in research writing, revise our English 101 into an introduction to academic writing, and institute a writing-across-the-curriculum program. Our intention is to infuse the general education curriculum with additional writing practice and to prompt departments to take more responsibility for teaching the conventions of research and writing in their disciplines. As you might imagine, this change has fostered quite a bit of anxiety (and in some cases, outright outrage) on the part of a few colleagues who believe that if we drop a course in writing, we have dodged our duty to ensure that all students can write clearly and correctly. They claim that their students don’t know how to write as it is, and our proposal will only make matters worse.
I believe most faculty think that when they find an error in grammar or logic or format, it is because their students don’t know “how” to write. When I find significant errors in student writing, I chalk it up to one of three reasons: they don’t care, they don’t know, or they didn’t see it. And I believe that the first and last are the most frequent causes of error. In other words, when push comes to shove, I’ve found that most students really do know how to write -- that is, if we can help them learn to value and care about what they are writing and then help them manage the time they need to compose effectively.
Still, I sympathize with my colleagues who are frustrated with the quality of writing they encounter. I have been teaching first-year writing for many years, and I have directed rhetoric and composition programs at two universities. During this time, I have had many students who demonstrate passive-aggressive behavior when it comes to completing writing projects. The least they can get away with or the later they can turn it in, the better. I have also had students with little interest in writing because they have had no personally satisfying experiences with writing in high school. Then there are those students who fail to give themselves enough time to handle the complex process of planning, drafting, revising, and editing their work.
But let’s not just blame the students. Most college professors would rather complain about poor writing than simply refuse to accept it. Therefore, students rarely experience any significant penalties for their bad behaviors in writing. They may get a low mark on an assignment, but it would be a rare event indeed if a student failed a course for an inadequate writing performance. Just imagine the line at the dean’s door!
This leads me to my modest proposal. First, let me draw a quick analogy between driving and writing. Most drivers are good drivers because the rules of the road are public and shared, they are consistently enforced, and the consequences of bad driving are clear. I believe most students would become better writers if the rules of writing were public and shared, they were consistently enforced, and the consequences of bad writing were made clear.
Therefore, I propose that all institutions of higher learning adopt the following policy. All faculty members are hereby authorized to challenge their students’ writing proficiency. Students who fail to demonstrate the generally accepted minimum standards of proficiency in writing may be issued a “writing ticket” by their instructors. Writing tickets become part of students’ institutional “writing records.” Students may have tickets removed from their writing records by completing requirements identified by their instructors. These requirements may include substantially revising the paper, attending a writing workshop, taking a writing proficiency examination, or registering for a developmental writing course. Students who fail to have tickets removed from their records will receive additional penalties, such as a failing grade for the course, academic probation, or the inability to register for classes.
What would the consequences of such a policy be? First of all, it would mean that we would have to take writing-across-the-curriculum more seriously than most of us do now. We would have to institute placement and assessment procedures to ensure that students receive effective introductory instruction and can demonstrate proficiency in writing at an appropriate level before moving forward.
Professors would also be required to get together, talk seriously and openly, and come to agreements about what they think are “generally accepted minimum standards of proficiency in writing” at various levels, in each discipline, and across the board. We would be required to develop more consistent ways of assigning, responding to, and evaluating writing. We would also have to join with our colleagues in academic support services to recruit, hire, and train effective tutors.
And we would have to issue tickets. Lots of them. But not so many after a while, once students learn the consequences of going too fast, too slow, or in the wrong direction, stopping in the wrong place or failing to stop altogether, forgetting to signal when making a turn, or just ending up in a wreck. Then there is that increasing problem of students who take someone else’s car for a joy ride.
Here’s your badge.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
The other day, I received an e-mail from a colleague who teaches part-time at my university. She read an earlier piece I had written for Inside Higher Ed on why I thought students wrote poorly in college, and she wanted to talk to me about strategies for improving the quality of her students’ writing. She had just completed grading their final papers for the term, and she was frustrated with the number of grammar and citation errors.
During the week after grades were due, we met in my office, and she asked if I encountered the same kinds of mistakes. She also wondered what students were actually learning in our two-semester sequence of required writing courses. Were her expectations unreasonable? Should she assume students should be able to write correctly and cite secondary sources? As a member of the English and foreign languages department and past director of the writing program, I assured her that her expectations were not unreasonable and that students who had taken research writing at our school had received a general introduction to managing sources.
Then she shared with me her syllabus, which contained a one-paragraph description of one of her writing assignments. My experience tells me that one of the main problems students have with successfully completing writing projects is the design of the assignments. I’ve found assignments left in the copier by colleagues, and I’ve cringed at the unnecessary complexity of the tasks described or the insufficient explanations of what must be accomplished by the student.
Many assignments, like the one contained in her paragraph, jumble what we want students to do and what we want students to present. In other words, many assignments I’ve seen fail to distinguish clearly between the kind of thinking students need to perform and the kind of communication students need to present. So instead of adequately providing students the information they need to succeed, faculty often distribute a sloppily designed task that is cognitively difficult, if not impossible, for students to sort out. Here’s an example of what I mean:
Describe your agreement or disagreement to the statement below. I would also expect you to include at least 3 references from the course readings. Your response should be in the form of a clearly written and logically organized paper of no fewer than 1500 words. No works cited page is necessary for this assignment, but use MLA format for citations. If you wish to show me an early draft, send it to me by e-mail no later than 2 days before paper due date. Also use no smaller than 12 point font and be sure to proof for grammar and spellcheck. As I explained in class, underline your thesis statement in your introductory paragraph, and try to come up with an original title for this paper as well.
Garbage In, Garbage Out. And then come the many complaints that students don’t know how to write.
I don’t mean to place all of the blame on faculty -- though some serious reflection on our culpability in these matters would certainly help. However, I did say to my colleague that students often fail to understand the complexity and time-consuming nature of writing, and instead of just demanding writing projects and assuming students come to us primed and ready to fire away, we need to help them manage their writing projects by providing carefully constructed assignments and a few opportunities to practice writing as a process over the course of the term.
Helping students practice writing as a process has long been taught as a solution to poorly composed papers, yet I don’t think it’s promoted much across the disciplines. But I also told her that there are cultural dimensions to this problem as well. I believe most students equate writing with transcription because the texts they most often encounter are the perfectly polished written products found in books, newspapers, and magazines. Since the hard work of composing those texts is hidden from readers, they believe that good writers think up what they want to say and then copy down their fully-formed thoughts onto the page. Thus, many students think they can’t begin to write until they have decided what they want to say. This, of course, is no news to composition theorists and teachers of rhetoric. But an alternative approach is rarely presented to students.
I did pitch my colleague some strategies for designing assignments and for providing models of what she expected, and I wish her the very best as she rethinks how to best support her students’ writing. Still, we have a cultural battle to fight. So here is another pitch: a new reality TV series called “The American Writer.”
Since contest shows on television have always generated enormous fascination and appeal in our culture, I would like to pitch a basic cable series (A&E, are you listening? PBS? Bravo? Hey, Oprah!) that follows a select group of college students, faculty, and authors as they meet together for a month at a writers’ retreat. The students will have been selected by a jury of college professors and professional writers based upon three writing samples: a short poem, a personal narrative essay, and an opinion piece. The faculty members will be selected from a variety of academic disciplines, and the authors will be selected based upon their abilities to write in more than one genre. At the end of the program, students will be judged on the quality of three new pieces of writing composed at the retreat, and the winner will receive a very generous cash prize.
The series will provide background about each of the students, faculty members, and authors, emphasizing their writing histories, as well as their favorite kinds of reading. The series will also follow these participants as they come to the retreat, reflect upon their selection to participate in the contest, share meals, attend workshops and tutorials, and describe their perceptions of the other participants. But the primary focus of the program will be on the participants’ descriptions of how they go about the act of writing. We will see them planning, drafting, revising, and editing works in progress. And we will sit in on writing workshops and individual tutoring sessions.
This is the basic pitch. Interested agents and producers should contact me for a more developed treatment. (Then there are the spin-offs: “The American Artist” and “The American Actor.”) But more to the point, my proposal is intended to introduce into our most popular cultural medium powerful knowledge all college students should have: an inside view of what really happens when writers struggle with the inescapable difficulties of communicating their ideas and emotions and stories and values through words on the page.
Maybe professors will learn a thing or two along the way as well.
Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.
A warning: This week’s column will be miscellaneous, not to say meandering. It updates earlier stories on Wikipedia, Upton Sinclair, and the Henry Louis Gates method of barbershop peer-review. It also provides a tip on where to score some bootleg Derrida.
Next week, I’ll recap some of my talk from the session on “Publicity in the Digital Age” at the annual conference of the Association of American University Presses, covered here last week. The audience consisted of publicists and other university-press staff members. But some of the points covered might be of interest to readers and writers of academic books, as well as those who publish them.
For now, though, time to link up some loose ends....
One blogger noted that the comments following my column on Wikipedia were rather less vituperative than usual. Agreed -- and an encouraging sign, I think. The problems with open-source encyclopedism are real enough. Yet so are the opportunities it creates for collaborative and public-spirited activity. It could be a matter of time before debate over Wikipedia turns into the usual indulgence in primal-scream therapy we call "the culture wars." But for now, anyway, there’s a bit of communicative rationality taking place. (The Wikipedia entry on "communicative rationality" is pretty impressive, by the way.)
A few days after that column appeared, The New York Times ran a front-page article on Wikipedia. The reporter quoted one Wikipedian’s comment that, at first, “everything is edited mercilessly by idiots who do stupid and weird things to it.” Over time, though, each entry improves. The laissez faire attitude towards editing is slowly giving way to quality control. The Times noted that administrators are taking steps to reduce the amount of “drive-by nonsense.”
The summer issue of the Journal of American History includes a thorough and judicious paper on Wikipedia by Roy Rosenzweig, a professor of history and new media at George Mason University. Should professional historians join amateurs in contributing to Wikipedia? “My own tentative answer,” he writes, “is yes.”
Rosenzweig qualifies that judgment with all the necessary caveats. But overall, he finds that the benefits outweigh the irritations. “If Wikipedia is becoming the family encyclopedia for the twenty-first century,” he says, “historians probably have a professional obligation to make it as good as possible. And if every member of the Organization of American Historians devoted just one day to improving the entries in her or his areas of expertise, it would not only significantly raise the quality of Wikipedia, it would also enhance popular historical literacy.”
The article should be interesting and useful to scholars in other fields. It is now available online here.
This year marks the centennial of Upton Sinclair’s classic muckraking novel, The Jungle, or rather, of its appearance in book form, since it first ran as a serial in 1905. In April of last year, I interviewed Christopher Phelps, the editor of a new edition of the novel, for this column.
Most of Sinclair’s other writings have fallen by the wayside. Yet he is making a sort of comeback. Paul Thomas Anderson, the director of Boogie Nights and Magnolia, is adapting Sinclair’s novel Oil! -- for the screen; it should appear next year under the title There Will Be Blood. (Like The Jungle, the later novel from 1927 was a tale of corruption and radicalism, this time set in the petroleum industry.) And Al Gore has lately put one of Sinclair's pithier remarks into wide circulation in his new film: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”
That sentiment seems appropriate as a comment on a recent miniature controversy over The Jungle. As mentioned here one year ago, a small publisher called See Sharp Press claims that the standard edition of Sinclair’s text is actually a censored version and a travesty of the author’s radical intentions. See Sharp offers what it calls an “unexpurgated” edition of the book -- the version that “Sinclair very badly wanted to be the standard edition,” as the catalog text puts it.
An article by Phelps appearing this week on the History News Network Web site takes a careful look at the available evidence about the book’s publishing history and Sinclair’s own decisions, and it debunks the See Sharp claims beyond a reasonable doubt.
In short, Sinclair had many opportunities to reprint the serialized version of his text, which he trimmed in preparing it for book form. He never did so. He fully endorsed the version now in common use, and made no effort to reprint the "unexpurgated" text as it first appeared in the pages of a newspaper.
It is not difficult to see why. Perhaps the most telling statement on this matter comes from Anthony Arthur, a professor of English at California State University at Northridge, whose biography Radical Innocent: Upton Sinclair has just been published by Random House. While Arthur cites the “unexpurgated” edition in his notes, he doesn’t comment on the claims for its definitive status. But he does characterize the serialized version of the novel as “essentially a rough draft of the version that readers know today, 30,000 words longer and showing the haste with which it was written.”
A representative of See Sharp has accused me of lying about the merits of the so-called unexpurgated edition. Indeed, it appears that I am part of the conspiracy against it. (This is very exciting to learn.) And yet -- restraining my instinct for villainy, just for a second -- let me also point you to a statement at the See Sharp website explaining why the version of The Jungle that Sinclair himself published is a cruel violation of his own intentions.
Memo to the academy: Why isn’t there a variorum edition of The Jungle? There was a time when it would have been a very labor-intensive project -- one somebody might have gotten tenure for doing. Nowadays it would take a fraction of the effort. The career benefits might be commensurate, alas. But it seems like a worthy enterprise. What’s the hold-up?
In February 2005, I attended a conference on Jacques Derrida held at the Cardozo Law School in New York, covering it in two columns: here and here. A good bit of new material by “Jackie” (as his posse called him) has appeared in English since then, with more on the way this fall. Next month, Continuum is publishing both a biography of Derrida and a volume described as “a personal and philosophical meditation written within two months of Derrida’s death.”
Bet you didn’t know there was going to be a race, did you?
In the meantime, I’ve heard about a new translation, available online, of one of Derrida’s late-period writings. It is part of his engagement with the figure of Abraham, the founding phallogocentric patriarch of the three great monotheistic religions. The translator, Adam Kotsko, is a graduate student at the Chicago Theological Seminary. (See this item on the translation from his blog.)
The potential for “open source” translation may yet open more cans of worms than any team of intellectual-property lawyers can handle. I’ll throw this out as a request to anyone who has thoughts on the matter: If you’ve committed them to paper (or disk) please drop me a line at the address given below.
And finally, a return to the intriguing case of Emma Dunham Kelley-Hawkins -- the most important African-American writer who was not actually an African-American writer.
In a column last spring, I reported on the effort to figure out how the author of some rather dull, pious novels had become a sort of cottage industry for critical scholarship in the 1990s. After a couple of days of digging, I felt pretty confident in saying that nobody had thought to categorize Kelley-Hawkins as anything but a white, middle-class New England novelist before 1955.
That was the year a bibliographer included her in a listing of novels by African-American writers -- though without explaining why. And for a long time after that, the scholarship on Kelley-Hawkins was not exactly abundant. Indeed, it seemed that the most interesting thing you could say about her fiction was that all of the characters appeared to be white. Kelley-Hawkins did make a very few references to race, but they were perfectly typical of white prejudice at its most casually cruel.
Only after Henry Louis Gates included her work in a series of reprints by African-American women writers did critics begin noticing all the subtle -- the very, very subtle -- signs of irony and resistance and whatnot. Why, the very absence of racial difference marked the presence of cultural subversion! Or something.
So much ingenuity, in such a bad cause.... Subsequent research suggests that Kelley-Hawkins was Caucasian, by even the most stringent “one drop” standards of white racial paranoia in her day.
A recent item by Caleb McDaniel discusses the most recent work on Kelley-Hawkins. The puzzle now is how the initial re-categorization of her ever took place. Evidently that bibliography from 1955 remains the earliest indication that she might have been African-American. (A second puzzle would be how anyone ever managed to finish reading one of her novels, let alone embroidering it with nuance. They can be recommended to insomniacs.)
McDaniel also quotes something I’d forgotten: the statement by Henry Louis Gates that, if he had put up a photograph of Kelley-Hawkins in his barbershop, “I guarantee the vote would be to make her a sister.”
You tend to expect a famous scholar to be familiar with the concept of the sepia tone. Evidently not. Here, again, is where Wikipedia might come in handy.
One of the first things a graduate student in the humanities and “softer” social sciences learns is that communication is rarely simple. Words carry latent values and vestigial biases, they are told, and over time the consequences of a word’s usage exceed its ostensible meaning. Post-bac training begins with that distinction, and students advance by attuning themselves to the tacit and the subtextual. “Language is not transparent,” announces the favorite T-shirt of a colleague, and to interpret statements accordingly isn’t just common wisdom. It’s a professional duty.
I’ve felt its pull many times, once while watching a debate on television around 1991, when the campus had become a central theater of the culture wars. Catharine Stimpson, Stanley Fish, and two others took on John Silber, William Buckley, Dinesh D’Souza, and Glenn Loury, with the canon, speech codes, and political correctness as the topics. At one point, when Silber asserted the silliness of substituting the title “chair” for “chairman” -- women “calling themselves furniture,” as he put it -- Fish replied with a point about the “deep culture of the language.” Often, he argued, “linguistic assumptions can be so deeply assumed that the society that uses them is not aware of them,” and when scholars and teachers unveil them, people feel threatened and confused. It’s a common premise, and it makes it easy to cast the academics as tenured meddlers going against common sense. The academics, in turn, feel that the more figures such as D’Souza resist, the more they know they’re on to something. That some of these expressions carry discriminatory baggage sharpens the analytic radar and adds a moral imperative to the labor. Indeed, no mandate has granted literary scholars so strong a sense of mission in the last 25 years.
It certainly touched me, and I recall judging Buckley et al. as obtuse anti-intellectuals and cheap-shot artists pitiably ignorant of advanced arguments. With a fresh Ph.D. in hand, and infused with Heidegger and Derrida, I believed fervently in the interpretative calling, disdaining what phenomenologists called the “natural attitude,” the outlook that takes things at face value. Added to that, I claimed language and literature as a professional subject, which meant that my livelihood depended upon the under- or other side of words, and that it took a special acumen to access it.
Fifteen years later, though, after countless written and spoken readings that lifted the political sediment out of ordinary and extraordinary language, the practice sounds pedestrian and predictable. In some cases, the search for “linguistic assumptions” exposed sexist and racist attitudes underlying different discourses, invisible but operative -- for instance, Gilbert and Gubar’s analysis in The Madwoman in the Attic of patriarchal motifs in critical discussions of creativity -- and it also reflected handily upon their institutional circumstances. But when it ascended into a theoretical premise, and soon after settled into a professional habit, the conclusions it drew lapsed into routine. Indeed, much queer theory has involved the extraction of queer subtexts from canonical texts and popular culture, influentially enough that assertions such as that of a lesbian undercurrent in "Laverne and Shirley," as one book offered several years ago, produce the effect of either whimsical curiosity or a rolling of the eyes.
The theory provided no guidelines as to where it did and did not apply, and so it was stretched too thin. It provided no means for distinguishing content that was invisible from content that actually wasn’t there. The professors saw implicit meaning everywhere, much of it political or identity-oriented. Persons outside the academy looked at the whole of their exchanges and found most of them uncomplicated and transitory. The surface was all. To that audience, conservatives such as Silber had a better grasp of the nature of “linguistic assumptions” than the professors did. And it didn’t help that so many professors shared Theodor Adorno’s belief in “the stupidity of common sense.” That, indeed, may explain why conservative intellectuals routed the professors in public settings over the years -- not because they lacked nuance, played on irrational fears, or traded in simplistic but telegenic gibes. Rather, they understood better when to analyze and when to assert, when to dismantle and when to affirm.
Both camps would agree, however, that the disclosure of assumptions and biases in language does apply to certain contexts, especially those in which an institution weighs heavily upon the utterances. When the protocols of communication are strict, when a statement reflects a speaker’s knowledge and legitimacy, when misstatements violate a group’s sense of mission, when entry into the discourse requires a long and regulated preparation by the entrant -- such settings are “overdetermined,” and they need detailed analysis and thick description. The terms are loaded and the topics authorized. Statements impart norms as well as ideas, mores as well as referents. The expressions licensed there reinforce the institution and echo its rationale. The subtext is dynamic, and if we don’t analyze it, then we do, indeed, break our promise to critique.
For this reason, it has been astonishing to watch the professors respond to indictments leveled recently by conservative, libertarian, and First Amendment figures against academic practice and politics. These figures cited voter registrations, campaign contributions, and occasional acts of oppression, but most of the time the first exhibit of bias and illiberalism was a sample of institutional language. Scholarly articles such as a 2003 study of the “conservative personality” that found fear and aggression at the heart of conservatism (“Political Conservatism as Motivated Social Cognition,” Psychological Bulletin, May 2003); course descriptions such as those gathered by the American Council of Trustees and Alumni in a report issued last month; speech codes targeted by the Foundation for Individual Rights in Education; paper titles culled by Frederick Hess and Laura LoGerfo from the last meeting of the American Educational Research Association ... these formed the evidence. They served well because of their patent absurdity, or because of their offense to public taste, or their adversarial dogma (anti-American, anti-capitalist, etc.).
But while the manifest content had an immediate impact, sometimes entering national circulation as a reviled token (e.g., “little Eichmanns”), many claimed a deeper meaning for them. In a word, they were offered as symptomatic expressions, an index of the values, norms, biases, and interests of academics. Conservatives and others presented them as precisely the kind of language packed with “linguistic assumptions,” performing subtextual feats, and ripe for socio-political analysis.
And yet, how have the professors responded? Not by taking up the critical challenge and carrying out the analysis. Not by bouncing the samples off of the institution in which they appeared. Instead, they shot the messenger. They declared the samples isolated and un-representative, or they denied to them the symptoms alleged by the critics. The course description wasn’t a fair stand-in for the course itself, they protested. Ward Churchill’s post-9/11 rant was an aberration. The conference paper title was just a way to garner an audience, so let’s not confuse it with the real substance of the paper. In sum, they put the most benign construction on the samples. That turned the allegations back upon the people who cited them, David Horowitz, Anne Neal, and the rest, who were cast as sinister crazies pushing a vile political agenda.
One can understand the professors’ defensiveness, but to let it squelch the exercise of a practice that they have at other times wielded so boldly is a breach of their own ideals. Have they lived so long and so intimately with “social justice,” “social change,” “queer,” “whiteness,” and “gender equality” that they do not recognize them as loaded terms? Have they imbibed the political currents of the campus so thoroughly that they regard a polemical phrasing in a course description as merely a lively description? By their own instruction, we should regard the widespread attention to race, gender, and their social construction as emanating from a world view and signaling an ideological commitment. When Ward Churchill’s notorious speech made headlines, the professors were correct to cite his First Amendment rights and reprove those calling for his job. But as more information came to light, and his political attitudes seemed to bear a closer relation to his scholarship, academic doctrine demanded that the institution that rewarded him be reviewed. Roger Bowen, general secretary of the American Association of University Professors, has assured the Commission on the Future of Higher Education that “Faculty members are accountable for their work in many ways,” including peer review of scholarship and grant applications and annual departmental review for salary and promotion. What, then, is the relationship between Churchill’s high ascent in the profession and his discredited writings? Humanities and social science professors work backward from institutional statements to the culture of the institution itself all the time. Why exempt academic language from the process?
The academic defense comes down to this: conservatives and libertarians read too much into bits and pieces of language -- an ironic turnabout, given that they made the same charge against literary theorists 20 years ago. Tim Burke, responding to the ACTA report, chooses the term “Eurocentric” as a case in point. While ACTA’s report selected a course description containing the term as an instance of bias, Burke replied, “I’ll let them in on a little secret: it can also be just a plain-old technical term for historiographical models that argue that modern world history has primarily been determined by factors that are endogenous to Europe itself.” So it can, but even if we accept that as one meaning of Eurocentric, it doesn’t erase the occasions when, as Burke concedes, “the term is also used as a fairly dumb epithet by nitwitted activists.” That is precisely one of the dangers of loaded terms. They can function neutrally or tendentiously, and when pressed the users can always fall back upon claims of innocence.
The question rests upon the frequency of biased meanings, “the existence of telling linguistic patterns,” as Erin O’Connor puts it while commenting on the issue. When a call for papers foregrounds anti-union corporatist practices, is that a tendentious usage, or are the libertarian commentators who cite it being oversensitive? The answer largely depends upon one’s relation to the institutional setting. When a libertarian delivers a talk at a symposium sponsored by Reason Magazine, the mention of government will have over- and undertones different from those the same word carries at a meeting of social justice advocates. From my perspective in 1991, I regarded Eurocentric, theory, patriarchy, and even the blank terms race and gender as descriptive ones. Yes, they had a political thrust, but essentially they were justified because they were accurate names for real phenomena in history and society. Indeed, it was the other discourse that was politicized, the one from which race etc. were absent. Now, having watched those terms in action, I see them as more often tendentious than not. In the majority of cases, their “institutional meaning” overshadows their denotative meaning.
That’s my experience, and maybe it’s too partial to count. But we can’t know for certain so long as leading academics remain so quick to deny the possibility that a narrow political agenda underlies academic discourse. Apart from the wall it erects against further inquiry, the reflex draws them into a vulnerable position. First of all, it results in overt intellectual blunders. For example, in the article cited above on the conservative personality, the authors define “conservatism” as, at heart, “opposition to change,” a simplistic and sweeping characterization that allows them to conclude, “One is justified in referring to Hitler, Mussolini, Reagan, and Limbaugh as right-wing conservatives ... because they all preached a return to an idealized past.” (They also add Stalin, Khrushchev, and Castro to the list of political conservatives.)
A second and more damaging problem in neutralizing their own terminology is the double standard it represents. Academics recognize the tension in terms such as race and sexuality, but they attribute its source to the resistance of others, persons who can’t give up their own biases and anxieties. That tactic will only work behind the campus walls. Try it in an outside setting and the arrogance comes across immediately. The hypocrisy shows, too, as academics fail their own standard. They present themselves as hard-headed, clear-sighted analysts, but in this case they prove selective in their labor. People outside the campus recognize that academia is just the kind of Establishment that calls out for ideological and social criticism, and its language is one place to begin. Academics already have a credibility problem when discussing their own practices, and if they wish to face down their many critics, they need to start extending those criticisms to themselves. Public observers realize, however reluctantly, that the best people to conduct that examination are the professors themselves, if only they will stop acting so proprietary. If academics don’t assume the lead, then they will find their credibility falling still further, having revised one of their favorite dicta to their own advantage -- “a ruthless criticism of everything existing,” everything, that is, but their own.
Mark Bauerlein is professor of English at Emory University.