Submitted by Anna Leahy on January 29, 2009 - 4:00am
I know what you’re thinking: Why is a poet writing about assessment in higher education? Honestly, I wonder that myself. One day, when assessment came up in conversation, I commented that it could be useful to programs as they make curricular decisions. Within 48 hours, the dean placed me on the institution’s assessment committee. Suddenly, assessment is a hot topic and, of all people, I have some expertise.
My years on that committee convinced me that we must pay attention to the rise of assessment because it is required for accreditation, because demands have increased significantly, and because it might be useful in our professional lives. Accrediting bodies are rightly trying to stave off the No Child Left Behind accountability that the Spellings Commission proposes. Maybe the incoming secretary of education will consider how we might be better -- not more -- accountable. Perhaps, too, Wall Street should be held accountable before the Ivory Tower. But assessment for higher education will likely become more pressing in a weak economy.
One tool to which many institutions have turned is the National Survey of Student Engagement (NSSE, pronounced Nessie). NSSE was piloted in 1999 in approximately 70 institutions, and more institutions participate each year. This survey appeals especially to college and university presidents and trustees, perhaps because it’s one-stop, fixed-price assessment shopping. NSSE presents itself as an outside -- seemingly objective -- tool to glean inside information. Even more appealing, it provides feedback on a wide array of institutional issues, from course assignments to interpersonal relationships, in one well-organized document. Additionally, the report places an institution in a context, so that a college can compare itself both with its previous performance and with other colleges generally or those that share characteristics. And it doesn’t require extra work from faculty. NSSE seems a great answer.
Yet, NSSE does not directly measure student learning; the survey tracks students’ perceptions or satisfaction, not performance. Moreover, respondents appraise their perceptions very quickly. In the 2007 NSSE, students were informed, “Filling out the questionnaire takes about 15 minutes” to complete 28 pages, some of which included seven items to rate. So, as with its Scottish homonym, NSSE presents a snapshot of indicators, not the beast itself.
Importantly, NSSE is voluntary. A college or university can participate annually, intermittently, or never. If a college performs poorly, why would that college continue? If a university uses the report to, as they say in assessment lingo, close the loop, wouldn’t that university stagger participation to measure long-term improvements? Over its 10-year existence, more than 1,200 schools have participated in NSSE, and participation has increased every year, but only 774 schools were involved in 2008, which suggests intermittent use. In addition, some institutions use the paper version, while others use the Web version; each mode involves a different sample size based on total institutional enrollment. NSSE determines sample size and randomly selects respondents from the population file of first-years and seniors that an institution submits.
Perhaps, all these factors lead NSSE to make the following statement on its Web site: "Most year-to-year changes in benchmark scores are likely attributable to subtle changes in the characteristics of an institution’s respondents or are simply random fluctuations and should not be used to judge the effectiveness of the institution. The assessment of whether or not benchmark scores are increasing is best done over several years. If specific efforts were taken on a campus in a given year to increase student-faculty interaction, for example, then changes in a benchmark score can be an assessment of the effectiveness of those efforts."
This statement seems to claim that an increase in a score from one year to the next is random unless the institution was intentionally striving to improve, in which case, kudos. Yet, NSSE encourages parents to “interpret the results of the survey as standards for comparing how effectively colleges are contributing to learning” in five benchmark areas, including how academically challenging the institution is.
I have larger concerns, however, about assessment tools like NSSE, which are used for sociological research on human subjects. The humanities and arts are asked to use a methodology in which we have not been trained and for which our disciplines might not be an appropriate fit. NSSE is just one example of current practices that employ outcomes-based sociological research, rubric-dominated methodology, and other approaches unfamiliar in many disciplines.
Such assessment announces that anyone can do it. I’ve seen drafts of outcomes and rubrics, and that’s not true. Programs like education and psychology develop well-honed, measurable outcomes and rubrics that break those outcomes down into discernible criteria. Programs in the sciences do a less effective job; some science faculty assert that the endeavor is invalid without a control group, while admitting that a control group that denies students the environment in which they most likely learn would be unethical.
Those of us in the arts and the humanities want wide, lofty outcomes; we resist listing criteria because we disagree, often slightly or semantically, about what’s most important; we fear omission; and we want contingencies in our rubrics to account for unexpected — individual, creative, original — possibilities. Writing and visual art cannot easily be teased apart and measured. Critical thinking and creative thinking are habits of mind. How can NSSE or rubrics capture such characteristics?
Moreover, by practicing social science, often without reading a single text about those methods, arts and humanities faculty diminish the discipline we poach as well as lessen the value and integrity of our conclusions. If we don’t know what we’re doing — how many of us really understand the difference between direct and indirect measures or between outcomes, objectives, goals, and competencies — the results are questionable. To pretend otherwise is to thumb our noses at our social science colleagues.
Further, this one-size-fits-all, cookie-cutter mentality ignores that different disciplines have different priorities. Included in Thomas A. Angelo and K. Patricia Cross’s Classroom Assessment Techniques is a table of top-priority teaching goals by discipline. Priorities for English are Writing skills, Think for oneself, and Analytic skills, in that order. Arts, Humanities, and English have just one goal in common: Think for oneself. We can survey student perceptions of their thinking — an indirect measure — or maybe we know independent thinking when we see it, but how do we determine thinking for oneself in a data set? These priorities aren’t even grammatically parallel, which may not matter to social scientists, but it matters to this poet!
Other priorities for Arts — Aesthetic appreciation and Creativity — and Humanities — Value of subject and Openness to ideas — are difficult, if not impossible, to measure directly. The priorities of Business and Sciences are more easily measured: Apply principles, Terms and facts, Problem solving, and Concepts and theories. So, a key issue is to determine whether the arts and humanities can develop ways to assess characteristics that aren’t really measurable by current assessment methodology or whether we must relinquish the desire to assess important characteristics, instead focusing on easily measured outcomes.
Another table in Classroom Assessment Techniques lists perceived teaching roles. Humanities, English, and Social Sciences see Higher-order thinking skills as our most essential role, whereas Business and Medicine view Jobs/careers as most essential, Science and Math rank Facts and principles most highly, and Arts see Student development as primary. Both knowledge of Facts and principles and job placement can be directly measured more easily than Student development. For English, all other roles pale in comparison to Higher-order thinking skills, which 47 percent of respondents rated most essential; the next most important teaching role is Student development at 19 percent. No other discipline is close to this wide a gap between its first- and second-ranked roles. Surely, that’s what we should assess. If each discipline has different values and also differently weighted values, do we not deserve a variety of assessment methodologies?
Lest I bash assessment altogether, I do advocate documenting what we do in the arts and humanities. Knowing what and how our students are learning can help us make wise curricular and pedagogical decisions. So, let’s see what we might glean from NSSE.
Here are items from the first page of the 2007 NSSE:
Asked questions in class or contributed to class discussions
Made a class presentation
Prepared two or more drafts of a paper or assignment before turning it in
Worked on a paper or project that required integrating ideas or information from various sources
Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments
Students were asked to rate these and other items as Very often, Often, Sometimes, or Never, based on experience at that institution during the current year. These intellectual tasks are common in humanities courses.
In another section, students were questioned about the number of books they had been assigned and the number they had read that weren’t assigned. They also reported how many 20+-page papers they’d written, as well as how many of 5-19 pages and how many of fewer than five pages. We can quibble about these lengths, but, as an English professor, I agree with NSSE that putting their ideas into writing engages students and that longer papers allow for research that integrates texts, synthesizes ideas, and encourages application of concepts. And reading books is good, too.
Another relevant NSSE question is “To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas?” Included in the areas rated are the following:
Acquiring a broad general education
Writing clearly and effectively
Speaking clearly and effectively
Thinking critically and analytically
Working effectively with others
The English curriculum contributes to these areas, and we are often blamed for perceived shortcomings here. While NSSE measures perceptions, not learning, this list offers a simple overview of some established values for higher education. If we are at a loss for learning outcomes or struggle to be clear and concise, we have existing expectations from NSSE that we could adapt as outcomes.
In fact, we can reap rewards both in assessment and in our classrooms when students become more aware of their learning. To do this, we need some common language — perhaps phrases like writing clearly and effectively or integrating ideas or information from various sources — to talk about our courses and assignments. Professional organizations, such as the Modern Language Association in English or the College Art Association in the visual arts, could take the lead. Indeed, this article is adapted from a paper delivered at an MLA convention session on assessment, and the Education Committee of CAA has a session entitled “Pedagogy Not Politics: Faculty-Driven Assessment Strategies and Tools” at their 2009 conference.
We needn’t reorganize our classes through meta-teaching. Using some student-learning lingo, however, helps students connect their efforts across texts, assignments, and courses. Increasingly, my students reveal, for instance, that they use the writerly reading they develop in my creative writing courses to improve their critical writing in other courses. I have not much altered my assignments, but I now talk about assignments, including the reflective essay in their portfolios, so that students understand the skills they hone through practice and what they’ve accomplished. Perhaps, I’m teaching to the test — to NSSE — because I attempt to shift student perceptions as well as the work they produce. But awareness makes for ambitious, engaged, thoughtful writers and readers.
Good teachers appraise their courses, adapt to new situations and information, and strive to improve. As Ken Bain points out in What the Best College Teachers Do, “a teacher should think about teaching (in a single session or an entire course) as a serious intellectual act, a kind of scholarship, a creation.” We are committed to teaching and learning, to developing appropriate programs and courses, and to expectations for student achievement that the Western Association of Schools and Colleges asks of us. We can’t reasonably fight the North Central Association of Colleges and Schools mandate: “The organization provides evidence of student learning and teaching effectiveness that demonstrates it is fulfilling its educational mission.” Assessment is about providing evidence of what we do and its effects on our students. Our task in the arts and humanities is to determine what concepts like evidence, effects, and student learning mean for us. If NSSE helps us achieve that at the individual, program, or institutional level, great. But NSSE is best used, not as an answer, but as one way to frame our questions.
In a lecture, the novelist Robertson Davies once gave a wry characterization of the life of a full-time writer. It is, he said, every bit as gratifying as non-writers usually imagine it to be -- “except for the occasional complete collapse of the will to go on.”
This is quoted from memory (my effort to relocate the passage has not gone well) so the wording may be inexact. But the turn of phrase certainly rhymes with experience -- in particular, the tension of emphasis in “occasional complete collapse,” with its mix of casual surprise and total devastation. I feel less certain that Davies used the expression “the will to go on.” It sounds a bit melodramatic. But then, he was a satirist, and he might well have been making fun of the impulse to indulge in overacted displays of artistic temperament. (Making fun of this need not preclude indulging it.)
Anyone who spends much time trying to put the right words in the right order will accumulate a private anthology of passages like this one: quotations that map the high and the low points on the interior landscape of the writing life. Knowing that others have been there before you is reassuring – if only just so much help.
For Robert D. Richardson – the author of, among other things, William James: In the Maelstrom of American Modernism (Houghton Mifflin Harcourt), which won the Bancroft Prize for 2007 – one such landmark passage appears in “The American Scholar.” There, Ralph Waldo Emerson writes: “Meek young men grow up in libraries believing it their duty to accept the views that Cicero, Locke, and Bacon have given, forgetful that Cicero, Locke, and Bacon were only young men in libraries when they wrote those books.”
In First We Read, Then We Write: Emerson on the Creative Process (to be published in March by University of Iowa Press), Richardson says the line “still jolts me every time I run into it.” I think I know what he means, but the quality and intensity of the jolt varies over time. Reading “The American Scholar” as a meek young man, I just found it irritating – as if Emerson were translating the anti-intellectualism of my small town into something more refined and elegant, if scarcely less blockheaded.
This was a naive reading of a remarkable and (at times) very weird essay. "The American Scholar" is actually something like a Yankee anticipation of Nietzsche’s “On the Use and Abuse of History for Life” – with the added strangeness that, when Emerson gets around to pointing out a prototype of the new-model American scholar, the example he gives is ... Emanuel Swedenborg, the 18th century Swedish polymath. Who, when not writing huge works on the natural sciences, spent his time talking to angels and devils and the inhabitants of other planets. WTF?
Rereading Emerson a couple of decades beyond adolescence, I saw that the target of his scorn was meekness -- not bookishness, as such. He was in any case not so genteel as he first appeared. There was a wild streak. There were depths beneath the oracular sentences that made him a kind of cultural revolutionary. You are not necessarily prepared to detect this when reading Emerson as a teenager. Like Bob Dylan says, "Ah, but I was so much older then, I'm younger than that now."
Richardson’s award-winning Emerson: The Mind on Fire (University of California Press, 1996) retraced his subject’s voracious and encyclopedic reading regimen, which seems to have been tinged with the urgency of addiction. That book was intellectual biography. The new one, which is far shorter, is something else again -- a synthesis of all the moments when Emerson muses over his own process, a distillation of his ethos as a reader and (especially) as writer.
“A good head cannot read amiss,” says Emerson. “In every book he finds passages which seem confidences or asides, hidden from all else, and unmistakeably meant for his ear.” Full attention and active engagement are always, by Emerson's lights, present-minded: “I read [something] until it is pertinent to me and mine, to nature and to the hour that now passes. A good scholar will find Aristophanes and Hafiz and Rabelais full of American History.”
Not being prone to foolish consistency, Emerson also maintains that some academic works are incapable of coming to life themselves, let alone revitalizing anyone else. “A vast number of books are written in quiet imitation of the old civil, ecclesiastical, and literary history,” he says. (One might quietly update this by thinking of comparable 21st century tomes.) “Of these we need take no account. They are written by the dead to be read by the dead.”
By contrast, meaningful writing is an effort “to drop every dead word.” Emerson rules out any effort to rub pieces of jargon together in hopes they will generate sparks. “Scholars are found to make very shabby sentences out of the weakest words because of exclusive attention to the word,” he notes. You don’t say.
The struggle to connect with living currents of thought and meaning should begin with a notebook -- the place to cultivate, as Emerson puts it, “the habit of rendering account to yourself of yourself in some rigorous manner and at more certain intervals than mere conversation.” The important thing is to keep at it: “There is no way to learn to write except by writing.”
This may sound like generic advice, and to some degree it is. But from long years of scholarly attention to the daily progress of the essayist’s labors, Richardson hears the anxious undercurrents in Emerson’s reflections on writing. “There is a strangely appealing air of desperation, finality, of terminal urgency,” he writes, “to many of Emerson’s observations.... In every admonition we hear his willingness to confront his own failures; indeed, he never seems more than a few inches from total calamity. He urges us to try anything – strategies, tricks, makeshifts. And he always seems to be speaking not only of the nuts and bolts of writing, but of the grain and sinew of his – and our – determination.”
When necessary, Richardson points out, Emerson would “just sit down and start writing – anything – to see whether something would happen. He was quick to spot the same trick in others. ‘I have read,’ he noted, ‘that [Richard Brinsley] Sheridan made a good deal of experimental writing with a view to take what might fall, if any wit should transpire in all the waste of pages.’”
Kenneth Burke once described Emerson’s prose as a “happiness pill” – that being a common enough assessment, though there is more to the sage than his role as dispenser of transcendental Prozac. It makes some difference to know that the pharmacist also had to heal himself. He cannot have been free from all of the worldly desires felt by lesser writers. The same wishes mean the same frustrations. The challenge is to keep faith with the rest of one’s reasons for writing – the motivations that break through the rubble.
First We Read, Then We Write is worth keeping at hand for moments of occasional complete collapse. I'll end with a passage that now belongs in the anthology, for emergency use:
“Happy is he who looks only into his work to know if it will succeed, never to the times or to the public opinion; and who writes from the love of imparting certain thoughts and not from the necessity of sale – who writes always to the unknown friend.”
Given this chilly climate for administrators -- salary freeze, hiring freeze -- I turn for relief to that dusty ghost-town in my mind’s geography, the one labeled Intellect. This turn has been further encouraged by the publication in recent months of an article on the influence, or lack thereof, of a book I wrote 20 years ago on the relations between American and British writers in the 19th century, titled Atlantic Double-Cross. This book tried to explain why the writers of each country hated each other’s guts and how this animosity informs the great literary works of the period. In it, I argued for a new subdiscipline of comparative literature that would take up the Anglo-American relationship. The book pretty much flopped, in my view, and so I was delighted to read even a measured discussion of the book’s effect on my discipline — delighted, that is, until I arrived at a paragraph beginning, “In Weisbuch’s day. ...”
At first I was tempted to call the gifted, clearly youthful Columbia professor who wrote this sentence to say, “Listen, it may be late afternoon; it may even be early evening. But it is still my day.”
More to the point, the phrase made me realize that I am pretty old, and that made me think — I guess I am supposed to speak like a codger now and say instead, “that got me to thinking...” — about the changes in academe in my lifetime. I thought about the move of psychology, for instance, away from the humanities through the social sciences over to the sciences, a journey by which Freud was moved from being a point of reference to a butt for ridicule.
I considered the tendency for economics to forgo fundamental questions in favor of refining an accepted model. I noted as well a decline in the influence of the humanities, whence most university presidents arose in the 1930s, say, and an ascendancy of the sciences, and in particular genetic science, the field from which an increasing number of our institutional leaders now emerge.
But going through these admittedly contentious thoughts, I saw something more substantial: my thinking was taking place via the disciplines — and to that I added the realization that my poor book of so long ago had presented itself as an attempt to create a subdiscipline. I have just recently reread Douglas Bennett’s very perceptive quick history of the liberal arts in the 20th century, where he notes that the organization of colleges by disciplines, which we now take so for granted, was in fact a fast and dramatic development between about 1890 and 1910. Today, it seems, we really care more about the disciplines than we do about the whole self, or whatever the liberal arts ideal is.
So I got angry at the disciplines, and there is reason for that. It gets difficult to understand, especially at the graduate level, why a doctorate in literature and a doctorate in physics exist on the same campus when it seems they might as well be pursued on different planets. During a year when I served as graduate dean at the University of Michigan, I attended a physics lecture and was seated next to the chair of the comparative literature program. As the scientist went on, my neighbor whispered to me incredulously, “This guy thinks the world is real.” That takes C.P. Snow’s two-cultures problem to a new and desperate place.
Or again, I invited the scientist from NYU who had successfully submitted an article of post-structuralist nonsense to a journal of literary theory and had his hoax accepted to speak at Michigan, with a panel of three scientists and three humanists responding. As the large crowd left the room, the conversations were remarkable. The scientists in the audience to a person found the critique of the pretension of literary theory wholly persuasive. The humanists to a person felt that their colleagues had successfully parried the attack, no question about it, reminding the physical and life scientists that their language could be pretty thick to an outsider too and that the very history of science could be seen as the overturning of accepted truths to be revealed as unintended hoaxes.
And so, enraged at the disciplines, I tried to imagine what it would be like to have a university, a world, a mind that did not rely on the disciplines — and failed.
And my next move is to say, perhaps this is fine. If general education is tantamount to a mild benevolence toward humanity, involvement in a discipline is like falling passionately in love with a particular person. We need both. It is okay to be captured by an ecstatic interest. But we also know the danger of early love. In the words of Gordon MacRae or somebody, “I say that falling in love is wonderful.” And indeed it is arguable at least that we do not induct students into a love of the life of the mind by abstractions but by finding the single discipline that fixes their fascination.
Even so, we want that fascination to be versatile, to be capable, that is, of moving from one arena of thought to another, or at least to understanding why someone else would care passionately about something else. Every summer, I spend a week on an island in Lake Winnipesaukee. This is very odd for me, as my relation to nature is such that a friend once asked if I had suffered a traumatic experience in a forest or a park. I prefer my nature in iambic pentameters, and this family island, without electricity or plumbing, I have dubbed The Island Without Toilets. Still, it is restful, and each year we campers read and discuss a book or essay. One year it was Bill McKibben’s book, The Age of Missing Information. In this tome, McKibben contrasts a day spent hiking to a modest mountaintop with a day spent watching a full 24 hours of each channel of a cable television system in Virginia. (The fact that there were only 90 channels in 1992 tells us that we are losing more information all the time.) The book is somewhat eco-snobby, but McKibben’s main contrast is really not between the natural world and its vastly inferior electronic similitude or replacement but between deep knowledge and sound bites.
He illustrates deep knowledge by an Adirondack farmer’s conversation concerning each and all the species of apple. There is so much to know, it turns out, about apples; indeed, there is so much to know about everything. As I wrote a few years ago, “Life may appear a deserted street. But we open one manhole cover to find a complex world of piano-tuning, another to discover a world of baseball, still others to discover intricate worlds of gemology and wine, physical laws and lyric poetry, of logic and even of television.” And I asked, “Do our schools and colleges and universities reliably thrill their charges with this sense of plenitude?”
They do not. And while I cannot even imagine a world without the disciplines -- which are really the academic organization of each of these microcosms of wonder -- I can imagine them contributing to an overall world flaming with interest. Falling in love is great and irreplaceable, but how about reimagining the campus as Big Love, Mormon polygamy for all sexes, or at least as a commune, where each of us is mated to a discipline but lives in close proximity with family-like others on a daily basis?
That is, I believe, what we are, however awkwardly, attempting by having the disciplines inhabiting the same campus. However much general education has been swamped by disciplinary insistence, a remnant remains. Even academics tend to tell other people where they went to college, not so much in what they majored. We probably already possess the right mechanism for a 21st century renaissance. It just needs some adjustments.
I want to suggest two such adjustments. One is in the relation of the arts and sciences to the world; and another readjusts the arts and sciences in relation to themselves and to professional education.
When I was at the University of Michigan several years ago, something shocking took place. The sciences faculty, en masse, threatened to leave the college of liberal arts. “How could the sciences leave the arts and sciences any more than Jerry could leave Ben and Jerry’s?” I asked someone who had been present at these secession meetings. “The same way another Jerry could leave Dean Martin and Jerry Lewis,” he replied. Somehow, to Michigan’s credit, the rebellion was quelled, but to me it is suggestive of the weakness of the liberal arts ideal at many of our institutions.
There are many signs of its frailty, beginning with the frequent statistic that more students at four-year colleges now major in leisure studies than in mathematics and the sciences. It is difficult to find a middle or high school where anyone speaks of the liberal arts, and much as I have been worrying about the disciplines, aside from scattered efforts they seem to have been missing in action from much of the last forty years of discussion of school reform. In speaking about the arts and sciences in relation to the world, I want to suggest, though, that the lording of the disciplines over general education and the absence of the excitement of the disciplines in the schools have everything to do with each other.
This near paradox can be illustrated best if I stay within my own neighborhood of the humanities for this aspect of the argument. Last month, filled with nostalgia, I agreed to serve on a panel for the National Humanities Alliance, which advocates to Congress for funding for these impoverished disciplines. My job was to provide one version of the speech for the public efficacy of English and history, religion and philosophy and so on. I decided to fulfill this assignment rapidly and then to ask why, if we believed in the public efficacy of the humanities, we utterly ignored it in our mentoring of graduate students in these disciplines.
My argument for the humanities is exactly the same as my argument for the arts and sciences generally. As a young person, I never expected a major battle of my lifetime to be the renewal of dogmatic fundamentalism in opposition to free thinking. I find myself again and again referring to an episode of the television program "The West Wing" that aired shortly after 9-11. The president’s youthful assistant is speaking to a group of visiting school children and he says, “Do you really want to know how to fight terrorists? Do you know what they are really afraid of? Believe in more than one idea.”
This is not always as simple as the Taliban versus Shakespeare. There are subtle discouragements within our own society to the freedom to doubt and the freedom to change one’s mind. And there are elements within each of us that tend toward dogmatism and against the embracing of difference and a will to tolerate complexity. The campus, ideally, is a battleground for this freedom.
Against the many who would tyrannize over thought, we need to fight actively for our kind of education, which is far deeper than the usual political loyalties and divisions. God and George Washington are counting on us. And so are all those kids in East LA scarred by violence and poverty. In a nation of inequality and a world of sorrows, damn us if we neglect to advocate effectively for the only education that lifts up people.
Having said that, I asked why, paraphrasing Emerson, we do not turn our rituals and our rhetoric into reality. Over the last 40 years, the professoriate in the humanities has been a mostly silent witness to an atrocity, a huge waste of human resources. According to Maresi Nerad in the "Ten Years After" study, in a class of 20 English Ph.D.'s at whatever prestigious institution, three or four will end up with tenure-track positions at selective colleges or research universities. And yet this degree program, and all others in the humanities, pretend that all 20 are preparing for such a life. It’s a Ponzi scheme.
When I led the Woodrow Wilson Foundation, we began a Humanities at Work program, one aspect of which was to give little $2,000 scholarships to doctoral students who had found summer work beyond the academy. A cultural anthropologist at Texas worked at a school for delinquent girls who had been abused as children. She employed dance, folktales, autobiographical writings and a whole range of activities related to her expertise to improve these girls’ self-images. A history student at U.Va. created a freedom summer school for fifth graders in Mississippi, teaching them African American history. Meanwhile, we secured thirty positions at corporations and non-profits for doctoral graduates.
Our point was not to become an employment agency but to suggest that every sector of society, from government to K-12 to business, could benefit hugely from the transferable talents of people who think with complexity, write and speak with clarity, and teach with verve and expertise. We wanted such graduates to comprehend the full range of their own possibilities. Should they then decide to enter academia, at least they would perceive this as a free choice. And in the meantime, the liberal arts would populate every social sector as never before. I do not mean it ironically when I look forward to the liberal arts takeover of the world.
For that to take place at any level of education, I think, we need to marry intellectual hedonism to the responsibility of the intellectual. If we want our professoriate and our students to apply their learning -- and I do -- if we want them not simply to critique society but to constitute it, we must first acknowledge the simple joy of learning as a prime realistic moment. My dear friend Steve Kunkel is a leading investigator at Michigan of the AIDS virus. He is a fine fellow and I am certain that he would wish to reduce human suffering. But when I call Steve at 7 in the morning at his lab, because I know he will be there already, he is there less out of a humanitarian zeal than because he is crazy about science, the rattiest of lab rats. Just so, when I unpack a poem’s meaning, I experience a compulsive enjoyment. This is half of the truth, and it leads someone like Stanley Fish to scorn the other half by writing a book with the title Save the World on Your Own Time.
I think we can devote some school time to saving the world without prescribing or proscribing the terms of its salvation. Louis Menand, surely no philistine, argues that we need to get over our fear of learning that may brush shoulders with the practical and more generously empower our students. Granted, and granted enthusiastically, academic enclosure, the distancing of a quiet mind from the harsh noise of immediacy, is a great joy, even a necessity in the growth of an individual. But when it becomes the end rather than the instrument, we approach social disaster. We must travel back and forth between the academic grove and the city of social urgencies.
This is to say, and beyond the humanities, that a certain precious isolation — is it a fear? — has kept the fruit of the disciplines within the academy, away even from our near neighbors in the schools. The absence of the disciplines from the public life and the bloating of the disciplines to squeeze out the liberal arts ideal in the colleges are part and parcel of the same phenomenon. It is not that the world rejected the liberal arts but that the liberal arts rejected the world.
In a brilliant article, Douglas Bennett provides a brief history of the 20th-century college in which he notes an increasingly exclusionary notion of the arts and sciences. And this seems to me part and parcel of the same dubious ethic that so distrusts the messiness of the social world. As I read that we arts and science adepts kept purifying ourselves — education is too messy, throw it out, along with the study of law, along with business, along with anything material (again, “That guy thinks the world is real”) -- I am reminded of Walt Whitman’s critique of Matthew Arnold, whom he termed “one of the dudes of Western literature.” To Arnold, Whitman says, “the dirt is so dirty. But everything comes out of the dirt, everything; everything comes out of the people, the people as you find them and leave them: not university people, not F.F.V. people: people, people, just people.”
The liberal arts became pure and they became puerile. Having greatly expanded the old curriculum by addition and subdivision, they spent the rest of the century apologizing by limiting themselves. They expelled fascinating areas of human endeavor that then came to constitute professional education, and professional education proceeded to eat the libbies’ lunch.
Who or what can teach us to do what Menand urges, empower not only our students but our academic disciplines? The answer, plain as can be, is the sciences. Is it any wonder, given the exclusionary bent of the liberal arts, that scientists, whose subject and whose instruments of investigation are often frankly material, might consider secession, especially when social influence, which is also to say funding, was getting thrown away along with whole areas of crucial consequence?
And by the same token, it is the sciences that can teach the humanities in particular how to reconnect. Indeed, a few moments ago, I was calling for the humanities equivalent of tech transfer; and that is half of my hope for a 21st century renaissance.
By a renaissance in our time — in Weisbuch’s day -- I do not mean the recovery of classical learning and its inclusion in a Christian worldview that marked the original. I want to invoke instead the extreme interdisciplinarity of that time when the arts and sciences came so spectacularly into, if not unity, vital relationship, and when learning and worldliness ceased their contradiction. Here is what I mean. I do not in fact live on the campus of Drew University, but in a town 15 miles away, Montclair, New Jersey. Aside from the filming of some scenes featuring AJ Soprano down the street at our high school, the neighborhood was all too quiet when we moved in, with neighbors at most stiffly waving to one another from a distance. Then Tom and Janet and their three moppets moved in, along with Tom’s insane white limousine, the backyard hockey rink, the Halloween scare show, the whole circus. As Tom started offering the middle-school neighborhood kids “rock-star drop-offs” to school in his limo, everything changed. Some of our houses have large front porches, and neighbors began to congregate on summer evenings. Soon, whenever we lit the barby a few families would turn up with their own dogs and steaks and ask if they could join in. There are about ten families now that assist each other in myriad ways, that laugh together and, when necessary, provide solace and support.
The university can become a porch society in relation to the disciplines. Indeed, for the last 40 years we have been experiencing a loosening of the boundaries, as the prefix “bio” gets attached to the other sciences; as environmental studies unites the life sciences, theology, the physical sciences, public policy, even literary criticism; as Henry Louis Gates, as historian, employs genetic research to revise and complicate the notion of racial heritage. And then there is the huge potential of democratizing knowledge and recombining it through the burst of modern technology, one of whose names, significantly, is the Web.
You cannot intend a zeitgeist but you can capitalize upon one, and this is one. A few simple administrative helps occur to me as examples. We can invite more non-academics to join with us in our thinking about the curriculum. We can require our doctoral students to take some time learning a discipline truly distant from their own rather than requiring the weak cognate or two, and we can take just a few hours to give them a sense of the educational landscape of their country. We can start meeting not with our own kind all the time but across institutional genres, and we can especially cross the divide into public education not so much by teaching teachers how to teach as by sharing the rich ongoing controversies and discoveries of the living disciplines.
Less grandly, within our own institutions, we can pay a bonus to the most distinguished faculty member in each department who will teach the introductory course and a bigger bonus to those who will teach across disciplines, with the size of the bonus depending upon the perceived distance between the disciplines. We can stop attempting to formulate distribution requirements or core curricula via committees of 200, which is frankly hopeless in terms of conveying the excitement of the liberal arts, and instead let groups of five or ten do their inspired thing, spreading successes. We can create any number of rituals that encourage a porch society. As one new faculty member told me at a Woodrow Wilson conference years ago, “My graduate education prepared me to know one thing, to be, say, the world’s greatest expert on roller coasters. But now in my teaching position, I have to run the whole damn amusement park and I know nothing about the other rides, much less health and safety issues, employment practices, you name it.”
We might name this zeitgeist the whole damn amusement park, but I would suggest a naming in the form of a personification: Barack Obama. When I am fundraising, I often chant something of a mantra, and I ask you to forgive its sloganeering. The new knowledge breaks barriers. The new learning takes it to the streets. The new century is global. And the new America is multi-everything. There you go and here he is. Our fresh new president is indeed international, multi-racial, multi-religious, multi-ethnic, a liberal-arts major and law school grad who became a community organizer and breaks barriers with an ease that seems supernal. He was not required; like the courses we choose freely, he was elected.
Barack Obama was born on an island, and at the start of this essay I mentioned the site of my summer challenge, the Island Without Toilets. Our disciplines are islands. Our campuses are islands. And islands are wonderful and in fact essential as retreats for recuperation. But in the pastoral poems of an earlier Renaissance, the over-busy poet rediscovers his soul in a leafy seclusion but then returns, renewed and renewing, to the city. It is time for us to leave our islands. We are equipped.
Robert Weisbuch is president of Drew University. This essay is adapted from a talk he gave at the 2009 annual meeting of the American Educational Research Association.
At first glance, Peter Drucker might seem an unlikely candidate to have published an academic novel. Famous for writing books such as Concept of the Corporation and The Effective Executive, Drucker was dubbed “The Man Who Invented Management” in his 2005 Business Week obituary. Drucker’s audience was to be found among the Harvard Business Review crowd, not the Modern Language Association coterie, and, not surprisingly, his two novels are no longer in print.
But the university he presented in his 1984 novel, The Temptation to Do Good, confronted some key questions that face higher education institutions in today’s unprecedented financial downturn: Are current practices sustainable? Have we strayed from our core mission? Will the liberal arts survive increasing budget pressures?
As these questions -- hardly the usual literary fare -- demonstrate, Drucker’s work is a rarity among academic novels. These texts typically provide a send-up of academic life, by making fun of intellectual trends through characters such as Jack Gladney, who chairs the department of Hitler studies in Don DeLillo’s White Noise, or by parodying the pettiness of department politics, as in Richard Russo’s Straight Man, in which one English professor’s nose is mangled during a personnel committee meeting, courtesy of a spiral notebook thrown at him by one of his peers. By contrast, The Temptation to Do Good is almost painstakingly earnest in its portrayal of Father Heinz Zimmerman, president of the fictional St. Jerome University.
Like other contemporary academic novels, The Temptation to Do Good depicts the problems of political correctness, the tensions between faculty and administration, and the scandal of inter-office romance. But St. Jerome’s problems are no laughing matter. Lacking the improbable events of other academic novels -- in James Hynes’s The Lecturer’s Tale, the adjunct-protagonist even gains super-human powers -- the plot of The Temptation to Do Good is completely plausible, and the problems above destroy a good man.
St. Jerome’s chemistry department decides not to hire Martin Holloway, a job candidate with a less-than-stellar research record. Feeling sorry for the soon-to-be-unemployed Ph.D., Zimmerman decides to recommend Holloway to the dean of a nearby small college. Zimmerman knows he shouldn’t interfere, but he feels he must do the Christian thing, and so, succumbing to “the temptation to do good,” he makes the call. Meanwhile, Holloway’s angry wife spreads unfounded rumors about a dalliance between the president-priest and his female assistant. The faculty overreact to both events, and although most of them come to regret it, Zimmerman’s presidency is brought down, and he is eased out by the church into a sinecure government position.
Often reading like an intricate case study of one university’s internal politics, The Temptation to Do Good aims to do more than that, too, raising questions about the purpose of higher education institutions writ large. Representing the contemporary university as a large, bureaucratic institution -- much like the companies that Drucker’s theories would shape -- The Temptation to Do Good portrays Zimmerman as a successful executive, one who “converted a cow college in the sticks” into a national university with a reputation unrelated to its religious roots. He even makes the cover of Time magazine for increasing his endowment by a larger percentage than any other university over the past five years.
Although some faculty recognize, as one physics professor admits, that they wouldn’t be able to do their research without the money he has brought in, many of them are also disenchanted with Father Zimmerman, CEO. The chemistry chair chose to come to St. Jerome because he expected it to be “less corrupted by commercialism and less compromised by the embrace of industry” than other institutions, which he realizes isn’t the case.
“We have a right,” says the chair of modern languages, upset over the abolition of the language requirement, “to expect the President of a Catholic university to stand up for a true liberal education.” In both cases, we see the ideals of a Catholic university being linked to the ideals of a liberal arts education, both focused on a pure devotion to the pursuit of knowledge seen as incompatible with Zimmerman’s expanded professional schools and intimate sense of students’ consumer needs. Can St. Jerome be true to both the liberal arts and the practical, professionalized realm at the same time?
This question is never resolved in the novel, but outside of his fiction writing, Drucker was deeply interested in the practicality of the liberal arts. In his autobiography, he discusses his deep appreciation of Bennington College, a school designed to combine progressive methods -- connecting learning to practical experience -- with the ideas of Robert Hutchins, the University of Chicago president and famed proponent of classical liberal ideals. William Whyte’s sociological classic Organization Man cites Drucker as saying that “the most vocational course a future businessman can take is one in the writing of poetry or short stories.”
Although Drucker was unusual in actually writing novels himself, he was not alone among business thinkers in expressing the values of the liberal arts. Tom Peters and Robert Waterman’s In Search of Excellence: Lessons from America’s Best-Run Companies describes an investment banker who suggests closing business schools and providing students with a “liberal arts literacy” that includes “a broader vision, a sense of history, perspectives from literature and art.”
More recently, Thomas Friedman’s The World is Flat includes a section focusing on the importance of a liberal arts education in the new integrated, global economy. “Encouraging young people early to think horizontally and to connect disparate dots has to be a priority,” writes Friedman, “because this is where and how so much innovation happens. And first you need dots to connect. And to me that means a liberal arts education.”
Books like Rolf Jensen’s The Dream Society: How the Coming Shift from Information to Imagination will Transform Your Business, Joseph Pine II and James H. Gilmore’s The Experience Economy: Work is Theatre and Every Business a Stage, Daniel H. Pink’s A Whole New Mind: Why Right Brainers Will Rule the Future, and Richard Lanham’s The Economics of Attention: Style and Substance in the Information Age make these points more specifically, often showing how certain “literary” skills, such as storytelling and empathy, are crucial to success in the current time.
Of the authors mentioned above, only Lanham is a humanities professor, and in a field (rhetoric) largely out of scholarly vogue today. “Let’s go back to the subject of English a moment. Of all subjects none is potentially more useful,” Whyte writes. “That English is being slighted by business and students alike does not speak well of business. But neither does it speak well for English departments.”
What’s significant about Whyte’s account -- along with that of Drucker, Friedman, and others -- is that none of them claim that colleges and universities should merely churn out students of technical writing or focus on the practicality of the composition course; instead they want students to think about narrative complexity and story-telling through the liberal arts. Whyte himself focuses on the study of Shakespeare and Charles Lamb.
However, instead of embracing these potential real-world allies, liberal arts disciplines have seemed to withdraw, letting others become the experts in -- and proponents of -- the relevance of their subjects. Consider, for example, that in January 2008, one of the most famous English professors in the world proclaimed on his New York Times blog that the study of literature is useless. Asserting that the humanities don’t do anything but give us pleasure, Stanley Fish wrote that “to the question of ‘what use are the humanities?’ the only honest answer is none whatsoever.” The arts and humanities, Fish contended, won’t get you a job, make you a well-rounded citizen, or ennoble you in any way.
Not surprisingly, readers were appalled. Within the next 48 hours, 484 comments were posted online, most of them critical of Fish. The majority of these comments, from a mix of scientists, humanists, business people, and artists, could be divided into two categories: first, the humanities are useful because they provide critical thinking skills that are useful for doing your job, whether you’re a doctor or CEO; and second, the humanities are useful for more than just your job, whether that means being a more informed citizen or simply a more interesting conversationalist.
However, perhaps the most fascinating comments came from those who recognized Fish’s stance as a professional one: in other words, one that relates to attitudes toward the humanities held by practitioners inside the academy (professors), as distinct from those held by general educated readers outside it (the Times audience). “Let’s not conflate some academics -- those who have professionalized their relationship with the humanities to the point of careerist cynicism -- with those [...] still capable of a genuine relationship to the humanities,” said one reader. Another added that the “humanities have been taken over by careerists, who speak and write only for each other.”
In other words, while readers defend the liberal arts’ relevance, scholars, who are busy writing specialized scholarship for one another, simply aren’t making the case. This was an interesting debate when Fish wrote his column over a year ago; now in 2009, we should consider it an urgent one.
Traditionally, economic downturns are accompanied by declines in the liberal arts, and with today’s unparalleled budget pressures, higher education institutions will need to scrutinize the purpose of everything they do as never before. Drucker’s academic novel provides an illustrative example of the liberal arts at work: as Fish’s readers would point out, literature can raise theoretical questions that help us understand very practical issues.
To be sure, the liberal arts are at least partly valuable because they are removed from practical utility as conceived in business; the return on investment from a novel can’t be directly tied to whether it improves the reader’s bottom line.
But justifiable concerns among scholars that the liberal arts will become only about utility have driven the academy too far in the opposite direction. Within higher education, we acknowledge that the writing skills gained in an English seminar might help alumni craft corporate memos, but it is outside higher education where the liveliest conversations about the liberal arts’ richer benefits -- empathic skills and narrative analysis, for example -- to the practical world seem to occur.
Drucker and his antecedents may be raising the right questions, but these discussions should be equally led by those professionally trained in the disciplines at hand. In today’s economic climate, it may become more important than ever for the liberal arts to mount a strong defense -- let’s not leave it entirely in the hands of others.
Melanie Ho is a higher education consultant in Washington. She has taught literature, writing and leadership development courses at the University of California at Los Angeles.
The idea of "Little Orphan Annie" as a historical document full of clues to contemporary American political culture is not, perhaps, self-evident. Many of us remember the comic strip, if at all, primarily as the inspiration for a long-running Broadway musical; the latter being a genre of which I, for one, have an irrational fear. (If there is a hell, it has a chorus line.)
Yet there is a case to make for Annie as an ancestor of Joe the Plumber, and not just because both are fictional characters. The two volumes, so far, of The Complete Little Orphan Annie (issued last year by IDW Publishing in its "Library of American Comics" series) come with introductory essays by Jeet Heer, a graduate student in history at York University, in Toronto, who finds in the cartoonist Harold Gray one of the overlooked founding fathers of the American conservative movement. Heer contends that the adventures of the scrappy waif reflect a strain of right-wing populism that rejected the New Deal. He is now at work on a dissertation called "Letters to Orphan Annie: The Emergence of Conservative Populism in American Popular Culture, 1924-1968."
Heer is the co-editor, with Kent Worcester, of Arguing Comics: Literary Masters on a Popular Medium (2004) and A Comics Studies Reader (2009), both published by the University Press of Mississippi. I recently interviewed him about his work by e-mail. A transcript of that exchange follows.
Q: You've co-edited a couple of anthologies of writings on the critical reception of comics and are now at work on a dissertation about one iconic strip, "Little Orphan Annie." Suppose a cultural mandarin like George Steiner challenged the whole notion of "comics studies" as manifesting a trivial interest in ephemeral entertainments on rotting newsprint. In the name of what values would you defend your work?
A: Since I think George Steiner is a fraudulent windbag, he’s perhaps a bad hypothetical example. But let’s talk about some genuine mandarins, rather than those who just put on airs. I came to comics studies partially as a lifelong reader of comics (after my family immigrated to Canada from India I learned to read English by deciphering Archie comics as if they were hieroglyphics) but also intellectually via high modernism. As a graduate student, I was fascinated by the mid-century Catholic intellectuals who did so much to inform our understanding of modernism (Marshall McLuhan, Walter Ong, Hugh Kenner). Erudite as all get out and working to reconcile Catholicism with modernity, these thinkers constantly emphasized that the great modernists (Joyce, Eliot, Pound) were deeply shaped by modern mass culture (Joyce kept a copy of the comic strip "Gasoline Alley" on his mantelshelf and stuffed Finnegans Wake with countless allusions to comics). McLuhan and company taught me that high and low culture don’t exist in hermetically sealed compartments but rather are part of an organic, mutually enriching conversation: Culture is not an exclusive club, it’s a rent party where anyone can join in and dance.
Aesthetically, I’d argue that the best comics (Herriman’s "Krazy Kat," Art Spiegelman’s "Maus," Lynda Barry’s "Ernie Pook Comeeks") are as good as anything being done in the fine arts or literature. Most comics aren’t as good as "Krazy Kat," of course, but the sheer popularity and longevity of ordinary comics like "Archie" or "Blondie" make them historically and sociologically interesting. "Little Orphan Annie" is a good example: though it is more than ordinary as a work of art, it is also historically fascinating, since it helped reshape conservatism in America, giving birth in the 1930s to a form of cultural populism that you can still see on Fox News. Read by millions (including politicians like Clare Boothe Luce, Jesse Helms, and Ronald Reagan), Orphan Annie has a political significance that makes it worth studying.
Finally, comics are very interesting on a theoretical level. Comics involve a fusion of words and pictures (this is true even of pantomime strips, where we “read” the images as well as look at them). Therefore, comics are inherently hybrid, existing at the crossroads between literature and the fine arts. As French theorist Thierry Groensteen has noted, the hybrid nature of comics makes them a scandal to the “ideology of purity” that has long dominated art theory (i.e., philosophers and critics ranging from G.E. Lessing to Clement Greenberg). The best writing on comics (a sampling of which can be found in A Comics Studies Reader) grapples with formal issues raised by hybridity: How can words and pictures interact in the same work? What’s the relationship between seeing and reading? Do visual artifacts have their own language? These are all very challenging questions, which makes comics studies an exciting field.
Q: In addition to two deluxe volumes of the complete "Little Orphan Annie" from the 1920s (with more on the way), you have put together a collection of the proto-surrealist strip "Krazy Kat." Would you describe the process of assembling this sort of edition? It seems like the work would be as much curatorial as editorial.
A: Until fairly recently, most cartoonists didn’t keep their original art, which meant that reprints of old comic strips and comic books had to be shot from the published work (often yellowing old newsprint). This meant that the art often looked like photocopies of 1970s vintage: smudgy and muddy, frequently off-register.
In the last decade, thanks to digital technology, it’s become much easier to clean up old art and restore it to how it originally looked (the cost of printing in color has also gone down). I’m lucky to work with a group of publishers that are willing to put in the hours necessary to do the restoration work. I’ll single out Pete Maresca, whose books Sundays with Walt and Skeezix and Little Nemo in Slumberland: So Many Splendid Sundays reprint old Sunday pages at the exact size they originally appeared, with the dimensions of a newspaper broadsheet. Pete is meticulous in trying to restore the colors to their original form (the strips were, among other things, a marvel of engraving craftsmanship brought to the United States by German immigrants). To do so, Pete often has to spend a week or more on each page, in effect taking longer on the restoration work than the cartoonist took in drawing the page.
This is not a project I'm directly involved with, but my publisher Fantagraphics recently published an amazing edition of Humbug (a sophisticated humor magazine from the 1950s edited by Mad magazine founder Harvey Kurtzman). Paul Baresh, who also works on the "Krazy Kat" books, did a remarkable job, equal perhaps to someone cleaning up the icons on a medieval church, in restoring the original art. He talks about the production process here.
Q: We'll get to your dissertation's focus on the ideological dimension of character and plot in "Little Orphan Annie" in a moment. But first let me ask about the artwork. As someone who has studied comics closely, do you see anything innovative or distinctive in its visual style? Also, what's the deal with Annie's eyes? It looks like she just got back from a rave....
A: The earliest newspaper cartoonists mostly came out of Victorian magazine illustration, which meant their art tended to be florid, dense with decoration and unnecessary details. Harold Gray, Annie’s creator, belonged to the second generation of comic strip artists, who understood that newspaper drawings didn’t need to be so elaborate; indeed, simpler art was more effective because it pushed the narrative along more quickly. Gray’s great gift was in character design. In her pilgrim's progress through the world Annie meets all sorts of people, ranging from decent, hard-working farmers to sinister, hoity-toity pseudo-aristocrats. Gray was able to distill the essence of each character so that you can tell, at a glance, that the farmers were care-worn and dowdy but decent, while Mrs. Bleating-Hart (the Eleanor Roosevelt stand-in) was pompous and exploitative.
The best description of Gray’s art I’ve ever seen was written by the 15-year-old fan John Updike, in a letter I found in Gray’s papers at Boston University. “The drawing is simple and clear, but extremely effective,” Updike wrote. “You could tell just by looking at the faces who is the trouble maker and who isn't, without any dialogue. The facial features, the big, blunt fingered hands, the way you handle light and shadows are all excellently done." Updike’s reference to “light and shadows” refers to Gray’s other skill, in creating mood and atmosphere. Annie lives in a dark, somber, gothic world, where evil blank eyes are always peering out of windows.
Annie’s blank eyeballs were a convention Gray inherited from his artistic mentor Sidney Smith, who did "The Gumps." But artistically, as Gray explained to a fan in 1959, the blank eyeballs served to enhance reader involvement with the strip: not seeing what is going on in the eyes of the characters, readers could project their own fears and concerns onto the narrative. Recent comics theorists, most famously Scott McCloud in his frequently cited book Understanding Comics, have argued that blank, empty characters (Charlie Brown, Mickey Mouse) are easier to identify with. Gray seems to have understood that instinctively.
Q: You argue that from its start in the mid-1920s the strip manifests a strain of conservative populism. The honest, hard-working, "just folks" Annie makes her indomitable way in a world full of elitists, social-climbing poseurs, and pointy-headed do-gooders. How did the strip respond to the economic catastrophe of 1929 and the New Deal that came in its wake?
A: While big business Republicans like Herbert Hoover were politically vanquished by the Great Depression, Harold Gray actually prospered during the 1930s, with Annie becoming the star of the most popular radio show of the decade. How can we explain this, given that Gray was as much an advocate of two-fisted capitalism as Hoover?
Whatever the merits of Hoover’s policies, the President was tone deaf in responding to the Depression because he adopted a harsh rhetoric that denied the reality of poverty. “Nobody is actually starving,” Hoover said as millions had to line up in soup kitchens. “The hoboes, for example, are better fed than they ever have been. One hobo in New York got ten meals in one day.”
Orphan Annie and “Daddy” Warbucks never voiced such complacently unfeeling indifference to poverty. Annie was poor even in the prosperous 1920s, often living as a hobo and begging for food when separated from her capitalist guardian Warbucks. In the 1920s Gray was a progressive Republican in the tradition of Theodore Roosevelt: He praised labor unions, public schools, and feminist reforms (Annie dreams of being President like her hero Lincoln), and mocked anti-socialist rhetoric. In reaction to the New Deal, Gray became much more of a partisan right winger, turning the template of his story (Annie and Warbucks battling against powerful and corrupt forces) into an explicitly conservative populist allegory.
In 1931, Daddy Warbucks loses his fortune to unscrupulous Wall Street speculators, is blinded, and lives for a time as a street beggar. But after hitting bottom he regains his fighting spirit and outwits the Wall Street sharks who brought him and America low. By 1932, the villains in the strip are increasingly identified with the political left: snide bohemian intellectuals who mock traditional values, upper-crust class traitors who give money to communists, officious bureaucrats who hamper big business, corrupt labor union leaders who sabotage industry, demagogic politicians who stir up class envy in order to win elections, and busybody social workers who won’t let a poor orphan girl work for a living because of their silly child labor laws. Gray started to identify liberalism with elitism, a potent bit of political framing that continues to shape political discourse in America today.
Q: What have you learned from going through the archives, including the cartoonist's fan mail? What does it tell you about how people responded to his politics? Were there people who supported FDR but still rooted for the plucky little orphan?
A: Reading Gray’s correspondence with his fans was what made me fall in love with this project. There was such a rich array of letters from such a wide spectrum of readers: some from little kids, some from famous or soon-to-be famous people (the aforementioned John Updike, but also Clare Boothe Luce and the journalist Pete Hamill), most from ordinary, run-of-the-mill but often eloquent adults. Politically the letters are all over the place; some readers loved the way Gray attacked liberals, but many readers (Updike and Hamill are good examples) were New Deal supporters. One such reader wrote that we love Annie because she’s just plain folks like the rest of us and that Gray should stop ruining her stories by attacking President Roosevelt, who is trying to help the real Annies.
What I’ve learned is that people don’t read comics in a passive way: Many Annie readers were bringing to the strip their own life experiences and worldviews. This really helps us understand the way comics can weave themselves into the everyday life of readers.
One example close to my heart: In 1942 Annie forms a group called the Junior Commandos to help the war effort by collecting scrap metal. One of the Junior Commandos is an African-American boy named George, who is shown to be intelligent and resourceful. Rare for the time, George was drawn in a realistic, non-stereotypical way. Gray received many letters from black readers, praising him for showing that their race was contributing to winning the war (although some black readers also felt George was a little bit too servile). Gray also received a letter from an editor of a newspaper in Mobile, Alabama, who was upset that a white girl was shown consorting with a black boy. In these letters, we can see how Annie provoked discussion about wartime racial politics.
Q: How did the strip respond to the Sixties? Did Warbucks support the Goldwater campaign? Was Annie menaced by hippies?
A: With the rise of “movement conservatism” and the Goldwater campaign, Gray responded to the times by making Daddy Warbucks and his allies even more militantly anti-communist than before (mind you, the strip had featured communist villains since the early 1930s). Dissatisfied with the mealy-mouthed diplomacy of the State Department, Daddy Warbucks and his private army fight a Castro-style Latin American dictator.
Throughout the 1960s beatniks and hippies are cast as villains. As one sympathetic character complains, “I wonder why we see all these peculiar people nowadays” like “th’ beatnik types, th’ ones with long hair, the ones with beads and funny clothes.” There is a fascinating sequence in 1967 showing anti-war protesters burning American flags, and then being attacked by a group of patriotic immigrants from around the world who love America. “We are loyal Americans defending our flag!” one immigrant proclaims. “What are you unclean vermin?” In some ways, this episode prefigures the hardhats versus hippies drama that Rick Perlstein describes in his great book Nixonland.
Like the conservative writers at National Review and the Republican Party itself, Gray also became more sympathetic to the South as a region, seeing it as a bastion of traditional values. Many of Annie’s adventures in the 1960s are set in the South, although the issue of civil rights is scrupulously avoided. This was a big shift for Gray, who in the 1920s started off as a Lincoln Republican (his middle name was even Lincoln), with Annie explicitly and implicitly criticizing racism. We can see the emergence of the “Southern strategy” in Annie.
Q: In the dissertation, you draw a parallel between Gray's populist sensibility and the work of Willmoore Kendall, the right-wing political philosopher. Actually that's where you hooked my attention -- very few people remember Kendall, let alone write about him (though Garry Wills portrays him in Confessions of a Conservative and Kendall is the inspiration for the title character of Saul Bellow's story "Mosby's Memoirs"). Since you aren't arguing that the thinker influenced the cartoonist or vice versa, how do you account for the affinity between Kendall's take on John Locke and Little Orphan Annie?
A: Kendall is a fascinating figure, deserving much more attention than he’s received (although John Judis in his biography of William F. Buckley does a good job of describing Kendall’s pivotal role as mentor to the founder of National Review). Prior to Kendall’s pathbreaking work (he flourished as a thinker from the 1940s until his death in 1968), conservative intellectuals were almost always openly elitist and anti-democratic: think of T.S. Eliot’s royalism, Albert Jay Nock’s pinning his hope on the “saving remnant,” H.L. Mencken’s Nietzsche-inspired scorn for the booboisie, F.A. Hayek’s belief that the courts should be used to curb the rise of the welfare state.
Kendall broke with this tradition of celebrating hierarchy and fearing the masses. He firmly believed that the vast majority of the American people were “conservative in their hips” and that American political institutions were designed not to thwart the will of the majority but to articulate the deeply held conservative principles of the masses. As a conservative who was closer on a theoretical level to Jean-Jacques Rousseau than to Edmund Burke, Kendall recast the language of American populism in an anti-liberal direction. To his mind, liberals were “the establishment” which needed to be overthrown. As the historian George Nash noted, Kendall was “a populist and a conservative. The contrast with much aristocratic, even explicitly antipopulist, conservatism in the postwar period was striking.”
The reason Kendall’s in the thesis is that the overwhelming majority of the literature on mid-century American conservatism deals with elite political and intellectual figures like Buckley, James Burnham, Whittaker Chambers, Richard Nixon, Barry Goldwater, etc. Historians haven’t done as good a job of locating the origins of conservative ideas in the broader culture -- in movies, songs, and comic strips. In drawing parallels between Kendall’s worldview and the ideas that were earlier articulated in Annie, I’m trying to show that high and low culture don’t exist in isolation, but are part of a larger conversation, with common ideas and images percolating up and down the line. Consider the phrase “egghead,” which Kendall often used when insulting liberal professors (a rather cheeky term, since he himself was a Yale man).
Long before the phrase “egghead” was coined, cartoonists like Gray drew oval-faced professors who lacked common sense and sneered at practical-minded businessmen. I don’t know whether Kendall read Annie or not (although some of his colleagues at National Review clearly did, since they wrote about her in their magazine). But it seems to me incontestable that Kendall and Gray shared an overlapping worldview and can usefully be compared. The fact that Gray’s conservative populism preceded Kendall’s work by a decade also raises interesting questions as to whether elite intellectuals are always at the vanguard of ideological change.
Q: When cultural studies began implanting itself in American academic life about 20 years ago, there was a strong tendency to discover and celebrate the "subversive" and "emancipatory" aspects of popular culture. There was some blend of wishful thinking and willful ignorance about this, at times -- along with a narrow present-mindedness that tended to ignore popular culture from earlier decades, or to look only at things that seemed "counter-hegemonic" in comforting ways. Do you see your work as an explicit challenge to that sort of cultural studies, with its ahistorical perspective and cookie-cutter hermeneutics? Or do you understand what you are doing as part of the "cultural turn" within the historical profession itself?
A: I completely agree with your characterization of early American cultural studies, especially in the form it took in the 1980s and 1990s. The whole “Madonna is subversive” schtick exhausted whatever limited value it had very quickly. So yes, my work tries to challenge the limits of this mode of thinking by being more historical, more grounded in archival research, and more attentive to the divergent political voices found in popular culture. One of the great things about working in archives is that the very diversity of voices you find in the past (as in the letters to Orphan Annie I’m working with) forces you to rethink any “ahistorical perspective” or “cookie-cutter hermeneutics” you may have started off with.
Having said that, I wouldn’t be able to do the work I do without the opening created by cultural studies. One of the points Kent Worcester and I make in our two anthologies is that there was a wide variety of very interesting writers (ranging from Gershon Legman to Thomas Mann) who wrote on comics in the past but it was only with the advent of cultural studies that comics were able to find a secure home in the academy, with an infrastructure of journals, conferences, and library support. Cultural studies has greatly expanded the academic opportunities for anyone interested in popular culture.
My own discipline of history has been transformed by cultural studies. As you properly note, there has been a “cultural turn” in history. To my way of thinking, this “cultural turn” can be traced back to the original British New Left of the 1950s, and especially the writings of E.P. Thompson. My own work might seem far afield from Thompson’s epic work on the British working class, the moral economy of food riots, and the politics of Romantic poetry. Still, for all that his work has been criticized and challenged in the last few decades, it remains for me the best example of how to do cultural history. Thompson had a great ear: he could pick up nuances from the past that other historians were simply too tone-deaf to hear. The voices of ordinary people, in all their tangled complexity, came through in Thompson’s work. As more historians grapple with culture, Thompson remains the model to follow. I doubt if my work has anywhere near the value of Thompson’s, but as a close friend always tells me, you have to aim high.
Over dinner-party talk about my work directing a university writing center, a friend remarked that while reading a British edition of Harry Potter to her kids, she came across passages where “revise” meant “study,” as in Harry and Ron revised all night for the potions exam.
She asked whether I was familiar with that usage. I wasn't. But I had, in recent weeks, been mulling over the ways that faculty across the disciplines define revision.
My university, like many others that have formal writing-intensive (W) requirements, demands that W course instructors assign a minimum number of pages (in our case 15) and build in a deliberate process for revision. In a climate of budget cutting, when everything is under scrutiny, some faculty have been asking: Should we keep the current requirement of two Ws? Can we afford to keep them capped at 19 when enrollment caps for so many other courses are rising? Are labor-intensive W courses really the best use of faculty time? Are the W courses working?
The debate has revealed a range of unstated assumptions about what revision is and how it should be taught. Two of the more vexing assumptions -- held by a few in favor of the W requirement and a few critical of it -- strike me as especially persistent: that revision is about correcting student deficiencies and that requiring revision breeds dependency.
Conflating revision with correction is quite natural: Students submit (usually flawed) drafts; faculty prescribe how to fix them; and students fix the flaws. Such a process, as anyone who has worked with a skilled editor knows, may not always be fun but it leads to a better final product.
The problem is that the ultimate aims of editing and teaching are different: editors want better writing; teachers may want that too, but they ultimately want better writers.
Certainly students can learn a great deal by following the lead of a good editor, but when teachers slip into editor mode, students in turn focus on delivering what the teacher/editor wants more than on either learning or inquiry. Consider the extreme version (but I've seen it happen): a student submits a draft electronically; a dedicated teacher makes extensive, time-consuming edits in Track Changes; and the student scans the first few edits and then hits the "Accept All" button. Revision done.
The lesson here is not that we need to force students to march through a correction process more deliberately. It is that we need to craft our responses to drafts in ways that encourage students to take responsibility for their own texts.
In practical terms this may mean following some of the pedagogical recommendations of writing across the curriculum experts: when students submit drafts, require them to include cover letters that articulate their own revision plans; attend to macro-level matters such as purpose and argument early in the writing process and sentence-level errors later; rather than copyedit start to finish, line edit only a small portion of a draft, noting patterns of error and leaving the rest of the editing for the writer; balance critique for what isn't working with praise for what is; invite writers to focus on just two or three manageable priorities for revision; and so on.
This does not mean that we should shy from pointing out flaws; nor does it mean that we should avoid giving direct advice on matters large and small. But it does mean that we should guard against complicity in an "I'll tell you what's wrong and you fix it" transaction.
The ubiquity of the fix-it orientation may help explain one finding from a recent assessment of student writing done at our university. By collecting course syllabi and student final papers from W courses in four departments, we discovered much good news: that faculty are assigning long, research-driven papers on challenging topics and are requiring drafts; and that over 90 percent of the papers met at least minimal proficiency for undergraduate writing as judged by faculty and graduate students who scored anonymous papers from their home departments.
However, we also discovered that instructor grades were, on average, more than a full letter grade higher than the quality scores given to those same papers. Grade inflation, the stripping of context, and a number of other factors may explain that disparity, but part of the reason for high grades may also be that well-intentioned students and teachers are tacitly locked in correction-mode revision: students draft, teachers point out what to fix and how to fix it, and students correct. Grades ratchet up with each draft as the system rewards compliance above all.
Is the alternative, then, to have students work more independently? So think many who believe that requiring revision breeds dependency and that the job of faculty is to wean students from that dependency.
The best students, the logic goes, need little or no help drafting and revising, so requiring them to do so is at best unnecessary and at worst infantilizing. The worst students tend to submit dashed-off drafts, trusting that faculty will essentially write the paper for them, which turns revision into an empty exercise. This leaves a small slice of motivated but flawed writers who will really benefit from teacher-assisted revision, and they can always come to office hours.
We all want students to be independent learners and to take responsibility for their own education, but that does not mean that the best writers draft, revise and edit on their own. That may work for some but it does not characterize the process of most successful writers, academic or otherwise. We know that writing demands stretches of solitary work but we also know that writers who are willing to share their work early and often typically do better than those who muscle it out entirely on their own.
The aspiration we should have for student writers is not independence as much as interdependence.
Teaching revision as encouraging interdependence does not mean withholding critique or going soft on students. But it does mean that, rather than merely delivering prescriptions or justifying grades, teacher comments on drafts should challenge writers with options and spark further conversation. Only then can we leverage what sets extended writing assignments apart from other modes of assessment, such as exams: that by working across drafts and with others, writers can, within the bounds of academic expectations, walk their own paths through the material, making their own connections and claims along the way. If what we really want is coverage and correction, better that we stick to exams.
A policy that requires revision should be justified not on the grounds that students need remediation but on the reality that scholarly writing emerges from a condition of interdependence, a process that typically includes the guidance of mentors and sharing of drafts as well as peer review and directive editing. Apprentice scholars deserve some approximation of that experience.
Tom Deans is associate professor of English and director of the University Writing Center at the University of Connecticut.
One favorable outcome of the current economic crisis might be that literary studies finally puts poverty near the top of the agenda and the center of the field. A few years ago, Hurricane Katrina reminded the nation about Americans living in poverty, and it seemed then, to some of us in literary studies who write about poverty, a possible turning point in critical priorities. But it was not to be. Though important work from literary critics on the subject of the poor has come out since then, especially Gavin Jones’s American Hungers (2007) and Walter Benn Michaels’s The Trouble with Diversity (2007), it is perhaps not surprising that the suffering of the poor, even when it temporarily comes to light spectacularly, is not enough to prompt such a major change of direction in the professional discourse.
The engine of “cultural studies” has incredible momentum, and there is a concomitant tendency for the cultural or identity issues of race, gender, sexuality, and even class to subordinate that of poverty. To take just one example, a 2007, post-Katrina book, published by a major university press and called Slumming in New York, can nearly wipe away the poverty problem in New York in the 19th century in a single sentence unsupported by historical evidence: the author writes, “Unlike many European cities, New York in the 19th century promoted itself as a city free from rigid economic class distinctions. In some ways I believe this was true and that the more destructive conflict was fought over cultural legitimacy and representation.”
One of the reasons that the important issues of race, gender, and sexuality have had such traction in English departments is of course that English departments have numerous professors who have suffered and indeed continue to experience racial, gender, and sexual-preference discrimination or prejudice, and so the profession’s investment in the issues is not merely academic. Meanwhile, English departments have had very few tenured professors who have come out of poverty and, by definition one might have supposed, none who were still living in poverty — literary-critical poverty studies has had almost no “insider” advocates. Or at least such was the case until the current economic crisis. While tenure-track professors may not be experiencing true poverty, many are facing furloughs and pay and benefit cuts that will indeed have a real impact on their standards of living. And adjuncts – some of whom are indeed living in poverty – are losing positions all over the country.
Acute socioeconomic suffering has now hit home, or threatens to, among university faculty: not only English department instructors and adjuncts but even some tenure-track and tenured professors are facing or anticipating economic difficulties that make the poverty issue less academic, less other.
This is a terrible moment for many people, and it has reminded many of us, in the most painful way, that socioeconomic suffering is not merely the others’ problem. Let’s take this crisis as an opportunity to put poverty on the front burner in our profession, along with race and gender.
Such a change may be especially possible now, given also, during the same moment, the election of the nation’s first African-American president. At this historical conjuncture, it is apparent that the nation has made progress on the problem of racial discrimination that it has not made on that of socioeconomic privation. And yet, of course, these problems are related, and blacks still suffer from poverty out of proportion to their numbers in the population.
We English professors might take a hint from other disciplines. The day after the election, sociology professor Orlando Patterson of Harvard University discussed on television the public triumph of the first African-American president-elect and the continuing private or social isolation of poor African Americans (he talked, for example, about de facto school segregation, more intense than the segregation that existed in the 1970s, and disproportionately high rates of incarceration for blacks).
We might also take a cue from the writers we study and teach. As Gavin Jones has reminded us, many of our great American writers, black and white, women and men, have been concerned with poverty in the United States, including Herman Melville, Edith Wharton, Theodore Dreiser, Stephen Crane, Richard Wright, and James Agee. And, I would add, some of the great American writers who wrote about the poor have in addition come out of poverty, such as Jacob Riis, Zora Neale Hurston, Richard Wright, and Claude Brown.
English departments have done tremendous social good by methodically studying issues of race, gender, and sexuality, good that has gone even beyond raising consciousness and changing attitudes among students; they have also made it a priority to hire minorities, women, and gays and lesbians. Can we English professors make similar contributions to addressing the ongoing poverty problem? Can we take a leading role in promoting poverty studies and affirmative action for the economically disadvantaged? Poverty is a problem, of course, that won’t go away when this economic crisis has passed, but this crisis might leave the literary profession more connected to it.
The novelist and critic Isaac Rosenfeld died of a heart attack in 1956. He and Saul Bellow had been friends since childhood, and they had arrived on the literary and intellectual scene of the early 1940s as a team, “the Chicago Dostoevskians,” with Rosenfeld’s reviews and short stories making the larger initial impression on anyone paying attention to the world of little magazines.
In later years any lingering affection between them was cut with traces of bitterness, for Bellow went from triumph to triumph, while Rosenfeld drifted, turning out the occasional brilliant short piece while unpublished manuscripts piled up. I do not think the word “adjunct” was in wide use at the time, but Rosenfeld got by with dead-end positions at the University of Minnesota and the University of Chicago.
By the end of Rosenfeld’s life, Bellow wrote, he was living in “a hideous cellar room” from which any hint of bohemian glamor had long since fled. He had, in Bellow’s words, “one of those ready, lively, clear minds that see the relevant thing immediately.” But Rosenfeld’s cutting lucidity left him filled with scorn for any motive involving the pursuit of success, let alone propriety. His friend, Bellow wrote, “seemed occasionally to be trying to achieve by will, by fiat, the openness of heart and devotion to truth without which a human existence must be utterly senseless.”
He imposed a grim discipline on himself, a kind of squalid asceticism. To the naked eye it looked like failure. When he died in a shabby apartment, Rosenfeld was 38 years old.
As it happens, I was exactly half that age when, in the early 1980s, I discovered An Age of Enormity, the posthumous collection of his reviews and essays that, while long out of print, remains the single best introduction to his work and to his legend. To a teenager, of course, exiting the world in your late thirties hardly seems to qualify as a youthful death. But his example loomed in my imagination for many years as essentially heroic. Rosenfeld’s intransigence, his disdain for the gods of success, was somehow inspiring, albeit in ways that have not done me very much good over the long term. (In general it may be unwise for young people to accept career advice from the dead.)
The first book devoted to his life and work is Rosenfeld’s Lives: Fame, Oblivion, and the Furies of Writing (Yale University Press) by Steven J. Zipperstein, a professor of Jewish history and culture at Stanford University. My review of the biography appeared elsewhere; here it bears adding something: Zipperstein makes very clear the resentments and ressentiment of Rosenfeld’s final years, so that even a callow adolescent could not fail to understand their misery.
Point taken. Yet the legend of Rosenfeld, his aura as beautiful loser, still exercises a certain power over my imagination even after more than a quarter of a century.
It is no doubt understandable and fitting that Zipperstein should treat Rosenfeld’s life as a chapter in the story of Jewish-American ambivalence about assimilation. That does not seem to be the basis of any elective affinity in my case, however, unless growing up in a small Southern Baptist town is comparable to life in a shtetl, which does seem like stretching it.
The grounds for this continuing fascination became clearer after looking, once again, at the title essay in George Scialabba’s new collection What Are Intellectuals Good For?
Scialabba does not simply repeat standard complaints about the decline of free-range public intellectuals and the rise of transgressive professorial jargonization. (That is a familiar story, even perhaps too familiar.) Scialabba points, rather, to the role played by a “new variety or mutation” of thinker in the “modern, efficient machinery of persuasion” necessary to hold highly developed societies together. Scialabba calls this type “the anti-public intellectual, whose function is not criticism, not defense of the public against private or state power, but the opposite.… As a result of the intellectuals’ incorporation en masse into the ‘power elite,’ it now requires far more training, leisure, and resources to penetrate the screen of corporate or government propaganda….”
And so the critic must redouble his efforts at challenging the arts of public manipulation, however Sisyphean those efforts may be. The boulder will crash through the screen every so often, with enough luck and a good aim.
Such is the proper role of the intellectual, as Scialabba reminds us; and most of the time I would not argue otherwise. But it does not exhaust the options.
A different response can be found in a talk that Isaac Rosenfeld gave to the staff of The Chicago Review in the spring of 1956, a few months before his death. The talk was recorded, and a transcript appeared in the Review the following year under the title “On the Role of the Writer and the Little Magazine.” It was not reprinted in his collected essays, or anyplace else, it seems, which is strange because it counts as Rosenfeld’s final testament.
“I am used to thinking,” he told his listeners, “because of my upbringing, of the writer standing at one extreme from society … over against the commercial culture, the business enterprise, the whole fantastic make-believe world which some people would like for us to believe is the real world. Of course it can’t be that for the writer.”
This condition of extremity had once been made tolerable by the solidarity of peers who were in the same condition. There was an avant garde – a culture apart, marginal but tough, its spirits fortified and even lifted by the experience of rejection. It was “a small but vigorous and very vital, active and conscious group,” said Rosenfeld, “which knew fairly well the sort of thing it stood for even if it had no specific program and whether or not it had any political allegiance.”
But now the vanguard was dead, or at least domesticated, and the writer was constantly solicited to assume a role in “the symbol manipulation industries,” whether in academe or government or the media. You had to go along to get along. After all, even bohemia was expensive. As psychic defense and compensation, there emerged a spirit of aloofness -- not just about your job, but towards life itself.
It led to “embarrassment with human subject matter,” said Rosenfeld, the desperate cultivation of a “flair for the abstract… for the ‘cool.’ ” This sensibility tolerated expression of “nothing too immediate, too direct or emotional, because that would be considered ‘square’ or ‘frantic.’ ”
(People now assume that prepackaged irony was invented sometime within, at most, the past couple of decades. Not so. The marketing has just improved.)
For Rosenfeld, this amounted to creative death, disguised as a lifestyle. No serious writer could indulge it. He had been a political radical in younger days -- and more recently a psychoanalytic radical, following Wilhelm Reich’s call for sexual revolution. But this, his final profession of faith, did not call on writers to play the role of intellectual activist, challenging the Empire’s mandarins on their own ground. The role of the writer, he said, was not to play a role at all. “Playing a role” was exactly the problem.
Writers had to earn an income, and working in academe was one option. (About teaching, the transcript shows that Rosenfeld twice said, simply, “It’s a living.”) But for any serious writer there was no escape from the need, if necessary, to go it alone -- to trust one’s sense of the important, even if no committee welcomes the effort. Intuition had to be developed, not ambition.
The writer "will have to play,” said Rosenfeld, “the role that is not a role; to be the living man, the one left alone at three o’clock in the morning, when it’s always the dark night of the soul; to be the man whom one encounters when there is no longer any uniform to wear… to be the man who is naked, who is alone, and the man who pretty much of the time is afraid: the man who sees himself as he really is in this flesh and in these bones and in these feelings, in these impulses, in these emotions; the man who confronts himself in his dreams and his reveries; the man who sees himself walking across the street, thinking there but for the grace of God go I, or in his envy: there but for God’s disdain of me I could have gone…. He has to see the light and the truth that can be seen even in our phony and artificial age.”
Saul Bellow recalled that his friend, who did graduate work in philosophy at New York University in the early 1940s, had abandoned that path when he discovered Herman Melville. After reading Moby Dick, logical positivism seemed too blinkered a take on the world. Rosenfeld turned into an intellectual equivalent of Bartleby the Scrivener, saying, “I would prefer not to,” over and over, as the years slipped past.
This was not a good way to live. But then what is? The question is not rhetorical but real -- the kind that is waiting at three o’clock in the morning.
My fantasy is that I pick up a novel or story in Russian and I don’t realize I’m reading Russian. I smile, full of the story, excited and exhilarated as I turn the pages, and it’s only when I set the book down that I notice it’s not in English.
Another fantasy is that after walking over from the college where I teach in Brooklyn, I’m waiting for the train at the subway stop in Brighton Beach, and two Russians are sitting on the bench discussing Dostoyevsky, and I, ignored by them as I sit down, throw in a comment in Russian, and they, in disbelief, as if a kid has walked onto a baseball diamond and lined a fastball from Roger Clemens off the fence, throw me questions at the same time, and I respond in perfect colloquial Russian. We continue discussing Russian literature.
In real life, I have been studying Russian on my own, every day, for the past year and a half. What works for me is reading stories and scenes I already know very well in English. My literary divinity Tolstoy said the best way to learn a language was to pick up your favorite book and start reading it in that other language. (He used the Bible; I used my bible, Anna Karenina.)
In St. Petersburg last winter I bought a Russian-language CD of Chekhov’s stories and I listened to an actor read “Dama s sobachkoi” (Lady with Little-Dog) about 20 times. Sometimes I looked at the text as he read. He elided words and sounds that I never would have guessed could be elided. His pronunciations and emphases were little like the ones I managed as I read it aloud to myself.
I read the opening chapter of Anna Karenina and I divined many words. But there they were, in Russian! How delightful! It’s the difference between seeing a painting in a book and seeing its original hanging on a wall. Well, there it is! You can’t get any closer than that!
I studied every day, wandering, doing what I felt like doing. When I didn’t want to read, I listened, and I told myself I needed to listen. I listened to vocabulary tapes, grammar tapes, spoken-word recordings of Chekhov and Pushkin. I listened to Lev Tolstoy himself on a Web site. Bozhe moi! There he is!
I did anything that seemed easy. I avoided the grammar, trusting myself to pick it up as I needed.
Then one late night, when I was visiting St. Petersburg and I couldn’t sleep, and I retreated to the hotel lobby to read a biography of Pushkin lest I wake up my roommate, a friendly woman sat down on an adjoining couch and volleyed my bad Russian with her bad English, and we worked out a classroom-like conversation about the weather, education, sports, music. After I declined her invitation to a massage to help me relax, and we said our do svidanyas, she advised me, kindly: “Nuzhno pravila.” (“Grammar is needed.”)
Yes, it is.
And that’s what my Russian-speaking friends tell me. Rules are necessary.
But then I wouldn’t flow with the rhythm of my interests and desires. I resist. I take the easiest route. I take the road that beckons me. And yet, having it all my own way, avoiding the dictionary (I sometimes go days without checking a dictionary, telling myself that, well, I’ll just look for the words I know or can figure out from context), avoiding any method, I find myself in the same boat as many of my students.
I remember last fall my student Irina, who came to the United States two years before and who’s my age, saying, “My English … shame!”
“I feel ashamed of my English.”
“Shame of my English.”
“Yes, a-shamed. Ashamed.”
How ashamed I was thinking of my Russian!
But how happy I am with my students who plunge ahead, never faint-hearted, making lots of mistakes. How well some of them write in spite of the incorrect grammar, in spite of the limited vocabulary — how fresh some of their descriptions. They have to describe what they see without any pre-mixed colors and scarcely any canned language. How I admire them, how much I admire, for example, Lingtong! She came here at 16 from the south of China, without any English, and she threw herself into learning the language from her teachers, from her books, from experience on the job. How well she speaks, how hard she continues to work at picking up refinements in idiom (her grammatical mistakes are those of native New Yorkers).
And yet preying on me so much of the time — I feel it and it shows up in my journal entries about my Russian — is the shame of not knowing anything. Besides it not being very becoming of me, besides it contradicting my feeling about my own ESL students (that they have nothing to be ashamed of, that they are climbing a mountain, that they are doing something extremely difficult), I continue to complain about and feel ashamed of my lack of knowledge of Russian. On the other hand, I really am proud to have learned so much on my own. I am proud of figuring things out about the grammar simply from reading Anna Karenina and “Lady with Little-Dog” and knowing that this belongs to that, and he (the character) would not say that, so maybe it’s this, and how this must be an object and this an adjective.
Of course through my self-teaching I’m understanding better the agony of some of my students, how Irina would turn to her compatriot Sofiya with a look of panic on her face, and how she and some of my Chinese-born students watch my mouth for clues — sometimes, a moment later, repeating or mouthing my phrasing; wincing, lost, some of them eager to be asked the very question they know how to answer, but no other question! The complaints about synonyms! Why? Why are there two words for this? Well, I explain, there are three. My Chinese-born students complaining about my correction of words they looked up! “Is right! — Why not right, Professor?”
“It’s right, but there are other words that are better — that mean just what you mean, but don’t mean the other things anybody would think of before that. It’s ambiguous.”
Russian students know that word.
The hopelessness of learning a new language.
I realize that it’s good my students hear their writing out loud. I like my short assignments where they write an anecdote or a poem and I collect them and read them all aloud. Under pressure of time, yet free of the pressure that it has to be an essay or good or finished, they write with the words they have. They work with the tenses they have.
As for reading, it is so hard! And of course it’s good that they read a conversational voice. Langston Hughes’s "Simple" stories, for instance, have voice in the narration and lots of dialogue. My students get the humor. I wonder what humor I could possibly understand in Russian. I have read with feeling the passages where both Annas (in both Anna Karenina and “Lady with Little-Dog”) break down in tears. I have been refortified by remembering the significance that Irina attached to her breaking down in tears while reading in her education course Torey Hayden’s One Child. So I know that my ESL students are way ahead of me, but that they were all where I am now. That makes me hopeful that I will eventually reach their fluency.
But I will never, unless I change my personality, have Russian the way Lingtong has English.
I imagine myself visiting Yasnaya Polyana, Tolstoy’s estate, next summer, and I will be or feel humiliated. But look how far I’ve come! Look how far! That will be running through my head in my humiliation. We are not humiliated by what we’ve fallen to, but by what we are striving to attain.
Bob Blaisdell is a professor of English at City University of New York’s Kingsborough Community College.
Apart from his preoccupation with race, class, and gender -- not to mention his interest in both cross-dressing and cannibalism -- the worst thing about William Shakespeare is, of course, his language. He coined the expression "the beast with two backs." Hamlet refers to the "country matters" that "lie between maids' legs." Characters in another play make penis jokes about how a certain word should be understood as a noun in "the focative case" -- thereby sneaking in a pun on a word that the Federal Communications Commission fines you for using.
I can't believe they teach this trash in schools. It's time for Fox News to do an exposé.
And while they're at it, perhaps it is time to investigate another scandal: Oxford University Press (no less!) has just issued the new edition of The F Word, by Jesse Sheidlower, an editor at large for the Oxford English Dictionary. Random House published the first version of his study in 1995. But the word itself has only grown in its range of nuances in the meantime. It is often heard in punk rock and gangster rap, and has in recent years enlivened the discourse of the executive branch of the United States government.
The latest edition adds more than 100 variations on the word to its lexicon, draws on a variety of digital databases, and incorporates examples of usage from New Zealand, South Africa, and elsewhere. Terms once identified as belonging to one part of speech are analyzed in their full range of usages; "fugly," for example, is now treated as both noun and adjective. The nuances of words are now more finely parsed. While previous editions defined "fuckfaced" as meaning "ugly," it can also mean "tired" or "drunk." The military and civilian usages of "clusterfuck," whether as noun or verb, are cataloged.
Sheidlower's introduction undertakes a swift and no-nonsense debunking of some common myths about the word. It is not the acronym of "for unlawful carnal knowledge" (let alone the preposterously stilted "fornication under consent of the King"). More surprising to learn is that it isn't really an Anglo-Saxon word either, as it's usually called. The first known appearance in English is around 1475; its ancestry appears to be Germanic.
This is vulgarity at its most erudite, and vice versa. Although Sheidlower indicates he chose some illustrative quotations because he found them humorous, The F Word itself is a sober piece of scholarship. I asked the lexicographer a few questions about his project by e-mail; a transcript of the interview follows.
Q: This is the third edition of your book, and by far the most extensive. How did you come to make studying the word your life's work?
A: I think that all words are interesting, but especially slang terms, because slang is an area that had been ignored or treated with active hostility by academics for quite some time. Thus, there's still a lot of work to do on slang.
My specific interest in this word came about mostly by chance. I had been working on the Historical Dictionary of American Slang at Random House, and suggested in an editorial meeting that we publish the fuck material separately, for ease of access to what would be one of the most-looked-up words in the book, and this suggestion was taken up with an enthusiasm that surprised me. And that's how it all started.
Q: The word appears in an Italian-to-English dictionary in 1598 and returns in a guide to English etymology (written in Latin) in 1671. It pops up in other reference works over the following century -- then, after 1775, disappears from general dictionaries entirely for 170 years. How do you understand this deliberate lexicographic blind spot? Was it something that applied to most "swear words" or "vulgarities"? Or was it singled out for repression?
A: No, it was words of this kind in general. The same thing that made the Victorian era so (publicly, if not actually) repressive affected the view of the language as well.
In the Introduction I quote from a legal decision in the 1840s where the judge specifically notes that despite being absent from dictionaries, the word fuck was in common use, so we shouldn't use lexicographers' modesty as a guideline. In the 1890s, a printer refused to publish a volume of a (privately printed) slang dictionary because of its obscene content, and when the dictionary's author took the printer to court for breach of contract, the printer won the moment the jury saw what it was he didn't want to print. A few examples like that are all we need to see to learn about the kind of pressures that existed at the time.
Q: You document a wide range of uses of the word -- including scores of idioms, numerous cognates, a lot of abbreviations (the most famous being SNAFU), and several ways to write it down without quite violating the prohibition, such as "fug" or "f****" or even "XXXX." The variety is astounding. At the same time, the connotation of any given use tends to be hostile or aggressive, as often as it is sexual. How deep is that association? Did the word start out with that overtone, or did it acquire its hostile edge at some point along the way?
A: As far as we can tell, it's relatively recent. For the first several centuries, sexual uses were the only thing we had. Uses such as 'to harm; victimize' and 'to cheat or trick' aren't found until the late eighteenth and late nineteenth centuries respectively, and these are very rare until the twentieth. With that said, the association of hostility or aggression with sex is not a new development.
Q: This is the first reference book I've ever seen to cite Usenet as documentation. Would you say a bit about the value -- and the pitfalls -- of using digital resources for this project?
A: People often think that having access to big databases makes it easier to do research. Quite the contrary -- it makes the result better, but it very often makes it much harder. Instead of getting a moderate amount of evidence that you are able to handle, you get a vast amount of evidence that you have to struggle to process. And if you ignore it, someone else won't.
So you do end up with a much more thorough and comprehensive project, but at the cost of enormous time. There were several simple, one-sense entries that I started to work on thinking that I'd be done in ten minutes, and ended up hours later with a greatly expanded multi-sense entry.
It is, of course, great that all of this is available. And it's a great democratizer -- everyone at a university will have access to the same range of electronic resources, and that's wonderful. But it makes your job as a scholar more difficult when you know that anyone can find something that you missed. That's true for the Internet as a whole, not just in relation to language research.
Q: Last month, a guest on "Saturday Night Live" used the word by accident; she meant to say "freaking," it seems, but the uneuphemized version came out. Around the same time, the anchorman for a New York television station used the curious expression "keep fucking that chicken" while on the air. These incidents would have been a big deal, once upon a time. Now they barely register on public awareness. Do you think the word will ever be just ... a word?
A: I think it's unlikely that fuck will lose all of its power at any point in the foreseeable future. After all, even relatively mild expressions ("darn!" or "bastard," say) still maintain a certain amount of colloquial force. And because fuck is still viewed as the most extreme general word there is, its use on TV will continue to be surprising. So while I do think that the progression we've seen in the last 40 or so years in particular will keep going -- i.e. that it will become ever more acceptable -- it will be a very long time, if ever, before it's just a word.