It's Time to End 'Courseocentrism'

It’s often said that one of the great failings of American higher education is that teaching fails to get the respect it deserves. It seems to me, however, that, especially in the humanities, the current academic generation is significantly more dedicated to teaching than most of us were when I started out in this profession in the early sixties. The real problem, as I see it, is that the way we think about teaching needs to change.

At a time when amazing new forms of connectivity are made possible by new digital technologies and when much of the best recent work in the humanities has made us more aware of the social and collective nature of intellectual work, we still think of teaching in ways that are narrowly private and individualistic, as something we do in isolated classrooms with little or no knowledge of what our colleagues are doing in the next classroom or the next building and little chance for each other’s courses to become reference points in our own. Indeed, we betray our assumption that teaching is by nature a solo act in our unreflecting use of “the classroom” as a synecdoche or shorthand for all teaching and learning, as if “the way we teach now” were reducible to “the way I teach now.”

The isolated, privatized classroom is itself a product of a more affluent era for American universities, a luxury made possible by the generous economic support they enjoyed during the first two-thirds of the 20th century. In this heady economic climate, a university could grow by expanding its playing field, proliferating new courses, fields, subfields, and scholarly perspectives while giving each enough separate space to ward off unproductive turf wars. To make a long story short, we became terrific at adding exciting new theories, fields, texts, cultures, and courses to the mix, but we’ve been challenged, to say the least, when it comes to connecting what we’ve added. Interdisciplinary programs have helped make some connections, but ultimately they have reproduced fragmentation rather than lessened it, since interdisciplinary programs tend to be disconnected from each other as well as from the disciplines. And now that we don’t have the financial luxury to keep adding on -- as is seen in our alarming overdependence on underpaid and overworked adjuncts -- we need to get a lot better at putting the components into dialogue, which means getting on the same page in our teaching in ways we lack practice at and may find uncomfortable.

Exhorting you to “get on the same page” may sound strange coming from me, since, if you know nothing else about me, you probably know that I’ve been arguing for years that we should “teach the conflicts,” putting our controversies at the center of our courses and programs, and I’ve often complained that we hide our disagreements from our students or reveal them only in fleeting glimpses. But I want to argue that, as much as we do conceal our disagreements from students, we also conceal our agreements from them as well as from ourselves. And teaching in non-communicating black boxes helps prevent us from discovering and taking advantage of the fact that in fundamental ways, as I will argue in a moment, we already are on the same page.

I believe that our experience of teaching in hermetically sealed classrooms makes us -- to coin a word -- “courseocentric.” Courseocentrism -- like its ethno-, ego-, and Euro- counterparts -- is a kind of tunnel vision in which our little part of the world becomes the whole. We get so used to the restricted confines of our own courses that we become oblivious to the fact -- or simply uninterested in it -- that students are enrolled in other courses whose teachers at any moment may be undercutting our most cherished beliefs. As my retired colleague Larry Poston recently observed, there is something remarkable about the “almost entire lack of interest we manifest as a profession in what is going on in our colleagues' classes.”

To get on the same page, of course, we would need to know something about each other’s teaching and the ways of thinking behind it, and such knowledge might lead to embarrassing disagreements. So perhaps the less we know about each other the better. It’s not surprising, then, that instead of asking us to try to get on the same page in our teaching, universities assume that each of us will figure out how to teach our subjects on our own.

This assumption is understandable, since many of us became academics in the first place because we liked figuring things out on our own and were good at it. I myself certainly appreciate my classroom freedom, and am not about to ask that I be made to submit a lesson plan to my department head, a curriculum committee, or a district supervisor, as many high school teachers must do. I think I understand why untenured and adjunct faculty members may feel that the classroom is a relatively safe zone that would be threatened if their colleagues knew more about their teaching. I know that on my own really bad days as a teacher I’m relieved that the train wreck has been witnessed only by my students and not my senior colleagues and deans.

Still, I can’t help wondering if our professionalism and our prestige would be fatally compromised if we had to coordinate our teaching the way high school faculties often do. I also suspect that we overestimate the safety our classroom privacy confers, and that more transparency and collaboration in our teaching would not only help students make better sense of us, but would ultimately be as safe for the most vulnerable among us as a curriculum that lets us hide out from each other.

The learning community model is one obvious way to go, but an excellent first step would be to pair first-year composition and general education courses, as many colleges and universities now do. A step beyond that would be to pair some science and humanities courses and courses in ancient and modern periods. If we don’t make such pairings, students will lose sight of the contrasts and continuities that define the sciences and humanities and differentiate the ancient from the modern. We’re also missing an opportunity every time a big period course isn’t co-taught by colleagues from several different disciplines. Then, too, the more we are part of a team, the less easily replaceable we become -- a fact that could provide more job security to adjunct faculty members.

The trouble with leaving it up to each of us to figure things out on our own is that it really means leaving it up to our students to figure us out on their own. The assumption is that if we all teach our courses conscientiously, each of us making sure our demands are as clear and transparent as possible, our students will make coherent sense of our diverse perspectives and will eventually be socialized into our intellectual community. The problem is that, no matter how transparent each course is, as long as we know little about our colleagues’ courses our students figure to come away with confusingly mixed messages that will be hard to make sense of without more help than we are providing. As the educational thinker Joseph Tussman once put it, all the courses in a program may be admirably coherent, “but a collection of coherent courses may be simply an incoherent collection.”

It would take too much space to list all the confusingly mixed messages students get from an average set of humanities courses in an average academic day. College students have already coped with such mixed messages on making the transition from high school, when what had been called “Language Arts” mysteriously evaporated and morphed into foreign languages and “English” -- a word that is itself far from helpful or self-explanatory. Once in college, a student can go from one teacher who passionately believes that interpretations of literary texts are correct or incorrect -- or at least more correct or incorrect than other interpretations -- to another teacher who smiles or rolls his or her eyes at the naivete of such a belief; or from one teacher who expects undergraduates to analyze literature by using a rigorous methodology and terminology to another who thinks it sufficient if they learn to appreciate books in whatever way is comfortable to them; or from one teacher who discourages students from summarizing, telling them, “I’ve already read the text -- I want to know what you think,” to another who says, “I don’t care what you think, I want to see how carefully you’ve read the text.” No wonder students often come up and ask, “Do you want my ideas in this paper or just a summary of the reading?” And I do not even mention the discrepancies between the humanities and science and business.

Our classrooms allow us on the faculty to tune each other out, but our students don’t have that luxury. They consequently develop their own protective forms of courseocentrism, adapting to a compartmentalized curriculum by mentally compartmentalizing us. I’m thinking of the familiar student practice of “psyching out” successive teachers and giving each of us whatever we seem to want even if it flatly contradicts what the last teacher wanted. Students thus learn to become relativists at ten a.m. and objectivists after lunch. We often complain about the cynicism of this shape-shifting act, but arguably it is precisely the behavior our curricular mixed messages encourage. Since the disjunctions between courses prevent them from forming an intelligible collectivity, students end up concluding that the only way they can figure us out is one at a time. This virtually means starting over from scratch in every new course.

Some defend this mixed-message curriculum as a healthy cognitive workout regimen, an antidote to dogmatic certainty, or even as the perfect training for dealing with the ambiguities, instabilities, and unpredictable changes of life in the 21st century. And the high-achieving minority of students do flourish under this curriculum, since they are able to synthesize its disparate views or summarize the places where they conflict, creating on their own the connected conversation that the curriculum obscures and thus entering it as insiders. These high achievers detect the places where their diverse courses converge and therefore experience the redundancy and reinforcement our minds need, according to information scientists, to make sense of the world. But for the struggling majority, the discontinuities from one course to the next tend to erase this redundancy and reinforcement, leading them to come away with a greatly exaggerated picture of the differences between faculty members, disciplines, and fields while missing the common practices underneath. When taking courses becomes a process of serially giving your teachers whatever they seem to want -- assuming you can figure out what that is to begin with -- jumping through hoops replaces deep socialization into the intellectual community. In other words, the disconnect between courses and teachers ultimately reproduces itself in a disconnect between students and academic culture itself.

Courseocentrism thus goes far toward explaining the apathy and disengagement that educational researchers have found in reports like the National Survey of Student Engagement. It also helps explain the finding of less well publicized studies that students who learn a subject well enough to get a good grade in a course often fail when they are asked to apply what they learned to a context outside the course. In one study discussed by Howard Gardner in his book The Unschooled Mind, elementary school students who did well on tests that required them to know that the earth is spherical and revolves around the sun reverted to their earlier flat earth beliefs when tested after the course. Their learning was apparently so tied to the course in which they’d learned it that once the course was over they quickly forgot it and regressed to their pre-educated understandings. As my correspondent Jim Salvucci put it, “What you learn in a course stays in the course.”

Again, however, underlying the great diversity and difference in the substantive content of today’s academic intellectual culture lies an important area of common ground with respect to its fundamental practices, though this common ground is hidden from both students and teachers by the disconnection of courses. Whether we follow Lacan or Leavis, we would not have gotten very far in the university unless we had mastered the fundamentals of reading, analysis, and argument, of summarizing others and using them to define our own ideas, that comprise what we now call “critical thinking skills.” It is this implicit agreement on core practices -- as distinct from the content of our ideas -- that explains why colleagues who otherwise have little in common tend to agree overwhelmingly on who the good students are. But our separation from each other in the curriculum prevents us from discovering the existence of these practices and thus alerting students to their existence.

This failure to recognize our common ground has marked the culture war debates that have embroiled us since the mid-1980s. As I have argued elsewhere, we became so caught up in the conflicts over which books should be taught and how that we lost sight of the fact that for most American students -- again with the exception of the high-achieving few -- the great stumbling block has always been the culture of books and book discussion as such, regardless of which side gets to draw up the reading list. And today we are still so caught up in the battles between traditional and trendy versions of intellectual culture that we lose sight of the fact that to most students it is the nebulosity of intellectual culture itself that is the problem, whether the form this culture comes in is traditional or trendy.

I have elsewhere described coming up against this problem myself in a course in which I had juxtaposed assigned readings by the arch-traditionalist Allan Bloom and the radical African American feminist bell hooks. To any academic insider, Bloom and hooks are so far apart ideologically as to be on different planets, but I realized that for some of my students they were virtually indistinguishable, both using an obscure academic language to discuss problems the students had a hard time seeing as problems. In a succinct formulation of the point that Michael Bérubé offered me after hearing a talk in which I struggled to articulate it, any two eggheads, no matter how far apart ideologically, will always be far closer to each other than to non-eggheads. Again, the reason is that eggheads -- intellectuals -- whether they are on the Left or the Right, are defined and differentiated from outsiders by their membership in a common culture of ideas and arguments, a common culture that our curricular mixed messages hide from our students and our non-communicating courses hide from us.

I’m often told that I’m naïve in thinking that academics will ever willingly consent to coordinate their courses across their partisan divisions, much less argue with each other in the ways such coordination might require. I am told that, whether rightly or wrongly, arguing out our differences just isn’t the way the academic world works. Yet it’s striking to me that we argue out our differences all the time when we review each other’s books and articles and engage each other in our publications and professional conferences.

In fact, I’m always shocked by the contrast between the academic conference scene, with its intense and lively -- if often acrimonious -- debates, and our isolation from each other when we go back home. It’s not uncommon at a conference for me to run into a colleague from my own department whom I’ve passed in the hallways for years and discover that we have common interests we never suspected. I wonder why we had to travel hundreds of miles to have a conversation about the professional issues we care most about, but it’s apparently the fact that we care about them that makes them too risky for home consumption. It’s as if academic conference culture itself came into existence to satisfy a desire for intellectual community that wasn’t met by campus culture -- a fact that might guide us in changing campus culture. When I reflect that I’ve probably learned more about how to be an academic at conferences than I ever did in graduate school, I’m all the more convinced that there has to be a better way to organize intellectual life for educational purposes than dicing it into non-communicating courses.

I mentioned earlier that such courses are at odds with the new forms of connectivity enabled by our new electronic technology. They are also at odds with the most sophisticated and original work in the humanities during the last generation, which has taught us that what seem to be free-standing identities -- whether they be texts or selves -- are produced by collective structures of discourse and representation. It seems we have deconstructed the autonomous, self-authorizing subject and the autonomous, self-authorizing literary work. It’s time we got around to deconstructing the autonomous, self-authorizing course.

Gerald Graff

Gerald Graff is professor of English and education at the University of Illinois at Chicago. He is the immediate past president of the Modern Language Association and this essay is adapted from the presidential address he gave in December at the association's annual meeting.

Dada in the Classroom

My assignment didn’t cause consternation, but it presented challenges. I told them to write a poem on their favorite fruit, bring it to next week’s class, read (perform) it, and then share the fruit with the class. By way of preparation, we studied a badly copied (by me) art-book reproduction of a painting by Zurbaran and lingered at length over Pablo Neruda’s blurb on the back cover of Julio Cortazar’s book, Hopscotch, where (I paraphrase), Neruda claims that having never read Cortazar is like never having tasted a peach; a man like that, who never tasted a peach, would become sadder and sadder until one day he’d die of sadness. I dwelt briefly on that ever-bothersome noun “man,” as being generic for Reader, but faintly scented in this case by Neruda’s well-known erotic appetite that set up a suspect dynamic between Man-Reader, Untasted Peach, and Terminal Sadness. The women snickered.

We discussed fruit in poetry throughout the ages, beginning with the plum flowers and, eventually, plums of the Japanese haiku poets, through the modernists, particularly W.C. Williams, whose refrigerated and missing plums are on the lips of every poetry student in America. We then went around the room to see what everyone’s favorite fruit was. Grapes came up first, plums in a close second, melons third, apples fourth, and oranges sixth. Peaches came in a distant tenth, a fact due possibly to our location in southern Louisiana, where peaches don’t grow. I ascertained also that cumquats, fresh figs, persimmons, guava, and star-fruit were unknown to students in Poetry Writing 4000, an intermediate poetry class.

An “intermediate poetry class” is the product of a decades-long elaboration of an absurdity that, once ensconced within the English Department and the Humanities, could only be dislodged by a major thought-earthquake, equal in potency to the Dada revolution. No such earthquake-revolution has occurred in the teaching of the humanities since Dada itself became the predominant pedagogy of our “higher education” system in the post-modern Sixties.

Some scholars would trace the introduction of Dada teaching in the humanities to the beginning of American education, with its menu of “electives.” “Electives” are Dada by nature, a quality that did not escape Ezra Pound, who credited “electives” for the opening in his poetry to other languages, quotation, parody, ironies, essaying, verse free to dance on the page and out of prosodic strictures, and the introduction of elements hitherto alien to poetry, such as economic opinions.

Still, it was not until the mid-Sixties that the “teaching” of the creative arts became institutionalized. Not coincidentally, the Dada method became “natural” to working artists first, then to poets. After the Dada presence in New York facilitated abstract expressionism and the poetry of Frank O’Hara, American artists and poets no longer felt provincial when they compared themselves to the Europeans. By the second generation of New York artists and poets, the Dada roots of the new art were starting to be forgotten, making room, by the third, fourth, and fifth generations, for the “natural” sense of art-making and “poetry-writing” that could then, through such “normality,” become pedagogy.

Even the description of such an evolution can seem “normal,” if it were not for the stubbornness of Dada itself, a movement born during World War I out of disgust with all Western “civilized” institutions, including universities, especially the humanities, which the Dadas saw as particularly pernicious. The Dada generation of 1915-1927, led by the brilliant and insufficiently understood poets Tristan Tzara and Hugo Ball, and the artists Marcel Duchamp, Marcel Janco, Jean Arp, and George Grosz, called for an artist-led revolution in society, a revolution conducted by means of chance, randomness (“electives”), denial of previous esthetic pieties, and the shakeup of traditional institutions, including private property and the family.

The Dada arsenal was vast: laughter, joy, absurdity, unpredictability; in other words, an entirely different sense of existence than that of Unamuno’s “tragic” sense, or the Russian Constructivists’ and Italian Futurists’ aggressive mechanical utopias. Dada made use of everything for the sole purpose of undoing the ideas of everything. It’s not hard to see the dubious, if not downright dangerous, consequences of the Dada method, especially in universities, where rebellion, hormones, and questioning are the very things the institution is charged with controlling.

The students were seated at the seminar table with a fruit or a bowl of fruit in front of them when I walked in the following week. I sat at the head of the table. On my left was Amy, with a large cluster of white grapes in a blue bowl before her; at my right was Melanie, facing a Cassaba melon with several circles of words magic-markered around it; Matt faced a grapefruit; Martin an apple. The 12 students in “intermediate poetry” sat before their inscribed fruits like figures in a tableau-vivant, waiting for the signal to begin the performance. Amy distributed one grape to each of us and asked us to write a word on it. I wrote “peach” on mine; Melanie wrote “love”; the others wrote whatever they wrote. Then some of them ate their grapes, and some threw them back at Amy, who ate them all; six students ate their own words written on Amy’s grapes, and Amy ate six words others had written on her grapes, including “peach” and “love.”

Melanie stood, holding her Cassaba melon like a globe or Yorick’s skull in her left hand, and read it, slowly rotating it to see all the lines; she then passed the Cassaba around and everyone read a line; amazingly, there were exactly 13 circular lines on the melon; she then cut it open with a sharp folding knife of illegal dimensions (on an airplane, certainly) and passed slices that everyone ate like communion, there being present also an eerie, nearly sacerdotal silence. And so it went, fruit after fruit, read, performed, eaten, in an order that could not have been more perfect if Noah’s monitors had been there. We thus learned that: a) poetry can be edible (and perhaps it should be); b) fruit is a sexier medium than paper or pixels; c) school could be fun; d) “intermediate” could mean that even though the medium had not quite been reached (advanced), the closeness to experience itself (beginning) made it worthwhile; e) it’s not so easy to write on fruit without good magic markers; and f) T.S. Eliot need not be memorized.

Was Dada domesticated by this pedagogical demonstration? The Dadaists were prolific generators of forms: assemblage, collage, decoupage, simultaneous readings, collaboration (cadavre-exquis), noise making, tattooing innocents, placing people on bookshelves and books in spectators’ seats, wearing hats made from bird cages. Their fertility gave birth to the styles, looks, attitudes, and objects of the 20th century, but the best results were not the objects, but the process of making them.

The current thinking in the humanities is that creativity and artistic production are good things, so good, in fact, that their subversive qualities can be overlooked. After all, Dada, like other modern movements, has been studied to death; nothing alive could survive such exegesis. I am willing to bet, however, that 10 years hence, my fruit-writing students, now in advertising and new media, will look back on their school years and remember nothing except the Dada moment in their “intermediate” poetry class. Is Dada pedagogy useful in today’s classroom? There isn’t any other worth the absurd price of “higher education.” More Dada!

Andrei Codrescu

Andrei Codrescu is the author of The Posthuman Dada Guide: Tzara and Lenin Play Chess, forthcoming from Princeton University Press.

Cookie-Cutter Monsters, One-Size Methodologies and the Humanities

I know what you’re thinking: Why is a poet writing about assessment in higher education? Honestly, I wonder that myself. One day, when assessment came up in conversation, I commented that it could be useful to programs as they make curricular decisions. Within 48 hours, the dean placed me on the institution’s assessment committee. Suddenly, assessment is a hot topic and, of all people, I have some expertise.

My years on that committee convinced me that we must pay attention to the rise of assessment because it is required for accreditation, because demands have increased significantly, and because it might be useful in our professional lives. Accrediting bodies are rightly trying to stave off the No Child Left Behind accountability that the Spellings Commission proposes. Maybe the incoming secretary of education will consider how we might be better -- not more -- accountable. Perhaps, too, Wall Street should be held accountable before the Ivory Tower. But assessment for higher education will likely become more pressing in a weak economy.

One tool to which many institutions have turned is the National Survey of Student Engagement (NSSE, pronounced Nessie). NSSE was piloted in 1999 in approximately 70 institutions, and more institutions participate each year. This survey appeals especially to college and university presidents and trustees, perhaps because it’s one-stop, fixed-price assessment shopping. NSSE presents itself as an outside -- seemingly objective -- tool to glean inside information. Even more appealing, it provides feedback on a wide array of institutional issues, from course assignments to interpersonal relationships, in one well-organized document. Additionally, the report places an institution in a context, so that a college can compare itself both with its previous performance and with other colleges generally or those that share characteristics. And it doesn’t require extra work from faculty. NSSE seems a great answer.

Yet NSSE does not directly measure student learning; the survey tracks students’ perceptions or satisfaction, not performance. Moreover, respondents appraise their perceptions very quickly. In the 2007 NSSE, students were informed that “filling out the questionnaire takes about 15 minutes,” though completing it meant working through 28 pages, some of which included seven items to rate. So, as with its Scottish homonym, NSSE presents a snapshot of indicators, not the beast itself.

Importantly, NSSE is voluntary. A college or university can participate annually, intermittently, or never. If a college performs poorly, why would that college continue? If a university uses the report to, as they say in assessment lingo, close the loop, wouldn’t that university stagger participation to measure long-term improvements? Over its 10-year existence, more than 1,200 schools have participated in NSSE, and participation has increased every year, but only 774 schools were involved in 2008, which suggests intermittent use. In addition, some institutions use the paper version, while others use the Web version; each mode involves a different sample size based on total institutional enrollment. NSSE determines sample size and randomly selects respondents from the population file of first-years and seniors that an institution submits.

Perhaps, all these factors lead NSSE to make the following statement on its Web site: "Most year-to-year changes in benchmark scores are likely attributable to subtle changes in the characteristics of an institution’s respondents or are simply random fluctuations and should not be used to judge the effectiveness of the institution. The assessment of whether or not benchmark scores are increasing is best done over several years. If specific efforts were taken on a campus in a given year to increase student-faculty interaction, for example, then changes in a benchmark score can be an assessment of the effectiveness of those efforts."

This statement seems to claim that an increase in a score from one year to the next is random unless the institution was intentionally striving to improve, in which case, kudos. Yet, NSSE encourages parents to “interpret the results of the survey as standards for comparing how effectively colleges are contributing to learning” in five benchmark areas, including how academically challenging the institution is.

I have larger concerns, however, about assessment tools like NSSE, which are used for sociological research on human subjects. The humanities and arts are asked to use a methodology in which we have not been trained and for which our disciplines might not be an appropriate fit. NSSE is just one example of current practices that employ outcomes-based sociological research, rubric-dominated methodology, and other approaches unfamiliar in many disciplines.

Such assessment announces that anyone can do it. I’ve seen drafts of outcomes and rubrics, and that’s not true. Programs like education and psychology develop well-honed, measurable outcomes and rubrics that break those outcomes down into discernible criteria. Programs in the sciences do a less effective job; some science faculty assert that the endeavor is invalid without a control group, while admitting that a control group that denies students the environment in which they most likely learn would be unethical.

Those of us in the arts and the humanities want wide, lofty outcomes; we resist listing criteria because we disagree, often slightly or semantically, about what’s most important; we fear omission; and we want contingencies in our rubrics to account for unexpected — individual, creative, original — possibilities. Writing and visual art cannot easily be teased apart and measured. Critical thinking and creative thinking are habits of mind. How can NSSE or rubrics capture such characteristics?

Moreover, by practicing social science, often without reading a single text about those methods, arts and humanities faculty diminish the discipline we poach as well as lessen the value and integrity of our conclusions. If we don’t know what we’re doing — how many of us really understand the difference between direct and indirect measures or between outcomes, objectives, goals, and competencies — the results are questionable. To pretend otherwise is to thumb our noses at our social science colleagues.

Further, this one-size-fits-all, cookie-cutter mentality ignores that different disciplines have different priorities. Included in Thomas A. Angelo and K. Patricia Cross’s Classroom Assessment Techniques is a table of top-priority teaching goals by discipline. Priorities for English are Writing skills, Think for oneself, and Analytic skills, in that order. Arts, Humanities, and English have just one goal in common: Think for oneself. We can survey student perceptions of their thinking — an indirect measure — or maybe we know independent thinking when we see it, but how do we determine thinking for oneself in a data set? These priorities aren’t even grammatically parallel, which may not matter to social scientists, but it matters to this poet!

Other priorities for Arts — Aesthetic appreciation and Creativity — and Humanities — Value of subject and Openness to ideas — are difficult, if not impossible, to measure directly. The priorities of Business and Sciences are more easily measured: Apply principles, Terms and facts, Problem solving, and Concepts and theories. So, a key issue is to determine whether the arts and humanities can develop ways to assess characteristics that aren’t really measurable by current assessment methodology or whether we must relinquish the desire to assess important characteristics, instead focusing on easily measured outcomes.

Another table in Classroom Assessment Techniques lists perceived teaching roles. Humanities, English, and Social Sciences see Higher-order thinking skills as our most essential role, whereas Business and Medicine view Jobs/careers as most essential, Science and Math rank Facts and principles most highly, and Arts see Student development as primary. Both knowledge of Facts and principles and job placement can be directly measured more easily than Student development. For English, all other roles pale in comparison to Higher-order thinking skills, which 47 percent of respondents rated most essential; the next most important teaching role is Student development at 19 percent. No other discipline is close to this wide a gap between its first- and second-ranked roles. Surely, that’s what we should assess. If each discipline has different values and also differently weighted values, do we not deserve a variety of assessment methodologies?

Lest I seem to bash assessment altogether, I do advocate documenting what we do in the arts and humanities. Knowing what and how our students are learning can help us make wise curricular and pedagogical decisions. So, let’s see what we might glean from NSSE.

Here are items from the first page of the 2007 NSSE:

  • Asked questions in class or contributed to class discussions
  • Made a class presentation
  • Prepared two or more drafts of a paper or assignment before turning it in
  • Worked on a paper or project that required integrating ideas or information from various sources
  • Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments

Students were asked to rate these and other items as Very often, Often, Sometimes, or Never, based on experience at that institution during the current year. These intellectual tasks are common in humanities courses.

In another section, students were questioned about the number of books they had been assigned and the number they had read that weren’t assigned. They also reported how many 20+-page papers they’d written, as well as how many of 5-19 pages and how many of fewer than five pages. We can quibble about these lengths, but, as an English professor, I agree with NSSE that putting their ideas into writing engages students and that longer papers allow for research that integrates texts, synthesizes ideas, and encourages application of concepts. And reading books is good, too.

Another relevant NSSE question is “To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas?” Included in the areas rated are the following:

  • Acquiring a broad general education
  • Writing clearly and effectively
  • Speaking clearly and effectively
  • Thinking critically and analytically
  • Working effectively with others

The English curriculum contributes to these areas, and we are often blamed for perceived shortcomings here. While NSSE measures perceptions, not learning, this list offers a simple overview of some established values for higher education. If we are at a loss for learning outcomes or struggle to be clear and concise, we have existing expectations from NSSE that we could adapt as outcomes.

In fact, we can reap rewards both in assessment and in our classrooms when students become more aware of their learning. To do this, we need some common language — perhaps phrases like writing clearly and effectively or integrating ideas or information from various sources — to talk about our courses and assignments. Professional organizations, such as the Modern Language Association in English or the College Art Association in the visual arts, could take the lead. Indeed, this article is adapted from a paper delivered at an MLA convention session on assessment, and the Education Committee of CAA has a session entitled “Pedagogy Not Politics: Faculty-Driven Assessment Strategies and Tools” at their 2009 conference.

We needn’t reorganize our classes through meta-teaching. Using some student-learning lingo, however, helps students connect their efforts across texts, assignments, and courses. Increasingly, my students reveal, for instance, that they use the writerly reading they develop in my creative writing courses to improve their critical writing in other courses. I have not much altered my assignments, but I now talk about assignments, including the reflective essay in their portfolios, so that students understand the skills they hone through practice and what they’ve accomplished. Perhaps, I’m teaching to the test — to NSSE — because I attempt to shift student perceptions as well as the work they produce. But awareness makes for ambitious, engaged, thoughtful writers and readers.

Good teachers appraise their courses, adapt to new situations and information, and strive to improve. As Ken Bain points out in What the Best College Teachers Do, “a teacher should think about teaching (in a single session or an entire course) as a serious intellectual act, a kind of scholarship, a creation.” We are committed to teaching and learning, to developing appropriate programs and courses, and to expectations for student achievement that the Western Association of Schools and Colleges asks of us. We can’t reasonably fight the North Central Association of Colleges and Schools mandate: “The organization provides evidence of student learning and teaching effectiveness that demonstrates it is fulfilling its educational mission.” Assessment is about providing evidence of what we do and its effects on our students. Our task in the arts and humanities is to determine what concepts like evidence, effects, and student learning mean for us. If NSSE helps us achieve that on the individual, program, or institution levels, great. But NSSE is best used, not as an answer, but as one way to frame our questions.

Anna Leahy

Anna Leahy teaches in the English M.F.A. and B.F.A. programs at Chapman University. Her poetry collection Constituents of Matter won the Wick Poetry Prize, and she is the editor of Power and Identity in the Creative Writing Classroom.

The Mind on Fire

In a lecture, the novelist Robertson Davies once gave a wry characterization of the life of a full-time writer. It is, he said, every bit as gratifying as non-writers usually imagine it to be -- “except for the occasional complete collapse of the will to go on.”

This is quoted from memory (my effort to relocate the passage has not gone well) so the wording may be inexact. But the turn of phrase certainly rhymes with experience -- in particular, the tension of emphasis in “occasional complete collapse,” with its mix of casual surprise and total devastation. I feel less certain that Davies used the expression “the will to go on.” It sounds a bit melodramatic. But then, he was a satirist, and he might well have been making fun of the impulse to indulge in overacted displays of artistic temperament. (Making fun of this need not preclude indulging it.)

Anyone who spends much time trying to put the right words in the right order will accumulate a private anthology of passages like this one: quotations that map the high and the low points on the interior landscape of the writing life. Knowing that others have been there before you is reassuring – though only of so much help.

For Robert D. Richardson – the author of, among other things, William James: In the Maelstrom of American Modernism (Houghton Mifflin Harcourt), which won the Bancroft Prize for 2007 – one such landmark passage appears in “The American Scholar.” There, Ralph Waldo Emerson writes: “Meek young men grow up in libraries believing it their duty to accept the views that Cicero, Locke, and Bacon have given, forgetful that Cicero, Locke, and Bacon were only young men in libraries when they wrote those books.”

In First We Read, Then We Write: Emerson on the Creative Process (to be published in March by University of Iowa Press), Richardson says the line “still jolts me every time I run into it.” I think I know what he means, but the quality and intensity of the jolt varies over time. Reading “The American Scholar” as a meek young man, I just found it irritating – as if Emerson were translating the anti-intellectualism of my small town into something more refined and elegant, if scarcely less blockheaded.

This was a naive reading of a remarkable and (at times) very weird essay. "The American Scholar" is actually something like a Yankee anticipation of Nietzsche’s “On the Use and Abuse of History for Life” – with the added strangeness that, when Emerson gets around to pointing out a prototype of the new-model American scholar, the example he gives is ... Emanuel Swedenborg, the 18th century Swedish polymath. Who, when not writing huge works on the natural sciences, spent his time talking to angels and devils and the inhabitants of other planets. WTF?

Rereading Emerson a couple of decades beyond adolescence, I saw that the target of his scorn was meekness -- not bookishness, as such. He was in any case not so genteel as he first appeared. There was a wild streak. There were depths beneath the oracular sentences that made him a kind of cultural revolutionary. You are not necessarily prepared to detect this when reading Emerson as a teenager. Like Bob Dylan says, "Ah, but I was so much older then, I'm younger than that now."

Richardson’s award-winning Emerson: The Mind on Fire (University of California Press, 1996) retraced his subject’s voracious and encyclopedic reading regimen, which seems to have been tinged with the urgency of addiction. That book was intellectual biography. The new one, which is far shorter, is something else again -- a synthesis of all the moments when Emerson muses over his own process, a distillation of his ethos as a reader and (especially) as writer.

“A good head cannot read amiss,” says Emerson. “In every book he finds passages which seem confidences or asides, hidden from all else, and unmistakeably meant for his ear.” Full attention and active engagement are always, by Emerson's lights, present-minded: “I read [something] until it is pertinent to me and mine, to nature and to the hour that now passes. A good scholar will find Aristophanes and Hafiz and Rabelais full of American History.”

Not being prone to foolish consistency, Emerson also maintains that some academic works are incapable of coming to life themselves, let alone revitalizing anyone else. “A vast number of books are written in quiet imitation of the old civil, ecclesiastical, and literary history,” he says. (One may quietly update this by thinking of comparable 21st century tomes.) “Of these we need take no account. They are written by the dead to be read by the dead.”

By contrast, meaningful writing is an effort “to drop every dead word.” Emerson rules out any effort to rub pieces of jargon together in hopes they will generate sparks. “Scholars are found to make very shabby sentences out of the weakest words because of exclusive attention to the word,” he notes. You don’t say.

The struggle to connect with living currents of thought and meaning should begin with a notebook -- the place to cultivate, as Emerson puts it, “the habit of rendering account to yourself of yourself in some rigorous manner and at more certain intervals than mere conversation.” The important thing is to keep at it: “There is no way to learn to write except by writing.”

This may sound like generic advice, and to some degree it is. But from long years of scholarly attention to the daily progress of the essayist’s labors, Richardson hears the anxious undercurrents in Emerson’s reflections on writing. “There is a strangely appealing air of desperation, finality, of terminal urgency,” he writes, “to many of Emerson’s observations.... In every admonition we hear his willingness to confront his own failures; indeed, he never seems more than a few inches from total calamity. He urges us to try anything – strategies, tricks, makeshifts. And he always seems to be speaking not only of the nuts and bolts of writing, but of the grain and sinew of his – and our – determination.”

When necessary, Richardson points out, Emerson would “just sit down and start writing – anything – to see whether something would happen. He was quick to spot the same trick in others. ‘I have read,’ he noted, ‘that [Richard Brinsley] Sheridan made a good deal of experimental writing with a view to take what might fall, if any wit should transpire in all the waste of pages.’”

Kenneth Burke once described Emerson’s prose as a “happiness pill” – that being a common enough assessment, though there is more to the sage than his role as dispenser of transcendental Prozac. It makes some difference to know that the pharmacist also had to heal himself. He cannot have been free from all of the worldly desires felt by lesser writers. The same wishes mean the same frustrations. The challenge is to keep faith with the rest of one’s reasons for writing – the motivations that break through the rubble.

First We Read, Then We Write is worth keeping at hand for moments of occasional complete collapse. I'll end with a passage that now belongs in the anthology, for emergency use:

“Happy is he who looks only into his work to know if it will succeed, never to the times or to the public opinion; and who writes from the love of imparting certain thoughts and not from the necessity of sale – who writes always to the unknown friend.”

Scott McLemee

Toward a 21st Century Renaissance -- in My Day


Given this chilly climate for administrators -- salary freeze, hiring freeze -- I turn for relief to that dusty ghost-town in my mind’s geography, the one labeled Intellect. This turn has been further encouraged by the publication in recent months of an article on the influence, or lack thereof, of a book I wrote 20 years ago on the relations between American and British writers in the 19th century, titled Atlantic Double-Cross. This book tried to explain why the writers of each country hated each other’s guts and how this animosity informs the great literary works of the period. In it, I argued for a new subdiscipline of comparative literature that would take up the Anglo-American relationship. The book pretty much flopped, in my view, and so I was delighted to read even a measured discussion of the book’s effect on my discipline — delighted, that is, until I arrived at a paragraph beginning, “In Weisbuch’s day. ...”

At first I was tempted to call the gifted, clearly youthful Columbia professor who wrote this sentence to say, “Listen, it may be late afternoon; it may even be early evening. But it is still my day.”

More to the point, the phrase made me realize that I am pretty old, and that made me think — I guess I am supposed to speak like a codger now and say instead, “that got me to thinking...” — about the changes in academe in my lifetime. I thought about the move of psychology, for instance, away from the humanities through the social sciences over to the sciences, a journey by which Freud was moved from being a point of reference to a butt for ridicule.

I considered the tendency for economics to forego fundamental questions for refining an accepted model. I note as well a decline in the influence of the humanities, whence most university presidents arose in the 1930s, say, and an ascendancy of the sciences, and in particular genetic science, from which field an increasing number of our institutional leaders now emerge.

But going through these admittedly contentious thoughts, I saw something more substantial, which was that my thinking was taking place via the disciplines — and to that I added the realization that my poor book of so long ago had stated itself as an attempt to create a subdiscipline. I have just recently reread Douglas Bennett’s very perceptive quick history of the liberal arts in the 20th century, where he notes that the organization of colleges by disciplines, which we now take so for granted, was in fact a fast and dramatic occurrence between about 1890 and 1910. Today, it seems, we really care more about the disciplines than we do about the whole self or whatever the liberal arts ideal is.

So I got angry at the disciplines, and there is reason for that. It gets difficult to understand, especially at the graduate level, why a doctorate in literature and a doctorate in physics exist on the same campus when it seems they might as well be pursued on different planets. During a year when I served as graduate dean at the University of Michigan, I attended a physics lecture and was seated next to the chair of the comparative literature program. As the scientist went on, my neighbor whispered to me incredulously, “This guy thinks the world is real.” That takes C.P. Snow’s two-cultures problem to a new and desperate place.

Or again, I invited the NYU scientist who had successfully submitted an article of post-structuralist nonsense to a journal of literary theory, and had his hoax accepted, to speak at Michigan, with a panel of three scientists and three humanists responding. As the large crowd left the room, the conversations were remarkable. The scientists in the audience to a person found the critique of the pretension of literary theory wholly persuasive. The humanists to a person felt that their colleagues had successfully parried the attack, no question about it, reminding the physical and life scientists that their language could be pretty thick to an outsider too and that the very history of science could be seen as the overturning of accepted truths, revealed in time as unintended hoaxes.

And so, enraged at the disciplines, I tried to imagine what it would be like to have a university, a world, a mind that did not rely on the disciplines — and failed.

And my next move is to say, perhaps this is fine. If general education is tantamount to a mild benevolence toward humanity, involvement in a discipline is like falling passionately in love with a particular person. We need both. It is okay to be captured by an ecstatic interest. But we also know the danger of early love. In the words of Gordon MacRae or somebody, “I say that falling in love is wonderful.” And indeed it is arguable at least that we do not induct students into a love of the life of the mind by abstractions but by finding the single discipline that fixes their fascination.

Even so, we want that fascination to be versatile, to be capable, that is, of moving from one arena of thought to another, or at least of understanding why someone else would care passionately about something else. Every summer, I spend a week on an island in Lake Winnipesaukee. This is very odd for me, as my relation to nature is such that a friend once asked if I had suffered a traumatic experience in a forest or a park. I prefer my nature in iambic pentameters, and this family island, without electricity or plumbing, I have dubbed The Island Without Toilets. Still, it is restful, and each year we campers read and discuss a book or essay. One year it was Bill McKibben’s book, The Age of Missing Information. In this tome, McKibben contrasts a day spent hiking to a modest mountaintop with a day spent watching a full 24 hours of each channel of a cable television system in Virginia. (The fact that there were only 90 channels in 1992 tells us that we are losing more information all the time.) The book is somewhat eco-snobby, but McKibben’s main contrast is really not between the natural world and its vastly inferior electronic similitude or replacement but between deep knowledge and sound bites.

He illustrates deep knowledge by an Adirondack farmer’s conversation concerning each and all the species of apple. There is so much to know, it turns out, about apples; indeed, there is so much to know about everything. As I wrote a few years ago, “Life may appear a deserted street. But we open one manhole cover to find a complex world of piano-tuning, another to discover a world of baseball, still others to discover intricate worlds of gemology and wine, physical laws and lyric poetry, of logic and even of television.” And I asked, “Do our schools and colleges and universities reliably thrill their charges with this sense of plenitude?”

They do not. And while I cannot even imagine a world without the disciplines — which are really the academic organization of each of these microcosms of wonder — I can imagine them contributing to an overall world flaming with interest. Falling in love is great and irreplaceable, but how about reimagining the campus as Big Love, Mormon polygamy for all sexes, or at least as a commune, where each of us is mated to a discipline but lives in close proximity with family-like others on a daily basis.

That is, I believe, what we are, however awkwardly, attempting by having the disciplines inhabiting the same campus. However much general education has been swamped by disciplinary insistence, a remnant remains. Even academics tend to tell other people where they went to college, not so much in what they majored. We probably already possess the right mechanism for a 21st century renaissance. It just needs some adjustments.

I want to suggest two such adjustments. One is in the relation of the arts and sciences to the world; and another readjusts the arts and sciences in relation to themselves and to professional education.


When I was at the University of Michigan several years ago, something shocking took place. The sciences faculty, en masse, threatened to leave the college of liberal arts. “How could the sciences leave the arts and sciences any more than Jerry could leave Ben and Jerry’s?” I asked someone who had been present at these secession meetings. “The same way another Jerry could leave Dean Martin and Jerry Lewis,” he replied. Somehow, to Michigan’s credit, the rebellion was quelled, but to me it is suggestive of the weakness of the liberal arts ideal at many of our institutions.

There are many signs of its frailty, beginning with the frequent statistic that more students at four-year colleges now major in leisure studies than in mathematics and the sciences. It is difficult to find a middle or high school where anyone speaks of the liberal arts, and much as I have been worrying about the disciplines, aside from scattered efforts they seem to have been missing in action from much of the last forty years of discussion of school reform. In speaking about the arts and sciences in relation to the world, I want to suggest, though, that the lording of the disciplines over general education and the absence of the excitement of the disciplines in the schools have everything to do with each other.

This near paradox can be illustrated best if I stay within my own neighborhood of the humanities for this aspect of the argument. Last month, filled with nostalgia, I agreed to serve on a panel for the National Humanities Alliance, which advocates to Congress for funding for these impoverished disciplines. My job was to provide one version of the speech for the public efficacy of English and history, religion and philosophy, and so on. I decided to fulfill this assignment rapidly and then to ask why, if we believed in the public efficacy of the humanities, we utterly ignored it in our mentoring of graduate students in these disciplines.

My argument for the humanities is exactly the same as my argument for the arts and sciences generally. As a young person, I never expected a major battle of my lifetime to be the renewal of dogmatic fundamentalism in opposition to free thinking. I find myself again and again referring to an episode of the television program "The West Wing" that aired shortly after 9-11. The president’s youthful assistant is speaking to a group of visiting school children and he says, “Do you really want to know how to fight terrorists? Do you know what they are really afraid of? Believe in more than one idea.”

This is not always as simple as the Taliban versus Shakespeare. There are subtle discouragements within our own society to the freedom to doubt and the freedom to change one’s mind. And there are elements within each of us that tend toward dogmatism and against the embracing of difference and a will to tolerate complexity. The campus, ideally, is a battleground for this freedom.

Against the many who would tyrannize over thought, we need to fight actively for our kind of education, which is far deeper than the usual political loyalties and divisions. God and George Washington are counting on us. And so are all those kids in East LA scarred by violence and poverty. In a nation of inequality and a world of sorrows, damn us if we neglect to advocate effectively for the only education that lifts up people.

Having said that, I asked why, paraphrasing Emerson, we do not turn our rituals and our rhetoric into reality. Over the last 40 years, the professoriate in the humanities has been a mostly silent witness to an atrocity, a huge waste of human resources. According to Maresi Nerad in the "Ph.D.s Ten Years Later" study, in a class of 20 English Ph.D.'s at whatever prestigious institution, three or four will end up with tenure-track positions at selective colleges or research universities. And yet this degree program, and all others in the humanities, pretend that all 20 are preparing for such a life. It’s a Ponzi scheme.

When I led the Woodrow Wilson Foundation, we began a Humanities at Work program, one aspect of which was to give little $2,000 scholarships to doctoral students who had found summer work beyond the academy. A cultural anthropologist at Texas worked at a school for delinquent girls who had been abused as children. She employed dance, folktales, autobiographical writings and a whole range of activities related to her expertise to improve these girls’ self-images. A history student at U.Va. created a freedom summer school for fifth graders in Mississippi, teaching them African American history. Meanwhile, we secured thirty positions at corporations and non-profits for doctoral graduates.

Our point was not to become an employment agency but to suggest that every sector of society, from government to K-12 to business, could benefit hugely by the transferable talents of people who think with complexity, write and speak with clarity, and teach with verve and expertise. We wanted such graduates to comprehend the full range of their own possibilities. Should they then decide to enter academia, at least they would perceive this as a free choice. And in the meantime, the liberal arts would populate every social sector as never before. I do not mean it ironically when I look to the liberal arts takeover of the world.

For that to take place at any level of education, I think, we need to marry intellectual hedonism to the responsibility of the intellectual. If we want our professoriate and our students to apply their learning -- and I do -- if we want them not simply to critique society but to constitute it, we must first acknowledge the simple joy of learning as a prime realistic moment. My dear friend Steve Kunkel is a leading investigator at Michigan of the AIDS virus. He is a fine fellow and I am certain that he would wish to reduce human suffering. But when I call Steve at 7 in the morning at his lab, because I know he will be there already, he is there less out of a humanitarian zeal than because he is crazy about science, the rattiest of lab rats. Just so, when I unpack a poem’s meaning, I experience a compulsive enjoyment. This is half of the truth, and it leads someone like Stanley Fish to scorn the other half by writing a book with the title Save the World on Your Own Time.

I think we can devote some school time to saving the world without prescribing or proscribing the terms of its salvation. Louis Menand, surely no philistine, argues that we need to get over our fear of learning that may brush shoulders with the practical and more generously empower our students. Granted, and granted enthusiastically, academic enclosure, the distancing of a quiet mind from the harsh noise of immediacy, is a great joy, even a necessity in the growth of an individual. But when it becomes the end rather than the instrument, we approach social disaster. We must travel back and forth between the academic grove and the city of social urgencies.

This is to say, and beyond the humanities, that a certain precious isolation — is it a fear? — has kept the fruit of the disciplines within the academy, away even from our near neighbors in the schools. The absence of the disciplines from the public life and the bloating of the disciplines to squeeze out the liberal arts ideal in the colleges are part and parcel of the same phenomenon. It is not that the world rejected the liberal arts but that the liberal arts rejected the world.

In a brilliant article, Douglas Bennett provides a brief history of 20th century college in which he notes an increasingly exclusionary notion of the arts and sciences. And this seems to me part and parcel of the same dubious ethic that so distrusts the messiness of the social world. As I read that we arts and science adepts kept purifying ourselves — education is too messy, throw it out, along with the study of law, along with business, along with anything material (again, “That guy thinks the world is real”) -- I am reminded of Walt Whitman’s critique of Matthew Arnold, whom he termed “one of the dudes of Western literature.” To Arnold, Whitman says, “the dirt is so dirty. But everything comes out of the dirt, everything; everything comes out of the people, the people as you find them and leave them: not university people, not F.F.V. people: people, people, just people.”

The liberal arts became pure and they became puerile. Having greatly expanded the old curriculum by addition and subdivision, they spent the rest of the century apologizing by limiting themselves. They expelled fascinating areas of human endeavor that then came to constitute professional education, and professional education proceeded to eat the libbies’ lunch.

Who or what can teach us to do what Menand urges, empower not only our students but our academic disciplines? The answer, plain as can be, is the sciences. Is it any wonder, given the exclusionary bent of the liberal arts, that scientists, whose subject and whose instruments of investigation are often frankly material, might consider secession, especially when social influence, which is also to say funding, was getting thrown away along with whole areas of crucial consequence?

And by the same token, it is the sciences that can teach the humanities in particular how to reconnect. Indeed, a few moments ago, I was calling for the humanities equivalent of tech transfer; and that is half of my hope for a 21st century renaissance.


By a renaissance in our time — in Weisbuch’s day -- I do not mean the recovery of classical learning and its inclusion in a Christian worldview that marked the original. I want to invoke instead the extreme interdisciplinarity of that time when the arts and sciences came so spectacularly into, if not unity, vital relationship, and when learning and worldliness ceased their contradiction. Here is what I mean. I do not in fact live on the campus of Drew University, but in a town 15 miles away, Montclair, New Jersey. Aside from the filming of some scenes featuring AJ Soprano down the street at our high school, the neighborhood was all too quiet when we moved in, with neighbors at most stiffly waving to one another from a distance. Then Tom and Janet and their three moppets moved in, along with Tom’s insane white limousine, the backyard hockey rink, the Halloween scare show, the whole circus. As Tom started offering the middle-school neighborhood kids “rock-star drop-offs” to school in his limo, everything changed. Some of our houses have large front porches, and neighbors began to congregate on summer evenings. Soon, whenever we lit the barby a few families would turn up with their own dogs and steaks and ask if they could join in. There are about ten families now that assist each other in myriad ways, that laugh together and, when necessary, provide solace and support.

The university can become a porch society in relation to the disciplines. Indeed, for the last 40 years we have been experiencing a loosening of the boundaries, as the prefix “bio” gets attached to the other sciences; as environmental studies unites the life sciences, theology, the physical sciences, public policy, even literary criticism; as Henry Louis Gates, as historian, employs genetic research to revise and complicate the notion of racial heritage. And then there is the huge potential of democratizing knowledge and recombining it through the burst of modern technology, one of whose names, significantly, is the Web.

You cannot intend a zeitgeist but you can capitalize upon one, and this is one. A few simple administrative helps occur to me as examples. We can invite more non-academics to join with us in our thinking about the curriculum. We can require our doctoral students to take some time learning a discipline truly distant from their own rather than requiring the weak cognate or two, and we can take just a few hours to give them a sense of the educational landscape of their country. We can start meeting not with our own kind all the time but across institutional genres, and we can especially cross the divide into public education, not so much by teaching teachers how to teach as by sharing the rich ongoing controversies and discoveries of the living disciplines.

Less grandly, within our own institutions, we can pay a bonus to the most distinguished faculty member in each department who will teach the introductory course and a bigger bonus to those who will teach across disciplines, with the size of the bonus depending upon the perceived distance between the disciplines. We can stop attempting to formulate distribution requirements or core curricula via committees of 200, which is frankly hopeless in terms of conveying the excitement of the liberal arts, and instead let groups of five or ten do their inspired thing, spreading successes. We can create any number of rituals that encourage a porch society. As one new faculty member told me at a Woodrow Wilson conference years ago, “My graduate education prepared me to know one thing, to be, say, the world’s greatest expert on roller coasters. But now in my teaching position, I have to run the whole damn amusement park and I know nothing about the other rides, much less health and safety issues, employment practices, you name it.”

We might name this zeitgeist the whole damn amusement park, but I would suggest a naming in the form of a personification: Barack Obama. When I am fundraising, I often chant something of a mantra, and I ask you to forgive its sloganeering. The new knowledge breaks barriers. The new learning takes it to the streets. The new century is global. And the new America is multi-everything. There you go and here he is. Our fresh new president is indeed international, multi-racial, multi-religious, multi-ethnic, a liberal-arts major and law school grad who became a community organizer and breaks barriers with an ease that seems supernal. He was not required; like the courses we choose freely, he was elected.

Barack Obama was born on an island, and at the start of this essay I mentioned the site of my summer challenge, the Island Without Toilets. Our disciplines are islands. Our campuses are islands. And islands are wonderful and in fact essential as retreats for recuperation. But in the pastoral poems of an earlier Renaissance, the over-busy poet rediscovers his soul in a leafy seclusion but then returns, renewed and renewing, to the city. It is time for us to leave our islands. We are equipped.

Robert Weisbuch

Robert Weisbuch is president of Drew University. This essay is adapted from a talk he gave at the 2009 annual meeting of the American Educational Research Association.

Business and the Relevance of Liberal Arts

At first glance, Peter Drucker might seem an unlikely candidate to have published an academic novel. Famous for writing books such as Concept of the Corporation and The Effective Executive, Drucker was dubbed “The Man Who Invented Management” in his 2005 Business Week obituary. Drucker’s audience was to be found among the Harvard Business Review crowd, not the Modern Language Association coterie, and, not surprisingly, his two novels are no longer in print.

But the university he presented in his 1984 novel, The Temptation to Do Good, confronted some key questions that face higher education institutions in today’s unprecedented financial downturn: Are current practices sustainable? Have we strayed from our core mission? Will the liberal arts survive increasing budget pressures?

As these questions -- hardly the usual literary fare -- demonstrate, Drucker’s work is a rarity among academic novels. These texts typically provide a send-up of academic life, by making fun of intellectual trends through characters such as Jack Gladney, who chairs the department of Hitler studies in Don DeLillo’s White Noise, or by parodying the pettiness of department politics, as in Richard Russo’s Straight Man, in which one English professor’s nose is mangled during a personnel committee meeting, courtesy of a spiral notebook thrown at him by one of his peers. By contrast, The Temptation to Do Good is almost painstakingly earnest in its portrayal of Father Heinz Zimmerman, president of the fictional St. Jerome University.

Like other contemporary academic novels, The Temptation to Do Good depicts the problems of political correctness, the tensions between faculty and administration, and the scandal of inter-office romance. But St. Jerome’s problems are no laughing matter. Lacking the improbable events of other academic novels -- in James Hynes’s The Lecturer’s Tale, the adjunct-protagonist even gains super-human powers -- the plot of The Temptation to Do Good is completely plausible, and the problems above destroy a good man.

St. Jerome’s chemistry department decides not to hire Martin Holloway, a job candidate with a less-than-stellar research record. Feeling sorry for the soon-to-be-unemployed Ph.D., Zimmerman decides to recommend Holloway to the dean of a nearby small college. Zimmerman knows he shouldn’t interfere, but he feels he must do the Christian thing, and so, succumbing to “the temptation to do good,” he makes the call. Meanwhile, Holloway’s angry wife spreads unfounded rumors about a dalliance between the president-priest and his female assistant. The faculty overreact to both events, and although most of them come to regret it, Zimmerman’s presidency is brought down, and he is eased out by the church into a sinecure government position.

Often reading like an intricate case study of one university’s internal politics, The Temptation to Do Good aims to do more than that, raising questions about the purpose of higher education institutions writ large. Representing the contemporary university as a large, bureaucratic institution -- much like the companies that Drucker’s theories would shape -- the novel portrays Zimmerman as a successful executive, one who “converted a cow college in the sticks” into a national university with a reputation unrelated to its religious roots. He even makes the cover of Time magazine for increasing St. Jerome’s endowment by a larger percentage than any other university over the past five years.

Although some faculty recognize, as one physics professor admits, that they wouldn’t be able to do their research without the money he has brought in, many of them are also disenchanted with Father Zimmerman, CEO. The chemistry chair chose to come to St. Jerome because he expected it to be “less corrupted by commercialism and less compromised by the embrace of industry” than other institutions, which he realizes isn’t the case.

“We have a right,” says the chair of modern languages, upset over the abolition of the language requirement, “to expect the President of a Catholic university to stand up for a true liberal education.” In both cases, we see the ideals of a Catholic university being linked to the ideals of a liberal arts education, both focused on a pure devotion to the pursuit of knowledge seen as incompatible with Zimmerman’s expanded professional schools and intimate sense of students’ consumer needs. Can St. Jerome be true to both the liberal arts and the practical, professionalized realm at the same time?

This question is never resolved in the novel, but outside of his fiction writing, Drucker was deeply interested in the practicality of the liberal arts. In his autobiography, he discusses his deep appreciation of Bennington College, a school designed to combine progressive methods -- connecting learning to practical experience -- with the ideas of Robert Hutchins, the University of Chicago president and famed proponent of classical liberal ideals. William Whyte’s sociological classic Organization Man cites Drucker as saying that “the most vocational course a future businessman can take is one in the writing of poetry or short stories.”

Although Drucker was unusual in actually writing novels himself, he was not alone among business thinkers in expressing the values of the liberal arts. Tom Peters and Robert Waterman’s In Search of Excellence: Lessons from America’s Best-Run Companies describes an investment banker who suggests closing business schools and providing students with a “liberal arts literacy” that includes “a broader vision, a sense of history, perspectives from literature and art.”

More recently, Thomas Friedman’s The World is Flat includes a section focusing on the importance of a liberal arts education in the new integrated, global economy. “Encouraging young people early to think horizontally and to connect disparate dots has to be a priority,” writes Friedman, “because this is where and how so much innovation happens. And first you need dots to connect. And to me that means a liberal arts education.”

Books like Rolf Jensen’s The Dream Society: How the Coming Shift from Information to Imagination will Transform Your Business, Joseph Pine II and James H. Gilmore’s The Experience Economy: Work is Theatre and Every Business a Stage, Daniel H. Pink’s A Whole New Mind: Why Right Brainers Will Rule the Future, and Richard Lanham’s The Economics of Attention: Style and Substance in the Information Age make these points more specifically, often showing how certain “literary” skills, such as storytelling and empathy, are crucial to success in the current time.

Of the authors mentioned above, only Lanham is a humanities professor, and he works in a field (rhetoric) largely out of scholarly vogue today. “Let’s go back to the subject of English a moment. Of all subjects none is potentially more useful,” Whyte writes. “That English is being slighted by business and students alike does not speak well of business. But neither does it speak well for English departments.”

What’s significant about Whyte’s account -- along with that of Drucker, Friedman, and others -- is that none of them claim that colleges and universities should merely churn out students of technical writing or focus on the practicality of the composition course; instead they want students to think about narrative complexity and story-telling through the liberal arts. Whyte himself focuses on the study of Shakespeare and Charles Lamb.

However, instead of embracing these potential real-world allies, liberal arts disciplines have seemed to withdraw, letting others become the experts in -- and proponents of -- the relevance of their subjects. Consider, for example, that in January 2008, one of the most famous English professors in the world proclaimed on his New York Times blog that the study of literature is useless. Asserting that the humanities don’t do anything but give us pleasure, Stanley Fish wrote that, “To the question of ‘what use are the humanities?’ the only honest answer is none whatsoever.” The arts and humanities, Fish contended, won’t get you a job, make you a well-rounded citizen, or ennoble you in any way.

Not surprisingly, readers were appalled. Within the next 48 hours, 484 comments were posted online, most of them critical of Fish. The majority of these comments, from a mix of scientists, humanists, business people, and artists, could be divided into two categories: first, the humanities are useful because they provide critical thinking skills that are useful for doing your job, whether you’re a doctor or CEO; and second, the humanities are useful for more than just your job, whether that means being a more informed citizen or simply a more interesting conversationalist.

However, perhaps the most fascinating comments came from those who recognized Fish’s stance as a professional one: in other words, one that relates to attitudes toward the humanities held by practitioners inside the academy (professors), as distinct from those held by general educated readers outside it (the Times audience). “Let’s not conflate some academics -- those who have professionalized their relationship with the humanities to the point of careerist cynicism -- with those [...] still capable of a genuine relationship to the humanities,” said one reader. Another added that the “humanities have been taken over by careerists, who speak and write only for each other.”

In other words, while readers defend the liberal arts’ relevance, scholars, who are busy writing specialized scholarship for one another, simply aren’t making the case. This was an interesting debate when Fish wrote his column over a year ago; now in 2009, we should consider it an urgent one.

Traditionally, economic downturns are accompanied by declines in the liberal arts, and with today’s unparalleled budget pressures, higher education institutions will need to scrutinize the purpose of everything they do as never before. Drucker’s academic novel provides an illustrative example of the liberal arts at work: as Fish’s readers would point out, literature can raise theoretical questions that help us understand very practical issues.

To be sure, the liberal arts are at least partly valuable because they are removed from practical utility as conceived in business; the return on investment from a novel can’t be directly tied to whether it improves the reader’s bottom line.

But justifiable concerns among scholars that the liberal arts will become only about utility have driven the academy too far in the opposite direction. Within higher education, we acknowledge that the writing skills gained in an English seminar might help alumni craft corporate memos, but it is outside higher education where the liveliest conversations about the liberal arts’ richer benefits -- empathic skills and narrative analysis, for example -- to the practical world seem to occur.

Drucker and his successors may be raising the right questions, but these discussions should be equally led by those professionally trained in the disciplines at hand. In today’s economic climate, it may become more important than ever for the liberal arts to mount a strong defense -- let’s not leave it entirely in the hands of others.

Melanie Ho

Melanie Ho is a higher education consultant in Washington. She has taught literature, writing and leadership development courses at the University of California at Los Angeles.

Cartoon Conservatism

The idea of "Little Orphan Annie" as a historical document full of clues to contemporary American political culture is not, perhaps, self-evident. Many of us remember the comic strip, if at all, primarily as the inspiration for a long-running Broadway musical; the latter being a genre of which I, for one, have an irrational fear. (If there is a hell, it has a chorus line.)

Yet there is a case to make for Annie as an ancestor of Joe the Plumber, and not just because both are fictional characters. The two volumes, so far, of The Complete Little Orphan Annie (issued last year by IDW Publishing in its "Library of American Comics" series) come with introductory essays by Jeet Heer, a graduate student in history at York University, in Toronto, who finds in the cartoonist Harold Gray one of the overlooked founding fathers of the American conservative movement. Heer contends that the adventures of the scrappy waif reflect a strain of right-wing populism that rejected the New Deal. He is now at work on a dissertation called "Letters to Orphan Annie: The Emergence of Conservative Populism in American Popular Culture, 1924-1968."

Heer is the co-editor, with Kent Worcester, of Arguing Comics: Literary Masters on a Popular Medium (2004) and A Comics Studies Reader (2009), both published by the University Press of Mississippi. I recently interviewed him about his work by e-mail. A transcript of that exchange follows.

Q: You've co-edited a couple of anthologies of writings on the critical reception of comics and are now at work on a dissertation about one iconic strip, "Little Orphan Annie." Suppose a cultural mandarin like George Steiner challenged the whole notion of "comics studies" as manifesting a trivial interest in ephemeral entertainments on rotting newsprint. In the name of what values would you defend your work?

A: Since I think George Steiner is a fraudulent windbag, he’s perhaps a bad hypothetical example. But let’s talk about some genuine mandarins, rather than those who just put on airs. I came to comics studies partially as a lifelong reader of comics (after my family immigrated to Canada from India I learned to read English by deciphering Archie comics as if they were hieroglyphics) but also intellectually via high modernism. As a graduate student, I was fascinated by mid-century Catholic intellectuals who did so much to inform our understanding of modernism (Marshall McLuhan, Walter Ong, Hugh Kenner). Erudite as all get out and working to reconcile Catholicism with modernity, these thinkers constantly emphasized that the great modernists (Joyce, Eliot, Pound) were deeply shaped by modern mass culture (Joyce kept a copy of the comic strip "Gasoline Alley" on his mantelshelf and stuffed Finnegans Wake with countless allusions to comics). McLuhan and company taught me that high and low culture don’t exist in hermetically sealed compartments but rather are part of an organic, mutually enriching, conversation: Culture is not an exclusive club, it’s a rent party where anyone can join in and dance.

Aesthetically, I’d argue that the best comics (Herriman’s "Krazy Kat," Art Spiegelman’s "Maus," Lynda Barry’s "Ernie Pook Comeeks") are as good as anything being done in the fine arts or literature. Most comics aren’t as good as "Krazy Kat," of course, but the sheer popularity and longevity of ordinary comics like "Archie" or "Blondie" makes them historically and sociologically interesting. "Little Orphan Annie" is a good example: although it is more than ordinary as a work of art, it is also historically fascinating, since it helped reshape conservatism in America, giving birth in the 1930s to a form of cultural populism that you can still see on Fox News. Read by millions (including politicians like Clare Boothe Luce, Jesse Helms, and Ronald Reagan), Orphan Annie has a political significance that makes it worth studying.

Finally, comics are very interesting on a theoretical level. Comics involve a fusion of words and pictures (this is true even of pantomime strips, where we “read” the images as well as look at them). Therefore, comics are inherently hybrid, existing at the crossroads between literature and the fine arts. As the French theorist Thierry Groensteen has noted, the hybrid nature of comics makes them a scandal to the “ideology of purity” that has long dominated art theory (i.e., philosophers and critics ranging from G.E. Lessing to Clement Greenberg). The best writing on comics (a sampling of which can be found in A Comics Studies Reader) grapples with the formal issues raised by hybridity: How can words and pictures interact in the same work? What’s the relationship between seeing and reading? Do visual artifacts have their own language? These are all very challenging questions, which makes comics studies an exciting field.

Q: In addition to two deluxe volumes of the complete "Little Orphan Annie" from the 1920s (with more on the way), you have put together a collection of the proto-surrealist strip "Krazy Kat." Would you describe the process of assembling this sort of edition? It seems like the work would be as much curatorial as editorial.

A: Until fairly recently, most cartoonists didn’t keep their original art, which meant that reprints of old comic strips and comic books had to be shot from the published work (often yellowing old newsprint). This meant that the art often looked like photocopies of 1970s vintage: smudgy and muddy, frequently off-register.

In the last decade, thanks to digital technology, it’s become much easier to clean up old art and restore it to how it originally looked (the cost of printing in color has also gone down). I’m lucky to work with a group of publishers that are willing to put in the hours necessary to do the restoration work. I’ll single out Pete Maresca, whose books Sundays with Walt and Skeezix and Little Nemo in Slumberland: So Many Splendid Sundays reprint old Sunday pages at the exact size they originally appeared, with the dimensions of a newspaper broadsheet. Pete is meticulous in trying to restore the colors to their original form (the strips were, among other things, a marvel of engraving craftsmanship brought to the United States by German immigrants). To do so, Pete often has to spend a week or more on each page, in effect taking longer on the restoration work than the cartoonist took in drawing the page.

This is not a project I'm directly involved with, but my publisher Fantagraphics recently published an amazing edition of Humbug (a sophisticated humor magazine from the 1950s edited by Mad magazine founder Harvey Kurtzman). Paul Baresh, who also works on the "Krazy Kat" books, did a remarkable job, equal perhaps to someone cleaning up the icons on a medieval church, in restoring the original art. He talks about the production process here.

Q: We'll get to your dissertation's focus on the ideological dimension of character and plot in "Little Orphan Annie" in a moment. But first let me ask about the artwork. As someone who has studied comics closely, do you see anything innovative or distinctive in its visual style? Also, what's the deal with Annie's eyes? It looks like she just got back from a rave....

A: The earliest newspaper cartoonists mostly came out of Victorian magazine illustrations, which meant their art tended to be florid, dense with decoration and unnecessary details. Harold Gray, Annie’s creator, belonged to the second generation of comic strip artists who did art that was sensitive to the fact that newspaper drawings didn’t need to be so elaborate, indeed that simpler art was more effective because it pushed the narrative along quicker. Gray’s great gift was in character design. In her pilgrim's progress through the world Annie meets all sorts of people, ranging from decent, hard-working farmers to sinister, hoity-toity pseudo-aristocrats. Gray was able to distill the essence of each character so that you can tell, at a glance, that the farmers were care-worn and dowdy but decent, while Mrs. Bleating-Hart (the Eleanor Roosevelt stand-in) was pompous and exploitative.

The best description of Gray’s art I’ve ever seen was written by the 15-year-old fan John Updike, in a letter I found in Gray’s papers at Boston University. “The drawing is simple and clear, but extremely effective,” Updike wrote. “You could tell just by looking at the faces who is the trouble maker and who isn't, without any dialogue. The facial features, the big, blunt fingered hands, the way you handle light and shadows are all excellently done." Updike’s reference to “light and shadows” refers to Gray’s other skill, in creating mood and atmosphere. Annie lives in a dark, somber, gothic world, where evil blank eyes are always peering out of windows.

Annie’s blank eyeballs were a convention Gray inherited from his artistic mentor Sidney Smith, who did "The Gumps." But artistically, as Gray explained to a fan in 1959, the blank eyeballs served to enhance reader involvement with the strip: not seeing what is going on in the eyes of the characters, readers could impose their own fears and concerns into the narrative. Recent comics theorists, most famously Scott McCloud in his frequently-cited book Understanding Comics, have argued that blank, empty characters (Charlie Brown, Mickey Mouse) are easier to identify with. Gray seems to have understood that instinctively.

Q: You argue that from its start in the mid-1920s the strip manifests a strain of conservative populism. The honest, hard-working, "just folks" Annie makes her indomitable way in a world full of elitists, social-climbing poseurs, and pointy-headed do-gooders. How did the strip respond to the economic catastrophe of 1929 and the New Deal that came in its wake?

A: While big business Republicans like Herbert Hoover were politically vanquished by the Great Depression, Harold Gray actually prospered during the 1930s, with Annie becoming the star of the most popular radio show of the decade. How can we explain this, given that Gray was as much an advocate of two-fisted capitalism as Hoover?

Whatever the merits of Hoover’s policies, the President was tone deaf in responding to the Depression because he adopted a harsh rhetoric that denied the reality of poverty. “Nobody is actually starving,” Hoover said as millions had to line up in soup kitchens. “The hoboes, for example, are better fed than they ever have been. One hobo in New York got ten meals in one day.”

Orphan Annie and “Daddy” Warbucks never voiced such complacently unfeeling indifference to poverty. Annie was poor even in the prosperous 1920s, often living as a hobo and begging for food when separated from her capitalist guardian Warbucks. In the 1920s Gray was a progressive Republican in the tradition of Theodore Roosevelt: He praised labour unions, public schools, feminist reforms (Annie dreams of being President like her hero Lincoln), and mocked anti-socialist rhetoric. In reaction to the New Deal, Gray became much more of a partisan right winger, turning the template of his story (Annie and Warbucks battling against powerful and corrupt forces) into an explicitly conservative populist allegory.

In 1931, Daddy Warbucks loses his fortune to unscrupulous Wall Street speculators, is blinded, and lives for a time as a street beggar. But after hitting bottom he regains his fighting spirit and outwits the Wall Street sharks who brought him and America low. By 1932, the villains in the strip are increasingly identified with the political left: snide bohemian intellectuals who mock traditional values, upper-crust class traitors who give money to communists, officious bureaucrats who hamper big business, corrupt labour union leaders who sabotage industry, demagogic politicians who stir up class envy in order to win elections, and busybody social workers who won’t let a poor orphan girl work for a living because of their silly child labor laws. Gray started to identify liberalism with elitism, a potent bit of political framing which continues to shape political discourse in America today.

Q: What have you learned from going through the archives, including the cartoonist's fan mail? What does it tell you about how people responded to his politics? Were there people who supported FDR but still rooted for the plucky little orphan?

A: Reading Gray’s correspondence with his fans was what made me fall in love with this project. There was such a rich array of letters from such a wide spectrum of readers: some from little kids, some from famous or soon-to-be famous people (as mentioned, John Updike and Clare Boothe Luce, but also the journalist Pete Hamill), most from ordinary, run-of-the-mill but often eloquent adults. Politically the letters are all over the place; some readers loved the way Gray attacked liberals, but many readers (Updike and Hamill are good examples) were New Deal supporters. One such reader wrote that we love Annie because she’s just plain folks like the rest of us, and that Gray should stop ruining her stories by attacking President Roosevelt, who was trying to help the real Annies.

What I’ve learned is that people don’t read comics in a passive way: Many Annie readers were bringing to the strip their own life experiences and worldviews. This really helps us understand the way comics can weave themselves into the everyday life of readers.

One example close to my heart: In 1942 Annie forms a group called the Junior Commandos to help the war effort by collecting scrap metal. One of the Junior Commandos is an African-American boy named George, who is shown to be intelligent and resourceful. Rare for the time, George was drawn in a realistic, non-stereotypical way. Gray received many letters from black readers, praising him for showing that their race was contributing to winning the war (although some black readers also felt George was a little bit too servile). Gray also received a letter from the editor of a newspaper in Mobile, Alabama, who was upset that a white girl was shown consorting with a black boy. In these letters, we can see how Annie provoked discussion about wartime racial politics.

Q: How did the strip respond to the Sixties? Did Warbucks support the Goldwater campaign? Was Annie menaced by hippies?

A: With the rise of “movement conservatism” and the Goldwater campaign, Gray responded to the times by making Daddy Warbucks and his allies even more militantly anti-communist than before (mind you, the strip had featured communist villains since the early 1930s). Dissatisfied with the mealy-mouthed diplomacy of the State Department, Daddy Warbucks and his private army fight a Castro-style Latin American dictator.

Throughout the 1960s beatniks and hippies are cast as villains. As one sympathetic character complains, “I wonder why we see all these peculiar people nowadays” like “th’ beatnik types, th’ ones with long hair, the ones with beads and funny clothes.” There is a fascinating sequence in 1967 showing anti-war protesters burning American flags, and then being attacked by a group of patriotic immigrants from around the world who love America. “We are loyal Americans defending our flag!” one immigrant proclaims. “What are you unclean vermin?” In some ways, this episode prefigures the hardhats versus hippies drama that Rick Perlstein describes in his great book Nixonland.

Like the conservative writers at National Review and the Republican Party itself, Gray also became more sympathetic to the South as a region, seeing it as a bastion of traditional values. Many of Annie’s adventures in the 1960s are set in the South, although the issue of civil rights is scrupulously avoided. This was a big shift for Gray, who in the 1920s started off as a Lincoln Republican (his middle name was even Lincoln), with Annie explicitly and implicitly criticizing racism. We can see the emergence of the “Southern strategy” in Annie.

Q: In the dissertation, you draw a parallel between Gray's populist sensibility and the work of Willmoore Kendall, the right-wing political philosopher. Actually that's where you hooked my attention -- very few people remember Kendall, let alone write about him (though Garry Wills portrays him in Confessions of a Conservative and Kendall is the inspiration for the title character of Saul Bellow's story "Mosby's Memoirs"). Since you aren't arguing that the thinker influenced the cartoonist or vice versa, how do you account for the affinity between Kendall's take on John Locke and Little Orphan Annie?

A: Kendall is a fascinating figure, deserving much more attention than he’s received (although John Judis in his biography of William F. Buckley does a good job of describing Kendall’s pivotal role as mentor to the founder of National Review). Prior to Kendall’s pathbreaking work (he flourished as a thinker from the 1940s till his death in 1968) conservative intellectuals were almost always openly elitist and anti-democratic: think of T.S. Eliot’s royalism, Albert Jay Nock’s pinning his hope on the “saving remnant,” H.L. Mencken’s Nietzsche-inspired scorn for the booboisie, F.A. Hayek’s belief that the courts should be used to curb the rise of the welfare state.

Kendall broke with this tradition of celebrating hierarchy and fearing the masses. He firmly believed that the vast majority of the American people were “conservative in their hips” and that the American political institutions were designed not to thwart the will of the majority but to articulate the deeply held conservative principles of the masses. As a conservative who was closer on a theoretical level to Jean-Jacques Rousseau than to Edmund Burke, Kendall recast the language of American populism in an anti-liberal direction. To his mind, liberals were “the establishment” which needed to be overthrown. As the historian George Nash noted, Kendall was “a populist and a conservative. The contrast with much aristocratic, even explicitly antipopulist, conservatism in the postwar period was striking.”

The reason Kendall’s in the thesis is that the overwhelming majority of the literature on mid-century American conservatism deals with elite political and intellectual figures like Buckley, James Burnham, Whittaker Chambers, Richard Nixon, Barry Goldwater, etc. Historians haven’t done such a good job at locating the origins of conservative ideas in the broader culture, in movies, songs, and comic strips. In drawing parallels between Kendall’s worldview and the ideas that were earlier articulated in Annie, I’m trying to show that high and low culture don’t exist in isolation, but are part of a larger conversation with common ideas and images percolating up and down the line. Consider the phrase “egg-head,” which Kendall often used when insulting liberal professors (a rather cheeky term since he himself was a Yale man).

Long before the phrase “egg-head” was coined, cartoonists like Gray drew oval-faced professors who lacked common sense and sneered at practical-minded businessmen. I don’t know whether Kendall read Annie or not (although some of his colleagues at National Review clearly did since they wrote about her in their magazine). But it seems to me incontestable that Kendall and Gray shared an overlapping worldview, and can usefully be compared. The fact that Gray’s conservative populism preceded Kendall’s work by a decade also raises interesting questions as to whether elite intellectuals are always at the vanguard of ideological change.

Q: When cultural studies began implanting itself in American academic life about 20 years ago, there was a strong tendency to discover and celebrate the "subversive" and "emancipatory" aspects of popular culture. There was some blend of wishful thinking and willful ignorance about this, at times -- along with a narrow present-mindedness that tended to ignore popular culture from earlier decades, or to look only at things that seemed "counter-hegemonic" in comforting ways. Do you see your work as an explicit challenge to that sort of cultural studies, with its ahistorical perspective and cookie-cutter hermeneutics? Or do you understand what you are doing as part of the "cultural turn" within the historical profession itself?

A: I completely agree with your characterization of early American cultural studies, especially in the form it took in the 1980s and 1990s. The whole “Madonna is subversive” schtick exhausted whatever limited value it had very quickly. So yes, I hope my work challenges the limits of this mode of thinking by being more historical, more grounded in archival research, and attentive to the divergent political voices found in popular culture. One of the great things about working in archives is that the very diversity of voices you find in the past (as in the letters to Orphan Annie I’m working with) forces you to rethink any “ahistorical perspective” or “cookie-cutter hermeneutics” you may have started off with.

Having said that, I wouldn’t be able to do the work I do without the opening created by cultural studies. One of the points Kent Worcester and I make in our two anthologies is that there was a wide variety of very interesting writers (ranging from Gershon Legman to Thomas Mann) who wrote on comics in the past but it was only with the advent of cultural studies that comics were able to find a secure home in the academy, with an infrastructure of journals, conferences, and library support. Cultural studies has greatly expanded the academic opportunities for anyone interested in popular culture.

My own discipline of history has been transformed by cultural studies. As you properly note, there has been a “cultural turn” in history. To my way of thinking, this “cultural turn” can be traced back to the original British New Left of the 1950s, and especially the writings of E.P. Thompson. My own work might seem far afield from Thompson’s epic work on the British working class, the moral economy of food riots, and the politics of Romantic poetry. Still, for all that his work has been criticized and challenged in the last few decades, it remains for me the best example of how to do cultural history. Thompson had a great ear: he could pick up nuances from the past that other historians were simply too tone-deaf to hear. The voices of ordinary people, in all their tangled complexity, came through in Thompson’s work. As more historians grapple with culture, Thompson remains the model to follow. I doubt if my work has anywhere near the value of Thompson’s, but as a close friend always tells me, you have to aim high.

Scott McLemee

Requiring Revision

Over dinner party talk about my work directing a university writing center, a friend remarked that while reading a British edition of Harry Potter to her kids, she came across passages where revise meant study, as in Harry and Ron revised all night for the potions exam.

She asked whether I was familiar with that usage. I wasn't. But I had, in recent weeks, been mulling over the ways that faculty across the disciplines define revision.

My university, like many others that have formal writing-intensive (W) requirements, demands that W course instructors assign a minimum number of pages (in our case 15) and build in a deliberate process for revision. In a climate of budget cutting, when everything is under scrutiny, some faculty have been asking: Should we keep the current requirement of two Ws? Can we afford to keep them capped at 19 when enrollment caps for so many other courses are rising? Are labor-intensive W courses really the best use of faculty time? Are the W courses working?

The debate has revealed a range of unstated assumptions about what revision is and how it should be taught. Two of the more vexing assumptions -- held by a few in favor of the W requirement and a few critical of it -- strike me as especially persistent: that revision is about correcting student deficiencies and that requiring revision breeds dependency.

Conflating revision with correction is quite natural: Students submit (usually flawed) drafts; faculty prescribe how to fix them; and students fix the flaws. Such a process, as anyone who has worked with a skilled editor knows, may not always be fun but it leads to a better final product.

The problem is that the ultimate aims of editing and teaching are different: editors want better writing; teachers may want that too, but they ultimately want better writers.

Certainly students can learn a great deal by following the lead of a good editor, but when teachers slip into editor mode, students in turn focus on delivering what the teacher/editor wants more than on either learning or inquiry. Consider the extreme version (but I've seen it happen): a student submits a draft electronically; a dedicated teacher makes extensive, time-consuming edits in Track Changes; and the student scans the first few edits and then hits the "Accept All" button. Revision done.

The lesson here is not that we need to force students to march through a correction process more deliberately. It is that we need to craft our responses to drafts in ways that encourage students to take responsibility for their own texts.

In practical terms this may mean following some of the pedagogical recommendations of writing across the curriculum experts: when students submit drafts, require them to include cover letters that articulate their own revision plans; attend to macro-level matters such as purpose and argument early in the writing process and sentence-level errors later; rather than copyedit start to finish, line edit only a small portion of a draft, noting patterns of error and leaving the rest of the editing for the writer; balance critique for what isn't working with praise for what is; invite writers to focus on just two or three manageable priorities for revision; and so on.

This does not mean that we should shy from pointing out flaws; nor does it mean that we should avoid giving direct advice on matters large and small. But it does mean that we should guard against complicity in an "I'll tell you what's wrong and you fix it" transaction.

The ubiquity of the fix-it orientation may help explain one finding from a recent assessment of student writing done at our university. By collecting course syllabi and student final papers from W courses in four departments, we discovered much good news: that faculty are assigning long, research-driven papers on challenging topics and are requiring drafts; and that over 90 percent of the papers met at least minimal proficiency for undergraduate writing as judged by faculty and graduate students who scored anonymous papers from their home departments.

However, we also discovered that instructor grades were, on average, more than a full letter grade higher than the quality scores given to those same papers. Grade inflation, the stripping of context, and a number of other factors may explain that disparity, but part of the reason for high grades may also be that well-intentioned students and teachers are tacitly locked into a correction mode of revision: students draft, teachers point out what to fix and how to fix it, and students correct. Grades ratchet up with each draft as the system rewards compliance above all.

Is the alternative then to have students work more independently? So think many who believe that requiring revision breeds dependency and that the job of faculty is to wean students from such dependency.

The best students, the logic goes, need little or no help drafting and revising, so requiring them to revise is at best unnecessary and at worst infantilizing. The worst students tend to submit dashed-off drafts, trusting that faculty will essentially write the paper for them, which turns revision into an empty exercise. This leaves a small slice of motivated but flawed writers who will really benefit from teacher-assisted revision, and they can always come to office hours.

We all want students to be independent learners and to take responsibility for their own education, but that does not mean that the best writers draft, revise and edit on their own. That may work for some but it does not characterize the process of most successful writers, academic or otherwise. We know that writing demands stretches of solitary work but we also know that writers who are willing to share their work early and often typically do better than those who muscle it out entirely on their own.

The aspiration we should have for student writers is not independence as much as interdependence.

Teaching revision as encouraging interdependence does not mean withholding critique or going soft on students. But it does mean that, rather than merely delivering prescriptions or justifying grades, teacher comments on drafts should challenge writers with options and spark further conversation. Only then can we leverage what sets extended writing assignments apart from other modes of assessment, such as exams: that by working across drafts and with others, writers can, within the bounds of academic expectations, walk their own paths through the material, making their own connections and claims along the way. If what we really want is coverage and correction, better that we stick to exams.

A policy that requires revision should be justified not on the grounds that students need remediation but on the reality that scholarly writing emerges from a condition of interdependence, a process that typically includes the guidance of mentors and sharing of drafts as well as peer review and directive editing. Apprentice scholars deserve some approximation of that experience.

Tom Deans

Tom Deans is associate professor of English and director of the University Writing Center at the University of Connecticut.

Poverty Studies

One favorable outcome of the current economic crisis might be that literary studies finally puts poverty near the top of the agenda and the center of the field. A few years ago, Hurricane Katrina reminded the nation about Americans living in poverty, and it seemed then, to some of us in literary studies who write about poverty, a possible turning point in critical priorities. But it was not to be. Though important work from literary critics on the subject of the poor has come out since then, especially Gavin Jones’s American Hungers (2007) and Walter Benn Michaels’s The Trouble with Diversity (2007), it is perhaps not surprising that the suffering of the poor, even when it temporarily comes to light spectacularly, is not enough to prompt such a major change of direction in the professional discourse.

The engine of “cultural studies” has incredible momentum, and there is a concomitant tendency for the cultural or identity issues of race, gender, sexuality, and even class to subordinate that of poverty. To take just one example, a 2007 post-Katrina book, published by a major university press and called Slumming in New York, can nearly wipe away the poverty problem in 19th-century New York in a single sentence unsupported by historical evidence: the author writes, “Unlike many European cities, New York in the 19th century promoted itself as a city free from rigid economic class distinctions. In some ways I believe this was true and that the more destructive conflict was fought over cultural legitimacy and representation.”

One of the reasons that the important issues of race, gender, and sexuality have had such traction in English departments is of course that English departments have numerous professors who have suffered and indeed continue to experience racial, gender, and sexual-preference discrimination or prejudice, and so the profession’s investment in the issues is not merely academic. Meanwhile, English departments have had very few tenured professors who have come out of poverty and, by definition one might have supposed, none who were still living in poverty — literary-critical poverty studies has had almost no “insider” advocates. Or at least such was the case until the current economic crisis. While tenure-track professors may not be experiencing true poverty, many are facing furloughs and pay and benefit cuts that will indeed have a real impact on their standards of living. And adjuncts – some of whom are indeed living in poverty – are losing positions all over the country.

Acute socioeconomic suffering has now hit home, or threatens to, among university faculty: not only English department instructors and adjuncts but even some tenure-track and tenured professors are facing or anticipating economic difficulties that make the poverty issue less academic, less other.

This is a terrible moment for many people, and it has reminded many of us, in the most painful way, that socioeconomic suffering is not merely the others’ problem. Let’s take this crisis as an opportunity to put poverty on the front burner in our profession, along with race and gender.

Such a change may be especially possible now, given also, during the same moment, the election of the nation’s first African-American president. At this historical conjuncture, it is apparent that the nation has made progress on the problem of racial discrimination that it has not made on that of socioeconomic privation. And yet, of course, these problems are related, and blacks still suffer from poverty out of proportion to their numbers in the population.

We English professors might take a hint from other disciplines. The day after the election, sociology professor Orlando Patterson of Harvard University discussed on television the public triumph of the first African-American president-elect and the continuing private or social isolation of poor African Americans (he talked, for example, about de facto school segregation, more intense than the segregation that existed in the 1970s, and disproportionately high rates of incarceration for blacks).

We might also take a cue from the writers we study and teach. As Gavin Jones has reminded us, many of our great American writers, black and white, women and men, have been concerned with poverty in the United States, including Herman Melville, Edith Wharton, Theodore Dreiser, Stephen Crane, Richard Wright, and James Agee. And, I would add, some of the great American writers who wrote about the poor have in addition come out of poverty, such as Jacob Riis, Zora Neale Hurston, Richard Wright, and Claude Brown.

English departments have done tremendous social good by methodically studying issues of race, gender, and sexuality, good that has gone even beyond raising consciousness and changing attitudes among students; they have also made it a priority to hire minorities, women, and gays and lesbians. Can we English professors make similar contributions to addressing the ongoing poverty problem? Can we take a leading role in promoting poverty studies and affirmative action for the economically disadvantaged? Poverty is a problem, of course, that won’t go away when this economic crisis has passed, but this crisis might leave the literary profession more connected to it.

Keith Gandal

Keith Gandal, professor of English at Northern Illinois University, is the author of The Virtues of the Vicious: Jacob Riis, Stephen Crane, and the Spectacle of the Slum (Oxford University Press).

The Dangling Man

The novelist and critic Isaac Rosenfeld died of a heart attack in 1956. He and Saul Bellow had been friends since childhood, and they had arrived on the literary and intellectual scene of the early 1940s as a team, “the Chicago Dostoevskians,” with Rosenfeld’s reviews and short stories making the larger initial impression on anyone paying attention to the world of little magazines.

In later years any lingering affection between them was cut with traces of bitterness, for Bellow went from triumph to triumph, while Rosenfeld drifted, turning out the occasional brilliant short piece while unpublished manuscripts piled up. I do not think the word “adjunct” was in wide use at the time, but Rosenfeld got by with dead-end positions at the University of Minnesota and the University of Chicago.

By the end of Rosenfeld’s life, Bellow wrote, his friend was living in “a hideous cellar room” from which any hint of bohemian glamor had long since fled. He had, Bellow wrote, “one of those ready, lively, clear minds that see the relevant thing immediately.” But Rosenfeld’s cutting lucidity left him filled with scorn for any motive involving the pursuit of success, let alone propriety. Bellow wrote that his friend “seemed occasionally to be trying to achieve by will, by fiat, the openness of heart and devotion to truth without which a human existence must be utterly senseless.”

He imposed a grim discipline on himself, a kind of squalid asceticism. To the naked eye it looked like failure. When he died in a shabby apartment, Rosenfeld was 38 years old.

As it happens, I was exactly half that age when, in the early 1980s, I discovered An Age of Enormity, the posthumous collection of his reviews and essays that, while long out of print, remains the single best introduction to his work and to his legend. To a teenager, of course, exiting the world in your late thirties hardly seems to qualify as a youthful death. But his example loomed in my imagination for many years as essentially heroic. Rosenfeld’s intransigence, his disdain for the gods of success, was somehow inspiring, albeit in ways that have not done me very much good over the long term. (In general it may be unwise for young people to accept career advice from the dead.)

The first book devoted to his life and work is Rosenfeld’s Lives: Fame, Oblivion, and the Furies of Writing (Yale University Press) by Steven J. Zipperstein, a professor of Jewish history and culture at Stanford University. My review of the biography appeared elsewhere; here it bears adding something: Zipperstein makes very clear the resentments and ressentiment of Rosenfeld’s final years, so that even a callow adolescent could not fail to understand their misery.

Point taken. Yet the legend of Rosenfeld, his aura as beautiful loser, still exercises a certain power over my imagination even after more than a quarter of a century.

It is no doubt understandable and fitting that Zipperstein should treat Rosenfeld’s life as a chapter in the story of Jewish-American ambivalence about assimilation. That does not seem to be the basis of any elective affinity in my case, however, unless growing up in a small Southern Baptist town is comparable to life in a shtetl, which does seem like stretching it.

The grounds for this continuing fascination became clearer after looking, once again, at the title essay in George Scialabba’s new collection What Are Intellectuals Good For?

Scialabba does not simply repeat standard complaints about the decline of free-range public intellectuals and the rise of transgressive professorial jargonization. (That is a familiar story, even perhaps too familiar.) Scialabba points, rather, to the role played by a “new variety or mutation” of thinker in the “modern, efficient machinery of persuasion” necessary to hold highly developed societies together. Scialabba calls this type “the anti-public intellectual, whose function is not criticism, not defense of the public against private or state power, but the opposite.… As a result of the intellectuals’ incorporation en masse into the ‘power elite,’ it now requires far more training, leisure, and resources to penetrate the screen of corporate or government propaganda….”

And so the critic must redouble his efforts at challenging the arts of public manipulation, however Sisyphean those efforts may be. The boulder will crash through the screen every so often, with enough luck and a good aim.

Such is the proper role of the intellectual, as Scialabba reminds us; and most of the time I would not argue otherwise. But it does not exhaust the options.

A different response can be found in a talk that Isaac Rosenfeld gave to the staff of The Chicago Review in the spring of 1956, a few months before his death. The talk was recorded, and a transcript appeared in the Review the following year under the title “On the Role of the Writer and the Little Magazine.” It was not reprinted in his collected essays, or anyplace else, it seems, which is strange because it counts as Rosenfeld’s final testament.

“I am used to thinking," he told his listeners, “because of my upbringing, of the writer standing at one extreme from society … over against the commercial culture, the business enterprise, the whole fantastic make-believe world which some people would like for us to believe is the real world. Of course it can’t be that for the writer.”

This condition of extremity had once been made tolerable by the solidarity of peers who were in the same condition. There was an avant garde – a culture apart, marginal but tough, its spirits fortified and even lifted by the experience of rejection. It was “a small but vigorous and very vital, active and conscious group,” said Rosenfeld, “which knew fairly well the sort of thing it stood for even if it had no specific program and whether or not it had any political allegiance.”

But now the vanguard was dead, or at least domesticated, and the writer was constantly solicited to assume a role in “the symbol manipulation industries,” whether in academe or government or the media. You had to go along to get along. After all, even bohemia was expensive. As psychic defense and compensation, there emerged a spirit of aloofness -- not just about your job, but towards life itself.

It led to “embarrassment with human subject matter,” said Rosenfeld, the desperate cultivation of a “flair for the abstract… for the ‘cool.’ ” This sensibility tolerated expression of “nothing too immediate, too direct or emotional, because that would be considered ‘square’ or ‘frantic.’ ”

(People now assume that prepackaged irony was invented sometime within, at most, the past couple of decades. Not so. The marketing has just improved.)

For Rosenfeld, this amounted to creative death, disguised as a lifestyle. No serious writer could indulge it. He had been a political radical in younger days -- and more recently a psychoanalytic radical, following Wilhelm Reich’s call for sexual revolution. But this, his final profession of faith, did not call on writers to play the role of intellectual activist, challenging the Empire’s mandarins on their own ground. The role of the writer, he said, was not to play a role at all. “Playing a role” was exactly the problem.

Writers had to earn an income, and working in academe was one option. (About teaching, the transcript shows that Rosenfeld twice said, simply, “It’s a living.”) But for any serious writer there was no escape from the need, if necessary, to go it alone -- to trust one’s sense of the important, even if no committee welcomes the effort. Intuition had to be developed, not ambition.

The writer "will have to play,” said Rosenfeld, “the role that is not a role; to be the living man, the one left alone at three o’clock in the morning, when it’s always the dark night of the soul; to be the man whom one encounters when there is no longer any uniform to wear… to be the man who is naked, who is alone, and the man who pretty much of the time is afraid: the man who sees himself as he really is in this flesh and in these bones and in these feelings, in these impulses, in these emotions; the man who confronts himself in his dreams and his reveries; the man who sees himself walking across the street, thinking there but for the grace of God go I, or in his envy: there but for God’s disdain of me I could have gone…. He has to see the light and the truth that can be seen even in our phony and artificial age.”

Saul Bellow recalled that his friend, who did graduate work in philosophy at New York University in the early 1940s, had abandoned that path when he discovered Herman Melville. After reading Moby Dick, logical positivism seemed too blinkered a take on the world. Rosenfeld turned into an intellectual equivalent of Bartleby the Scrivener, saying, “I would prefer not to,” over and over, as the years slipped past.

This was not a good way to live. But then what is? The question is not rhetorical but real -- the kind that is waiting at three o’clock in the morning.

Scott McLemee
Author's email: 

