Literature

Afterglow From the MLA

I felt the need to get away, even as the pile of student papers I had to grade slowly dwindled. With final grades submitted, I still felt the impulse. I resisted as well as I could, but something within nagged me.

I considered a spiritual retreat, one to recharge and rest after a busy, even frenzied, semester. I had worked at three campuses, two writing centers, one community center. I did freelance writing. I’m not a workaholic, just a teacher trying to make ends meet. These days, it’s getting harder.

Catholic, Buddhist, ecumenical -- the retreat path did not matter. But calling and surfing for such a place, I found it was too late. Everything was filled. There was only one possibility -- in the twisting hills of Arkansas. I would have to bring my own food and get transportation from a distant airport. I appreciated the offer but felt too tired. Maybe in spring...

Then I had another thought. A whim. Just ninety minutes away, if I could get a direct flight… Could I?

I did.

I joined the Modern Language Association after 30 years in academia and flew to Philadelphia for the 2009 conference, tantalized by conference titles I had only read about before and noticing more than a few that dealt with the ups and downs of academia that I not only know but that are etched on my heart. Student assistant, secretary, graduate assistant, writer/editor, teacher…

Although I was just beginning to recite poems when some of the long-term veterans joined the MLA, I’ve chalked up my flight miles in the classroom. If I had a sash across my chest like the ones Girl Scouts used to wear, I’d have badges for adjuncting at up to four institutions at a time, loving words, and being midwife, doula, mother to students in the classroom. A former boss called me a composition worker. Some people think people in my line of work are exploited. I call myself a professional muse.

Maybe going to a professional conference does not seem like a big deal. For some, it’s draining. For others, routine. For still others, a dreaded initiation or the key to a job.

I remember sitting behind my desk as a secretary in an English department in the early 1980s, hearing that people interviewed at MLA.

Over winter holidays? I thought. How strange.

“How did you like the meat market?” said a friend, hearing I had been there.

Actually, I didn’t even pack anything formal to wear. I went just to learn. Without expectation, I found myself transported back to a joy I have not felt since my undergraduate years.

“Yes, undergrad is a carefree time,” a colleague said upon patiently listening to my post-conference euphoria.

Actually, for me the undergraduate years were also full of care. But in tough times, it’s literature, art, music, drama that give me hope, words, perspective.

“You know, those conference titles are often obscure, even ridiculed,” said another friend.

Well, I loved the sessions. Translation and Kafka. Awesome Yiddish. When will I be near Yiddish scholars again? Why study literature? Packed. Langston Hughes. Well worth the trip. Hurston screening. Couldn’t squeeze in. And others…

I’m old enough to feel like a mother to some of the presenters. And I’ve been to other conferences; I have one foot in English, one in counseling, and one in journalism. How is this possible with two feet? I keep shifting my stance, my focus, my efforts. In a world thought to be increasingly interdisciplinary, perhaps I can create a new dance. MLA, for me, was an imaginative leap. I am glad I took it.

Books, stories, and poems have added meaning to my life since I was a little girl. I was imaginative, as kids are – maybe beyond imaginative into the quirky. I “became” Cinderella and Snow White, responding not to my own name, but to the name of the character of the week. I learned French in an innovative, public elementary school and my parents spoke German. Whitman and Golding were among my beacons in junior high, with words I couldn’t utter but could understand. I devoured “the classics” my much older sisters brought home from their demanding high school. A sonnet by Shakespeare and a poem by Millay provided solace through very dark times as a teenager.

My heart further opened to and through the humanities as an undergraduate English major, even with a foot in psychology and another in other interests. I have not changed that much.

The humanities gave me some range to explore, and I majored in English for several reasons. Philosophy had beckoned, but one day I asked a question in a philosophy class and was told “that’s a question for an English class.”

In English class a few months later, a question I asked about what I now know was the teacher’s formalist analysis of The Scarlet Letter yielded an even harsher response.

“Do you think the unexamined life is worth living?”

Teachers have bad days.

That teacher, like most of my mentors from school, is deceased now. By one of the strange quirks of fate, right before I began graduate school (in English), my path crossed his. “Of course I remember you,” he said. “You were the best student I ever had.”

I negotiated my way through the canon in graduate school in English, in a world before composition and rhetoric, devising my own intuitions about the teaching of writing while teaching writing and beginning a career as a writer and editor.

I considered comparative literature studies in graduate school but thought that English was more -- I almost can’t type it out -- practical.

Fate landed me in a hotel room in the Loews in Philadelphia, where many of the modern language sessions were held. Just riding the elevator was fun. People entered and exited, speaking many languages.

Across the street was the Marriott and the Pennsylvania Convention Center, where many English sessions were held. I jaywalked with abandon, with absolute certainty that here on this side or there on that side was where I needed to be.

This jaywalking is a metaphor for my life; as the daughter of immigrants who struggled with English, I sometimes struggle with words, too. Why else would I strive to become a writer?

In my home town, I don’t jaywalk. But what is travel to a professional conference if not an expansion of boundaries?

I befriended three women by chance, each with a Ph.D. and each following a different, intriguing, winding career path.

One had been a high school teacher for 15 years and had also taught on Indian reservations and in China.

Another, formerly on the tenure track, had been derailed but maintains an energetic schedule of writing and teaching.

A third, originally from China, turned out to be a presenter.

As is my wont, I asked questions of everyone I met, whether scrunched in a shuttle or riding an elevator. Mainly I asked, “Are you enjoying your sessions?” “Did you get what you came for?” The response to one of my questions, from a man in a uniform covered by an overcoat, was a gentle, “I work here.”

Enough of my questions. MLA for me was an immersion experience, a cross-cultural journey. The academic paper sessions I attended were mind-stretching. Translation was an echoing theme, and what could be more apropos as academe struggles to define and express itself in difficult economic times? The sessions on the state of affairs in academe reassured me that I am not alone. And the session on writing teachers who write reassured me that I am on a valid path.

I also learned, among other things, that some people perceive rifts in the MLA. Other languages over there, English over here. Full-time issues there, part-time here. Writing here, literature there.

“I don’t know if I’ll come back,” one new friend said. “Some of this feels elitist.”

If so, that is a shame. What more powerful bridge between human differences than the humanities?

I had the good fortune of encountering people, at random, who attended sessions I wanted to make but couldn’t. On two-year colleges. On analyzing “The Moose” by Elizabeth Bishop. On the standing of academic journals. Even my missteps seemed well-orchestrated.

I ate energy bars, instant oatmeal, salmon at a French restaurant, a side of mashed potatoes for a meal, a meal in Chinatown courtesy of a spontaneous Philadelphia friend. I am too shy for cash bars, so I drank cups of tea and coffee in my room.

I’ll be paying off the trip for a while. But it was worth it.

In the three decades that seem like three days that I have spent in academia, I have been a student assistant in a college of education, a secretary in an English department, a graduate assistant, a publications writer, a liaison with the news media, an adjunct lecturer in three departments at one school, and a teacher without walls (adjunct) at four other schools.

In this economy, I won’t be retiring or stopping learning any time soon.

When I told my teenage son I planned to go to the conference, I asked him if he knew what MLA is.

“Those are the people that make the rules I use when I have to write a paper.”

When I have taught documentation in the classroom, MLA or APA, depending on the course, I typically have pointed out that scholars in these groups are not strictly documentation experts, but explorers, researchers, lovers of learning.

Finally, I have decided to count myself among them, even if all I did was sign up.

One new friend was, to my surprise, a presenter. She approached me as I indulged a habit honed in a pre-ecological, pre-Internet era: seeking out fliers. She offered that we share hotel costs on the last day. Why not, I thought.

I had a room with two beds. I rushed across the street again to clear off my avalanche of paper. I had the joy of listening to part of her paper the night before and, by sheer chance, of attending her session. I got a stunning view of Philadelphia from the 33rd floor.

Returning to Cleveland was, of course, a descent. And the stacks of papers are piling up again… It’s just a few weeks later, and I’m still walking around in post-conference delirium.

Few believe me when I say there is hope for the humanities. There has to be. The more difficult the times in my life, the more I have needed books, art, music, drama. I have seen the value of humanities study for students of all ages, at colleges private and public, large and small. My memoir students, some in their eighth decade of life, still turn to the written and electronic word for solace, support, and inspiration.

What did I leave behind? My wide-tooth comb and fliers I could not stuff in my carry-on. That’s all right. It’s hat weather in Cleveland -- and there’s always the Internet.

Author/s: 
Maria Shine Stewart
Author's email: 
newsroom@insidehighered.com

Maria Shine Stewart teaches and writes in South Euclid, Ohio.

Gerrymandering the Canon

In a recent New York Review article on Byron, Harold Bloom makes the following passing remark: “In the two centuries since Byron died in Greece [...] only Shakespeare has been translated and read more, first on the Continent and then worldwide.” Bloom does not cite any statistics, and one cannot help but wonder: Really? More than Homer and Dante, or, among the moderns, more than Sartre and Thomas Mann? Of course, what Bloom really means is that Byron was translated and read more than any other English writer, and he may well be correct on that count. Yet this omission is telling, as it highlights an unfortunate tendency (recently diagnosed by David Damrosch) among certain English professors to equate literature in general with literature written in English. This disciplinary bias, less prejudice than habit, can distort their scholarship – the authors that they admire tend to be far more catholic in their reading. But this pattern also raises a larger academic question: Why do we still partition the literary canon according to nationalist traditions? Is this really the most intellectually satisfying and authentic approach to literary studies?

For an example of how disciplinary blinders can affect scholars as well-read as Bloom, we need only turn back to his article, where we find Byron described as “the eternal archetype of the celebrity, the Napoleon of the realms of rhyme... the still unique celebrity of the modern world.” What such hyperbole masks is the fact that the model for such literary celebrity is in reality to be located in another author, who unfortunately did not have the good sense to be born in England. Indeed, anyone familiar with the inordinate fame of Jean-Jacques Rousseau knows that he was the first genuine literary celebrity, lionized and sought out across Europe, much to his growing despair and paranoia (as a brilliant study by the historian Antoine Lilti details). Byron himself was smitten by Rousseau, touring the Lac Léman with his friend Shelley to visit the sites from Julie, ou la nouvelle Héloïse. Rousseau may not have provided his public with the same devilish scandals as the naughty Lord, but his Confessions, with their admission of a fondness for spankings and exhibitionism, were sultry enough.

Bloom is certainly no provincial, and his own published version of The Western Canon includes German, Spanish, French, and Italian works – although this canon, too, is heavily tilted toward English authors. But can this be avoided? No doubt French scholars would produce a version of the canon equally tilted toward the French, just as scholars from other nations would privilege their own authors. To an extent, this literary patriotism is normal and understandable: every culture values its heritage, and will expend more energy and resources promoting it.

From the viewpoint of literary history, however, such patriotism is also intellectually wrongheaded. To be sure, writers are often marked most strongly by their compatriots: one must read Dante to understand Boccaccio, Corneille to understand Racine, or, as Bloom would have us believe, Whitman to understand T. S. Eliot. But such a vertical reading of literature (which Bloom himself mapped out in The Anxiety of Influence) overlooks the equally – sometimes far more – important horizontal ties that connect authors across national borders. T. S. Eliot may have been “hopelessly evasive about Whitman while endlessly revising him in [his] own major poems,” yet by Eliot’s own admission, the French school of symbolist poetry had a far greater impact on his work. Some of Eliot’s first published poems, in fact, were written in French. Conversely, the French novelist Claude Simon may have endlessly revised Proust, but his own major novels – such as La route des Flandres and L’herbe – owe far more to William Faulkner. Such examples could be multiplied ad infinitum: they are, in fact, the stuff that literary history is made of.

To this criticism, English professors have a ready-made answer: Go study comparative literature! But they have only half a point. Comp lit programs are designed to give students a great deal of flexibility: their degrees may impose quotas for the number of courses taken in foreign language departments, but rarely, if ever, do comp lit programs build curricular requirements around literary history. Yet that is precisely the point: Students wishing to study English Romanticism ought to have more than Wikipedia-level knowledge about German Idealist philosophy and Romantic poetry; students interested in the 18th-century English novel should be familiar with the Spanish picaresque tradition; and so on and so forth. Comp lit alone cannot break down the walls of literary protectionism.

The fact that we even have comp lit departments reveals our ingrained belief that “comparing” literary works or traditions is merely optional. Despite Bloom’s own defense of a “Western canon,” such a thing no longer exists for most academics. This is not because the feminists, post-colonialists, or post-modernists managed to deconstruct it, but rather because our institutions for literary studies have gerrymandered the canon, department by department. Is it not shocking that students can major in English at many colleges without ever having read a single book written in a foreign language? Even in translation? (Consider, by contrast, that history majors, even those who wish to study only the American Revolution, are routinely required to take courses on Asian, African, and/or European history, in many different time periods, to boot.) Given that English is the natural home for literary-minded students who are not proficient in another language, it is depressing that they can graduate from college with the implicit assumption that literature is the prerogative of the English-speaking peoples, a habeas corpus of the arts.

But wait a minute: how dare I criticize English curriculums for not including foreign works, when the major granted by my own department, French, is not exactly brimming with German, Russian, or Arabic texts, either? To the extent that French (or any other foreign language) is a literature major, this point is well taken. But there are differences, too. First, it is far more likely that our students will have read and studied English literature at some point in high school and college. They will thus already have had some exposure, at least, to another national canon. Second, and more importantly, a French, Spanish, or Chinese major is more than a literature major: it is to no small degree a foreign language major, meaning that the students must master an entire other set of linguistic skills. Finally, language departments are increasingly headed toward area studies. German departments routinely offer classes on Marx, Nietzsche, and Freud, none of whom are technically literary authors. Foreign language departments are sometimes the only places in a university where once-important scholarly traditions can still be studied: Lévi-Strauss’s Tristes tropiques probably features on reading exam lists more often in French than in anthropology departments. A model for such an interdisciplinary department already exists in Classics.

I do not wish to suggest that English professors are to blame for the Anglicization of literature in American universities: they reside, after all, in English departments, and can hardly be expected to teach courses on Russian writers. The larger problem is institutional, as well as methodological. But it bears emphasizing that this problem does not only affect undergraduates, and can lead to serious provincialism in the realm of research, as well. An English doctoral student who works on the Enlightenment once openly confessed to me that she had not read a single French text from that period. No Montesquieu, no Voltaire, no Rousseau, no Diderot, rien. Sadly, this tendency does not seem restricted to graduate students, either.

Literary scholars are not blind to this problem: a decade ago, Franco Moretti challenged his colleagues to study “world literature” rather than local, national, or comparative literatures. He also outlined the obvious difficulty: “I work on West European narrative between 1790 and 1930, and already feel like a charlatan outside of Britain or France. World literature?” While the study of world literature presents an opportunity for innovative methodologies (some of which were surveyed in a recent issue of New Literary History), students already struggling to master a single national literary history will no doubt find such global ambitions overwhelming.

What, then, is to be done? Rearranging the academic order of knowledge can be a revolutionary undertaking, in which ideals get trampled in administrative terror. And prescribing a dose of world literature may ultimately be too strong a medicine for the malady that ails literary studies, particularly at the undergraduate level. In fact, a number of smaller measures might improve matters considerably. To begin with, literature professors could make a greater effort to incorporate works from other national literatures in their courses. Where the funds are available, professors from neighboring literature departments could team-teach such hybrid reading lists. Second, language and literature majors could also require that a number of courses be taken in two or three other literature departments. A model for this arrangement already exists at Stanford, where the English department recently launched an “English Literature and Foreign Language Literature” major, which includes “a coherent program of four courses in the foreign literature, read in the original.” To fulfill this last condition, of course, colleges would have to become more serious about their foreign-language requirements. Finally, literature students would be better served if colleges and universities offered a literature major, as is notably the case at Yale, UC San Diego, and UC Santa Cruz. Within this field of study, students could specialize in a particular period, genre, author, or even language, all the while taking into account the larger international or even global context.

Will such measures suffice to pull down the iron curtain dividing the literary past? Unless they manage to infiltrate the scholarly mindset of national-literature professors, probably not. Then again, as many of us know firsthand, teaching often does transform (or at least inform) our research interests. A case could of course be made for more radical measures, such as the fusion of English and foreign language departments into a single “Literature Department,” as exists at UC San Diego. But enacting this sort of bureaucratic coup carries a steep intellectual (not to mention political) price. It would be unfortunate, for instance, to inhibit foreign literature departments from developing their area-studies breadth, and from building bridges with philosophy, history, anthropology, sociology, religious studies, political science, and international relations. English departments, moreover, are developing in similar, centrifugal directions: in addition to teaching their own majors, English departments contribute more widely to the instruction of writing (including creative writing), and have their own ties with Linguistics and Communications departments. This existing segmentation of the university may appear messy, but has the benefit of preventing new walls from being erected, this time between neighboring disciplines.

Author/s: 
Dan Edelstein
Author's email: 
info@insidehighered.com

Dan Edelstein is assistant professor of French at Stanford University.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially of the limits of the brain's plasticity. The ability to absorb new impressions is not limitless.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people would otherwise pretend it did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of the 35-minute Warhol film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Doing More With Less

Every so often while visiting a university library, I will go through the periodical room and gather up an armful of journals, often fairly impulsively. This seems like a good way to get a quick glimpse of what is going on outside the rut of my normal preoccupations. It is, so to speak, the higher eavesdropping.

And so it was that some years ago I came across a publication called Weber Studies. Grabbing issues from the shelf, I figured, given the title, that it would be full of articles on Protestantism, bureaucracy, and social-science methodology. In fact, no -- though that assumption was probably a common one. The journal was named not after Max Weber but after its sponsoring institution, Weber State University, in Ogden, Utah. Studies was a general-interest publication of literature and the humanities with a focus on the culture and history of the region. A couple of years ago, it changed its name to Weber: The Contemporary West. (Which still sounds kind of Teutonic, somehow, but so it goes.)

Making the rounds last week, I caught up with the Fall 2009 issue and saw that it opened with a long editorial notice to readers. It announced that, for the time being, the journal would be doing more with less. (The phrase "doing more with less" was not actually used, but such was the upshot.) "Weber State University," the note explained, "was hit with a double-digit retrospective budget adjustment for the 2008-09 academic year and is projected to face similar downward corrections in the years to come." And so the journal would be appearing twice a year, instead of three times. Weber had to suspend the modest honorarium it paid to contributors, and so on. And yet (here was the surprising part) it would keep on publishing. You cannot take that for granted -- certainly not in this economy.

Just before getting in touch with Weber's editor, Michael Wutz, I visited its website, which showed no activity beyond the end of 2009. This was not as bad a sign as it seemed. Wutz, who is a professor of English, explained that he had been busy putting together the new issue, just back from the printer. He sent a copy. It is handsome, with a portfolio of color reproductions of paintings by contemporary Western artists, as well as the annual section of essays on international film (local angle: the Sundance festival is held in Utah), and much else besides.

Wutz is the author, most recently, of Enduring Words: Literary Narrative in a Changing Media Ecology, published last year by the University of Alabama Press. We followed up our phone conversation with an interview by e-mail. A transcript follows.

Q: Your journal, Weber, has avoided the two most probable effects of a budget cut -- either shifting to online-only publication, or just shutting it down altogether. Did either possibility come up?

A: The possibility of shutting the journal down altogether was not, fortunately, openly on the table, but it was a distinct possibility in the background, especially in light of the severe budget cuts Weber State University, along with the College of Arts & Humanities, was facing. Given that we are facing another budget cut for the coming fiscal year, and perhaps for the year after that, we may end up feeling some of those effects as well.

Fortunately, we have farsighted administrators -- from my department chair and the dean for arts and humanities to the provost -- who saw, and continue to see, value, aesthetic appeal, and perhaps promotional potential for the University in a tangible print version, even if that print version is published only twice a year instead of three times.

Q: While reading the editorial note in your fall issue, I assumed that the journal's survival could only mean that people with administrative clout regard it as creating some kind of value for the institution as a whole. At the same time, it is definitely not a "service" publication like, say, an alumni magazine. How do you understand its role vis-à-vis Weber State?

A: Even though we have a number of national and international subscribers, the home base of our readership is the Intermountain West. As part of the journal's original vision (that is, as de facto in-house publication), the new Weber is coming back to its roots by serving, when appropriate, as a vehicle for faculty to publish their work. The work -- typically an essay -- is of course subject to the same review criteria as any outside submission. But by publishing the work of WSU faculty, the journal closes a kind of feedback loop between faculty research and the immediate dissemination of that work among our core readership within and outside Utah.

Most generally, perhaps, Weber also does (genuinely, I feel) help promote the discourse within the arts, broadly conceived -- painting, literature, film, topics of relevance to the West, interviews. Our administrators appreciate the contribution the journal makes on that level. Given the emphasis most public universities seem to be mandated to place on the sciences by state legislatures, a journal such as ours can help re-validate the humanities and, in addition, foster an active, interdisciplinary dialogue between the sciences and the arts.

Q: You said that submissions from WSU faculty are "subject to the same review criteria as any outside submission." Just to clarify -- the journal is peer-reviewed?

A: Yes, the journal is peer-reviewed, in the traditional sense, as you can also see, in part, from the editorial review board on the inside front cover. Typically, our submissions are reviewed by two members of our board, though if a submission comes back with a strong "yes" or "no" from one of our reviewers (accompanied by an explanation on our evaluation sheet), we tend to let the other reviewer know so that they can conserve their energy for the other work to be judged.

Given that faculty across the country, not just from WSU, publish work in the journal that becomes part of their tenure/promotion file or is important to their (annual) "productivity record," in today's administrative parlance, the competitive review process ensures a level of professionalism so that Weber can legitimately be listed as a "peer-reviewed" journal.

Q: The journal can no longer pay an honorarium to contributors. You've also had to ask for people to hold off on making submissions for a while. No doubt you'd want to get back to the old way of doing things just as soon as possible. But do you have a gut sense that perhaps some corner has been turned -- that you're stuck with this situation for the indefinite future? (Short of a new federal stimulus package for rebuilding intellectual infrastructure....)

A: Much of that depends on the support we receive from the Utah Arts Council, which in turn is partly funded by the National Endowment for the Arts. For this coming fiscal year, for the first time in many years, they have chosen not to support (or have not been able to support) Weber with funds that used to be restricted to honoraria for our contributors. Should any monies from that source materialize again, and come with said restrictions, we'd be able to make token payments to our authors and artists once again.

That being said, we are doing more with less. We've had to let go of a salaried managing editor (three-quarter time) and have had to replace that position with a non-salaried person who works about 20 hours per week, currently at $12 per hour. That person is no less qualified than the one who held the formerly salaried position.

The dean of the college of arts and humanities, who is very supportive and understanding, had to cut my reassigned time from 9 hours per semester to 6 hours per semester as part of a comprehensive savings initiative in our college. This is on the assumption that one issue is the equivalent of one class (three issues a year = three classes) and that we are now down to two. In terms of actual time commitment, it doesn't quite work out that way. Which, in essence, means that if you combine my teaching load and work on the journal, I now work more hours than I previously did.

Q: You have written a book on how literary narrative has responded to changes in the media environment. Any thoughts on what role the general-interest journal of literature and the humanities can or should play in the second decade of the new millennium?

A: My, this is a tough one. Generally, my "approach" is to think of print or any other media in terms of a larger media ecology. Various cultural forces enable the development of new/other (typically post-print) media, while pushing older media into a new niche if they don't want to get exterminated altogether.

If general-interest journals of literature and the humanities in the second decade of the new millennium and beyond want to survive, they will in effect have to reinvent themselves -- or at least make themselves responsive to the cultural pressures that post-print media put on them.

Specifically, I'd hazard the observation that precisely because digitization seems to be becoming the new lingua franca of delivery, print media might be able to draw attention to their own material heft, to the feel one gets from holding a journal in one's hands. Both aesthetically pleasing (in terms of visual appeal/design) and materially specific (because of paper's haptic properties), literary print journals might get a new lease on life, perhaps paradoxically, because of the digital mediaverse surrounding them.

Maybe I am just naive and don't want to see the digital writing on the wall, but for the time being, I am guardedly hopeful that small literary and humanities magazines, just like the novel more generally, will continue to be viable (though not lucrative, of course) vehicles for enlightened public discourse.

For more information on Weber: The Contemporary West, including material from previous issues, see the journal's website.


Why 'Writing'?

What's in a name? that which we call a rose
By any other name would smell as sweet.

These lines from Romeo and Juliet are often quoted to indicate the triviality of naming. But anyone who has read or seen the play through to its end knows that the names Montague and Capulet indicate a complex web of family relationships and enmities that end up bringing about the tragic deaths of our protagonists.

Lore also has it that Shakespeare's lines were perhaps a coy slam against the Rose Theatre, a rival of his own Globe Theatre, and that with these lines he was poking fun at the stench caused by less-than-sanitary arrangements at the Rose.

I write now in response to the naming of a newly created department at my large state university called "the Department of Writing and Rhetoric." This new department is being split off from the English department and given the mandate to install a new Writing Across the Curriculum program, convert adjunct positions to "permanent" instructor positions, and establish a related B.A. degree.

While the acronym WAR may seem appropriate to some of my colleagues, many of them think we have more important things to worry about than a name right now. We have also been repeatedly told in the face of previous protests that referring to Composition as Writing is a trend nationwide. Nonetheless, I believe that this title is an indication of bad faith and an ill omen for the work of the new department and programs like it elsewhere.

Shortly after the announcement of this change, I attended a tenure party for a colleague in another department. Every single person I spoke with at this party assumed from the title of the new department that "all" writing would be taught there, including my field of Creative Writing. People repeatedly asked me what I thought about being in a new department, and I repeatedly corrected them as confusion spread over their faces. They couldn't understand how the Department of Writing and Rhetoric would not include the writing of fiction, poetry, and so on. I repeatedly had to say that “Writing” in this usage means Composition. They repeatedly asked me why, then, the department will be using the title Writing.

That's a very good question, and one that points to something disturbing, not just here, but in that oft-cited nationwide naming trend mentioned above. Referring to programs in Composition by the title "Writing" indicates that this field is the authority over all meaningful types of writing -- in all other fields. By implication, no other type of writing but what Composition Studies teaches is valid or important -- or even exists. Both of these claims are demonstrably false, although they are the silent assumptions that often underlie Composition's use of the term Writing to describe itself.

Perhaps even more disturbing is that using the name Department of Writing and Rhetoric indicates a willingness to write badly in order to empire-build. Good writing is always about clarity and insight, precision and accuracy. Therefore, this confusing name calls into question the very quality of the writing instruction that will be given in the new department. If the department cannot and will not name itself accurately, then what does that bode for the students to be educated there?

Don't get me wrong. I also differ from some of my colleagues in that I am happy about the creation of the new department. Composition is an upstart field that, like my own of Creative Writing, has often not gotten its due. Partly this is because it stems from a remedial function -- Composition became necessary when the sons and daughters of the working class began attending colleges and universities and were not adequately prepared in the finer points of belles lettres.

Naturally, because the background -- and the goals -- of these individuals differed from those of the upper classes that had established belles lettres, Composition began to explore and defend less artistic, more practical forms of writing. This evolution differs from that of such programs in mathematics, for instance, where remedial algebra still focuses on the same formulas as those used in advanced courses. In Composition Studies and Writing Across the Curriculum programs, there has been a focus on supplanting the literary scholarly essay as the gold standard of writing. In the past few decades, Composition as a field has worked hard to establish the legitimacy and importance of other forms of writing and their teaching. Much of this effort I admire.

I am also happy that Composition will be given resources long absent. Having taught Composition courses myself for several years, I understand the need for acknowledgment and support, even if the specifics of the plan at my university have not been widely shared or discussed and seem to me based on suspect methods. I wish the new department nothing but the best in its attempts to improve basic writing instruction for our students.

However, many in the field of Composition have also brought the resentment of old wounds and insults to bear by claiming that the field is foundational and expert in all types of writing. Advocates for the field have accomplished this by theorizing what they do and by selling it to those in other fields as the answer to literacy. Among other things, they have also tried to change the field's name to something less associated with its remedial roots and more grandiose in its scope. Yet it remains the case that Composition Studies does not represent a universal approach to literacy, critical thinking, or writing.

In my own field of Creative Writing, for instance, we have far different assumptions about what constitutes effective writing instruction. Admittedly, we have somewhat different purposes. But let me also point out that the rise of Composition Studies over the past 30 or 40 years does not seem to have led to a populace that writes better.

In fact, it has coincided with a time when literacy rates have dropped and complaints about the poor writing skills of college and university graduates (especially of large public universities) have continued to rise. Obviously many complex social factors contribute to this. It is also debatable whether universities have contributed to this state of affairs because the changing methods of teaching Composition are misguided or because there simply haven't been enough resources. I'm all for giving Composition the resources it needs, respecting its right to self-determination in its field, and letting us see what happens. I am all for the general population writing better, even if it is in an instrumental and limited form disconnected from the literary traditions that have fed most love of and respect for the written word in our culture.

Beyond the details of these various professional debates, my negative reaction to the new departmental name stems from the corruption of language that is so prevalent in our society today, where advertisers and politicians and many others lie through exaggeration, omission and indirection. The best analysis of this is perhaps Toni Morrison's 1993 Nobel Lecture in Literature. In it she talks about uses of language that are destructive, about language that obscures rather than clarifies, and how so often such language "tucks its fascist boots under crinolines of respectability and patriotism as it moves relentlessly toward the bottom line and the bottomed-out mind."

If we put the writerly education of our students into the hands of people who insist on rejecting the accurate term Composition for the grandiose and unclear one Writing, what will they learn? They will learn, I am afraid, that they can say whatever they want, even if it is sloppy, confusing, manipulative, or a knowing lie.

Misnaming this department also evokes the negative definition of the title's other half: Rhetoric. In academe we know that rhetoric can be "the study of effective use of language," but most of the world is more familiar with rhetoric defined as "the undue use of exaggeration and display; bombast." This latter definition seems apt when combined with Writing in this name.

I, for one, will never call it the Department of Writing and Rhetoric. I will call it what it actually is: the Department of Composition and Rhetoric. If its practitioners truly respected their own history, they would call it that, too. A "rose" sometimes can smell not so sweet, especially if it turns out not to be a flower at all.

Author/s: 
Lisa Roney
Author's email: 
doug.lederman@insidehighered.com

Lisa Roney is associate professor of English and coordinator for the undergraduate Creative Writing program at the University of Central Florida.

Prized Vocations

I know a professor who enjoys as much success as any of his colleagues would ever want: an endowed chair, numerous books from major publishers, and a position in the leadership of his professional organization…. This is the short list. But he once pointed out that something was missing from his CV. He had never won an award.

This came up a few years ago, not long after I’d won one award and been a finalist for another. My initial assessment was that he was pulling my leg. But there was something mildly forlorn in his manner, and this did not seem like irony. Though neither was it envy, exactly. My worldly status is pretty small beans; and heaven knows that no money was involved in my award -- unlike, say, receiving an endowed chair. (That goes on my tombstone: No Money Was Involved.)

And yet the element of longing was unmistakable. So much so that I have pondered it ever since -- not in regard to my friend’s personality, as such, but for what it implies about the role of prizes and awards in general. More than fifty years have passed since Michael Young coined the word “meritocracy” in a work of social satire. It was not meant as a term of praise, by any means. He worried that the rise of meritocracy would be destructive of social solidarity -- filling those at the bottom with despair, and those at the top with ever more perfect arrogance.

This was a good guess. The term has long since lost any critical force; the very notion of meritocracy now seems self-legitimating. But prescient as he was, Young did not anticipate the excess of desire that the system might generate -- and not only among individuals. The giving and getting of awards creates its own expansive dynamic. As awards proliferate, so do the committees required to nominate and judge them. (Upon receiving an award, one’s chances of being co-opted onto such a committee approach 100 percent.) This situation may be beyond satire’s power to illuminate, although the Nobel for Literature should certainly go to anyone who manages it.

Meanwhile, a recent issue of Theory, Culture & Society contains a paper called “The Sociology of Vocational Prizes: Recognition as Esteem” by Nathalie Heinich, research director in sociology at the National Center for Scientific Research in Paris. It draws on interviews with winners of French literary and scientific awards -- although the data so harvested appear in the paper almost as an afterthought.

An old joke has it that natural scientists discuss findings and social scientists discuss methodology. In this case, one might go a step further; the center of gravity is almost metaphysical. And appropriately enough, perhaps. Heinich’s argument is that understanding the social function of awards should go beyond more or less economic analogies -- i.e., the award increases one’s access to consumption goods, either directly or by enhancing one’s power -- and instead look to the dimension of “intangible” outcomes.

But this is not a matter of what Heinich calls “mere psychology.” Rather, the granting and receiving of awards is part of the intricate and interdependent processes of social recognition within democratic societies -- about which, see half a dozen or so sociologists and philosophers (Norbert Elias, Axel Honneth, Nancy Fraser, etc.) on the dialectics of respect and esteem.

The paper feels like the prolegomenon to something much longer: a book that would interpret how the drive for prestige operates in institutions where the spirit of collegiality must reign. Heinich is, in short, framing questions rather than giving answers. But what interests me about the paper, after reading it three or four times, are the passages where you get a whiff of her fieldwork.

Beginning in 1985, Heinich interviewed a dozen French authors who had received major literary awards, including the Nobel. In 2002, she conducted another 16 interviews, this time with “mostly French-speaking” scientists who had received the annual Jeantat Prize for research in medicine and biology.

She defines both literature and science as “vocational” endeavors -- borrowing from the old religious sense that a vocation is a calling: one that involves both demands and rewards distinct from those of the marketplace. (On this point, an American would tend to use the word “professional,” although the differences of implication would require opening a very much longer parenthesis than this to discuss.)

But the relative isolation involved in writing makes it a more purely “vocational” activity than is the work of scientists, which is conditioned by access to institutions and infrastructure. And this -- by Heinich’s account -- means that literary awards tend to have a much larger impact on recipients than do scientific awards.

“There is no formal recruitment procedure” for poets and novelists, she writes, “no regular permanent salary, no career marked out in advance, no official titles and ranks, no regular collaborators, and no work premises to go to every morning to meet with one’s colleagues. Given such a weak socialization of the activity and the uncertainty of its value, a big literary prize can be a great event in the life of a writer. For a scientist, however, winning a prize is only one element among many within the highly structured stages of professional recognition … [which include] laboratories, procedures of institutional recruitment, the system of varied and peer-reviewed publications, collective work, the material registration of proceedings, the regular handling of considerable financial resources, etc.”

This study in contrasts is not beyond all dispute. Writing is a solitary activity, but the literary life also has its own politics and economics, even among the poets. (Especially among the poets, is my impression.) Interviews with playwrights might have generated very different data about the relationship between vocation and socialization.

And Heinich seems to treat literary prizes as falling outside the normal routine of a writer's life -- while the sheer proliferation of awards now makes them a routine part of one's daily awareness. Announcements of award winners come by e-mail at a steady clip. Indeed, one arrived as I was revising this.

So there is plenty more work to be done on the sociology of literary awards. But let me go on to cite an interesting observation from Heinich’s interviews with 16 Jeantat Prize-winning scientists:

“Only three of them, including two non-native speakers of French, have hung it on their office wall. The rest have stored it ‘somewhere,’ sometimes ‘in a nice place’ (but not on the wall) in their apartment, sometimes only to be put away by their spouse, and sometimes to be later packed away in a drawer or box, where nearly all of these prize winners would be hard put to find it again. ‘Don’t ask me where it is!’ begs one of the awardees, while another confesses, ‘I’ve got a lot of plaques; they’re collecting dust at my place. And I think the Jeantat Prize must be there, too, collecting dust.’ ”

The sociologist notes that “this openly asserted discretion on the part of the interviewees concerning the display of prizes is clearly a pronounced cultural trait that distinguishes them from prize winners from the English-speaking world, who seem to have no qualms about proudly displaying their distinctions.”

Asked to account for this reluctance to put the award up for all to see, one of the Swiss interview subjects responded that it might be a lingering effect of Calvinism. Either an awful lot of French biologists are of Huguenot extraction (someone should look into this) or the Puritans had less effect on American culture than is commonly supposed.

Of course, another explanation is possible, such as Heinich’s hypothesis. Anglophone cultures are, she writes, “often marked by the competitive spirit.” In them, “victory consecrates the good player but does not, however, signify an agonistic wish to eliminate the adversary.” By contrast, there is “the value of cooperation in Latinate cultures, where formal equality prevails and any claim to excellence appears as a moral shortcoming.” Hence “victory must not be asserted by the winner, only designated, more or less clearly, by others.… On the one hand, then, a performance imperative reigns, and on the other hand, a modesty imperative.”

Perhaps -- though as a worldly colleague points out, Sarkozy's effort to turn French educational and research institutions into so many lean, mean, reputation-generating machines may yet tip that fine balance.

And on this side of the water, all the awards anyone may ever find wall space to hang will never quite silence the feeling that, after all, you'd best keep nose to the grindstone. "For the night cometh, when no man can work," as we recovering Calvinists sometimes say.


At the Rendezvous of Victory

One of the turning points in my life came in 1988, upon discovery of the writings of C.L.R. James. The word “discovery” applies for a couple of reasons. Much of his work was difficult to find, for one thing. But more than that, it felt like exploring a new continent.

James was born in Trinidad in 1901, and he died in England in 1989. (I had barely worked up the nerve to consider writing him a letter.) He had started out as a man of letters, publishing short stories and a novel about life among the poorest West Indians. He went on to write what still stands as the definitive history of the Haitian slave revolt, The Black Jacobins (1938). His play based on research for that book starred Paul Robeson as Toussaint Louverture. In 1939, he went to Mexico to discuss politics with Leon Trotsky. A few years later -- and in part because of certain disagreements he'd had with Trotsky -- James and his associates in the United States brought out the first English translation of Karl Marx’s Economic and Philosophical Manuscripts of 1844. (By the early 1960s, there would be a sort of cottage industry in commentary on these texts, but James planted his flag in 1947.)

He was close friends with Richard Wright and spoke at Martin Luther King, Jr.’s church. At one point, the United States government imprisoned James on Ellis Island as a dangerous subversive. While so detained, he drafted a book about Herman Melville as prophet of 20th century totalitarianism -- with the clear implication that the U.S. was not immune to it.

Settled in Britain, he wrote a book on the history and meaning of cricket called Beyond a Boundary (1963). By all accounts it is one of the classics of sports writing. Being both strenuously unathletic and an American, I was prepared to take this on faith. But having read some of it out of curiosity, I found the book fascinating, even if the game itself remained incomprehensible.

This is, of course, an extremely abbreviated survey of his life and work. The man was a multitude. A few years ago, I tried to present a more comprehensive sketch in this short magazine article, and edited a selection of his hard-to-find writings for the University Press of Mississippi.

In the meantime, it has been good to see his name becoming much more widely known than it was at the time of his death more than two decades ago. This is particularly true among young people. They take for granted that a literary or political figure can be, as James was, transnational in the strongest sense -- thinking and writing and acting "beyond the boundary" of any given national context. He lived and worked in the 20th century, of course, but James is among the authors the 21st century will make its own.

So it is appalling to learn that the C.L.R. James Library in Hackney (a borough of London) is going to be renamed the Dalston Library and Archives, after the neighborhood in which it is located. James was there when the library was christened in his honor in 1985. The authorities insist that, in spite of the proposed change, they will continue to honor James. But this seems half-hearted and unsatisfying. There is a petition against the name change, which I hope readers of this column will sign and help to circulate.

Some have denounced the name change as an insult, not just to James's memory, but to the community in which the library is located, since Hackney has a large black population. I don't know enough to judge whether any offense was intended. But the renaming has a significance going well beyond local politics in North London.

C.L.R. James was a revolutionary; that he ended up imprisoned for a while seems, all in all, par for the course. But he was also very much the product of the cultural tradition he liked to call Western Civilization. He used this expression without evident sarcasm -- a remarkable thing, given that he was a tireless anti-imperialist. Given his studies in the history of Africa and the Caribbean, he might well have responded as Gandhi did when asked what he thought of Western Civilization: "I think it would be a good idea."

As a child, James reread Thackeray's satirical novel Vanity Fair until he had it almost memorized; this was, perhaps, his introduction to social criticism. He traced his ideas about politics back to ancient Greece. James treated the funeral oration of Pericles as a key to understanding Lenin’s State and Revolution. And there is a film clip that shows him speaking to an audience of British students on Shakespeare -- saying that he wrote "some of the finest plays I know about the impossibility of being a king.” As with James's interpretation of Captain Ahab as a prototype of Stalin, this is a case of criticism as transformative reading. It’s eccentric, but it sticks with you.

Harold Bloom might not approve of what James did with the canon. And Allan Bloom would have been horrified, no doubt about it. But it helps explain some of James's discomfort about the emergence of African-American studies as an academic discipline. He taught the subject for some time as a professor at Federal City College, now called the University of the District of Columbia -- but not without misgivings.

“For myself,” he said in a lecture in 1969, “I do not believe that there is any such thing as black studies. There are studies in which black people and black history, so long neglected, can now get some of the attention they deserve. ... I do not know, as a Marxist, black studies as such. I only know the struggle of people against tyranny and oppression in a certain political setting, and, particularly, during the past two hundred years. It’s impossible for me to separate black studies from white studies in any theoretical point of view.”

James’s argument here is perhaps too subtle for the Internet to propagate. (I type his words with mild dread at the likely consequences.) But the implications are important -- and they apply with particular force to the circumstance at hand, the move to rename the C.L.R. James Library in London.

People of Afro-Caribbean descent in England have every right to want James to be honored. But no less outspoken, were he still alive, would be Martin Glaberman -- a white factory worker in Detroit who later became a professor of social science at Wayne State University. (I think of him now because it was Marty who was keeping many of James's books in print when I first became interested in them.) James was the nexus between activists and intellectuals in Europe, Africa, and the Americas, and his cosmopolitanism included a tireless effort to connect cultural tradition to modern politics. To quote from the translation he made of a poem by Aimé Césaire: “No race holds the monopoly of beauty, of intelligence, of strength, and there is a place for all at the rendezvous of victory.”

Having C.L.R. James’s name on the library is an honor -- to the library. To remove it is an act of vandalism. Please sign the petition.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

As Others See Us

A genome biologist, Gregory Petsko, has gone to bat for the humanities, in an open letter to the State University of New York at Albany president who recently (and underhandedly) announced significant cuts. (For those who haven’t been paying attention: the departments of theater, Italian, Russian, classics, and French at SUNY-Albany are all going to be eliminated).

If you are in academia, and Petsko’s missive (which appeared on this site Monday) hasn’t appeared on your Facebook wall, it will soon. And here’s the passage that everyone seizes on, evidence that Petsko understands us and has our back (that is, we in the humanities): "The real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained."

He's right. And if scientists want to speak up for the humanities, I’m all for it. But Petsko understands us differently than we understand ourselves. Why fund the humanities, even if they don’t bring in grant money or produce patents? Petsko points out that "universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment."

How many of us willingly embrace that interpretation of what we do? "My interest is not merely antiquarian...." is how we frame the justification for our cutting-edge research. Even as we express our dismay when crucial texts go out of print, any sacred flame that we were tending was blown out when the canon wars were fought to a draw. Why should we resurrect it? Because, says Petsko, "what seems to be archaic today can become vital in the future." His examples are virology and Middle Eastern studies. Mine is 18th-century literature — and with all the imaginative vigor at my disposal, I have trouble discerning the variation on the AIDS scare or 9/11 that would revive interest in my field. That’s OK, though: Petsko has other reasons why the humanities matter:

"Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts... If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future."

Well, that would be great. I have no confidence, though, that we in the humanities are positioned to take advantage of this dawning world, even if our departments escape SUNY-style cost-cutting. How many of us can meaningfully apply what we do to "the question of just what it means to be human" without cringing, or adopting an ironic pose, or immediately distancing ourselves from that very question? How many of us see our real purpose as teaching students to draw the kinds of connections between literature and life that Petsko uses to such clever effect in his diatribe?

Petsko is not necessarily right in his perception of what the humanities are good for, nor are professionals in the humanities necessarily wrong to pursue another vision of what our fields are about. But there is a profound disconnect between how we see ourselves (and how our work is valued and remunerated in the university and how we organize our professional lives to respond to those expectations) and how others see us. If we're going to take comfort in the affirmations of Petsko and those outside of the humanities whom he speaks for, perhaps we need to take seriously how he understands what we do. Perhaps the future is asking something of us that we are not providing — or perhaps we need to do a better job of explaining why anyone other than us should care about what we do.

Author: Kirstin Wilcox

Kirstin Wilcox is senior lecturer in English at the University of Illinois at Urbana-Champaign.

Let Us Now Praise KJV

Everyone must know the famous statement attributed to Texas Governor “Ma” Ferguson (1875-1961) during a debate over bilingual education in the 1920s: “If English was good enough for Jesus Christ, it’s good enough for Texas schoolchildren.”

Alas, there is no solid evidence that she actually said it. Variations on the formula go back at least to the 1880s. Both the sentiment and the urge to lampoon it were probably around well before that. The remark is, in any case, a tribute to the aura of authority surrounding the King James Version of the Bible. This year marks the 400th anniversary of its publication. Some people regard the translation as almost divinely inspired; and I can see the point, at least at the level of style. (A few fundamentalists do reject it, objecting to the lifestyle of King James, which was sodomitical.)

Now, it has been some while since my shadow darkened a church door. I regard the existence of the Almighty with curious skepticism, and suspect He would return the favor. But when it is necessary to consult the Bible, there is simply no question of whether or not to use the KJV. It is the only one with any flavor; the rest are as appetizing as a sawdust sandwich.

Belief is not a prerequisite for celebrating the KJV. The critic and essayist Dwight Macdonald put it best: “The King James Bible came at the end of the Elizabethan age, between Shakespeare and Milton, when Englishmen were using words more passionately, richly, vigorously, wittily, and sublimely than ever before or since. Although none of the divines and scholars who made it were literary men, their language was touched with genius -- the genius of a period when style was the common property of educated men rather than an individual achievement.”

The quadricentennial has inspired a flood of monographs on the history and literary intertextuality of the Authorized Version, as the translation is also known. It would be steady work just to keep up with these publications. “Of making many books,” saith the Preacher, “there is no end; and much study is a weariness of the flesh.” But one recent volume, David Crystal’s Begat: The King James Bible and the English Language (Oxford University Press), is both scholarly and diverting -- something the reader can dip into, when and where the mood strikes. On this anniversary it reminds us just how ubiquitous the KJV's influence is.

The dust jacket describes David Crystal as “the world’s greatest authority on the English language.” I pass this statement along without necessarily endorsing it. If someone else feels he has a claim to the heavyweight title, take it up with Oxford UP. Crystal has certainly recognized and assembled an enormous number of examples of how turns of phrase found in the KJV still echo in literature, politics, journalism, popular culture, and everyday speech. Only after finishing this column did it occur to me that Crystal also compiled an interesting volume on how text-messaging affects language, which I wrote about here. The man is a consummate word nerd, by any standard, and his books merit a place on the nightstand of anyone with that disposition.

It is sometimes said that the Authorized Version contains thousands of expressions that have passed into common usage. By Crystal’s reckoning, this is pushing it. He identifies 257 idiomatic English expressions that can be traced to the KJV. That’s plenty: “No other single source,” he writes, “has provided the language with so many idiomatic expressions. Shakespeare is the nearest, but the number of idioms we can confidently attribute to him (such as to the manner born) is under a hundred.”

The expressions he catalogs are words or phrases that have come to circulate without necessarily carrying a religious connotation. In Genesis, for example, we read: “Now Israel loved Joseph more than all his children, because he was the son of his old age; and he made him a coat of many colors.” The latter phrase echoes in a song by Dolly Parton, various fashion-magazine articles (“When looking for outerwear this cold-weather season, think coats of many colors”), and a joking reference to guys in a carnival parade (“float of many bubbas”).

A line from Isaiah, later alluded to by St. Paul, reads: “Let us eat and drink; for to morrow we shall die.” According to Luke’s gospel, Jesus tells a parable about a rich man whose attitude is expressed as “take thine ease, eat, drink, and be merry.” Somewhere along the way, the expressions fused into a common saying which now inspires headlines such as “Eat, drink, and be merry, for tomorrow we devalue the pound.” It is also used by people who are going on a diet, though not just yet.

The idiom "fly in the ointment" -- meaning a problem or distracting irritation -- is both very common and somewhat peculiar. Its source is a passage in Ecclesiastes: “Dead flies cause the ointment of the apothecary to send forth a stinking savour; so doth a little folly him that is in reputation for wisdom and honour.” Cut loose from the original context, the image loses the quality of moral warning it had in the original proverb.

Crystal notes that many turns of phrase appearing in the KJV were taken from earlier English translations of the Bible, including “Let there be light.” The Douai-Rheims version (a Roman Catholic translation coeval with the one James commissioned for the Church of England) renders this as “Be light made.” But, Crystal writes, “that never stood a chance of competing in the popular mind with ‘Let there be light,’ whose Beethovenesque ‘te-te-te-tum’ stress pattern reflected more naturally the language’s rhythmical norms.” By contrast, one comedian imagined how Genesis 1:3 would be released by the White House: “The Supreme Being mandated the illumination of the Universe and this directive was enforced forthwith.”

The last time I gave much thought to the KJV's force-field was while reading Let Us Now Praise Famous Men, the book James Agee wrote to accompany photographs of sharecroppers taken by Walker Evans during the Depression. The cadences of his prose and the quality of moral anguish (clearly Agee felt that making art out of other people’s misery was a dubious undertaking, perhaps a sin) revealed the hold that the Bible had on him as a writer. So did his book’s title, drawn from Ecclesiasticus, which the King James translators included in the Apocrypha now often left out of that edition: “Let us now praise famous men, and our fathers that begot us.”

Begat charts another sort of cultural power the translation has radiated over the past four centuries. Threads of it have become woven into everyday life, in conversation and countless utterly secular usages. Some of this is a matter of allusion: the long shadow of remembered texts. But it is also an effect of the literary qualities of the translation -- in particular, its phonetic properties, as Crystal spells out: "especially iambic rhythms (from strength to strength), alliteration (many mansions), assonance (from the cradle to the grave), euphony (still small voice), [and] monosyllabicity (you know not what you do)."

There are passages in the King James Version that have become touchstones of high eloquence ("for ye are like unto whited sepulchres, which indeed appear beautiful outward, but within are full of dead men's bones, and of all uncleanness"). But it's in the small points of phrasing that, as Dwight Macdonald said, the translators were touched with genius, if not by some higher power.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Humanities' Constituencies

WASHINGTON -- C.P. Snow’s depiction of a “gulf of mutual incomprehension” separating scientists from humanists may date to 1959, but it’s still relevant -- and cited -- in discussions of the humanities in 2009. Panelists speaking Monday on “The Public Good: The Humanities in a Civil Society” cited Snow in describing a need to better bridge that gulf -- with the consequences of failing to do so exacting a real and human price, argued Patty Stonesifer, chair of the Board of Regents for the Smithsonian Institution and senior adviser to the trustees of the Bill and Melinda Gates Foundation.
