Hypertext 101

It's not always easy for professors to embrace technology. We find ourselves questioning everything from whether the transition to cyber-learning is really worth it to how and when we will "master" working with computers. In a tenured profession, some think it's better to stay in our safe, traditional worlds of literacy. Nevertheless, most of us realize that we cannot avoid our new century and the new pedagogical challenges created by its technological advances.

Having made that decision, we face more questions. If we change our courses, do we risk lowering the quality of our teaching because we are learning how to teach with computers at the same time that we are learning how to evaluate their effects? And even as we multi-task, writing e-mails to students while we surf the Web for the perfect text to teach tomorrow, can we ever keep up?

As we begin to realize the benefits and drawbacks of computerized writing, the road of new learning challenges can seem endless. As academics, we don't often move as quickly as technology, and yet we also know that analyzing and reflecting on innovation is our ethical responsibility. Are professors today doing enough to use, improve, reflect on, and critique our use of computers as tools to teach writing and support learning across the curriculum?

Despite the fears and uncertainties conveyed by these questions, it is beyond doubt that our students' learning and literacies have changed because of the use of computers. We must understand and adapt to computers and hypertext, and ultimately learn as much as possible about our disciplines' experiences in cyberspace, because our students demand it of us and because language is always changing. Essentially, we must move to the point where we focus on ways to fuse academic discourse with our students' "netspeak."

Since the 1980s, writing teachers have increasingly focused on the need for their teaching to reflect the ways technology and traditional literacies are converging. In her 1998 keynote address to the Conference on College Composition and Communication, Cynthia Selfe, an early adopter and leading authority on teaching writing with computers, remarked that we "have, as a culture, watched the twin strands of technology and literacy become woven into the fabric of our lives."

All college teachers, and particularly writing teachers, must now learn to avoid overly narrow, official, 20th-century versions of literacy practices or skills if we are going to reach our students effectively as readers, writers, and thinkers. Whether we teach mathematics, biology, or literature, we all know that literacy skills are really the responsibility of all educators. It is no longer acceptable for professors to claim ignorance of computers and the Internet while presenting themselves as literacy teachers in the 21st century.

Innovative thinkers like Selfe not only get at why we need to teach with technology; they also make it clear that we need to offer concrete ways to use hypertext. Even "Web in a can" programs like Blackboard, Blogger and WebCT can be used effectively to get students and teachers reading, writing and thinking critically about all the literacies that are used in any college classroom. Through a blend of theory and practice, we want to encourage readers of Inside Higher Ed to realize that just as our profession's news has begun to move from paper to screen, we professors must make the same move.

Computer technology has swiftly become our key writing tool, but it's too easy to imagine that everyone "gets it." Just as we take writing samples to learn about the literacy levels of our students when we teach composition, we need to determine what computing skills our students bring to our classes, because we need to teach students how to fuse traditional and online writing skills. For example, instructors of first-year college writing typically work with students to teach them how to do academic research, and it has become increasingly clear that it no longer makes sense to shun the Internet for the "safe" confines of the library. However, research on the Net means much more than typing a few words into Google.

A more sophisticated approach to teaching students how to do Internet research involves showing them some of the ways online searches use Boolean logic, and this can be accomplished simply by visiting the Google Guide.

This self-teaching Google tutorial will sharpen awareness of how the search engine works, and it will also help students in their library research. In addition, by using checklists and guides, we can help students critically evaluate sources on the Internet -- not just accept what is written as an alternate form of "the gospel." A good example can be seen in the checklists created by Jan Alexander and Marsha Tate, which can easily be used with any research-based college assignment. These checklists ask students to classify and validate Web sites, and they help students think more carefully about the quality of the information those sites present.

Of course it isn't only students who have to think carefully and critically about computers, the way they convey information, and the way that they are perceived as learning tools. Teachers have to do the same thing because computers have changed our writing and learning worlds, and as educators we can never take these changes for granted. Andrea Beaudin, a professor at Southern Connecticut State University who teaches in a wireless, laptop-equipped classroom, is constantly amused at the ways her students perceive the technology in the classroom. Recently, she recalled "being surprised during a writing lab when I asked students to take out paper to start jotting down ideas, and a student said, 'Now it's a real English class.'"

Andrea’s story reminds us that there will be times and places for other types of literate activities in a computerized classroom. Just as Andrea asked her students to use the “old” technology of paper and pen to do the work of learning because she practices hybrid notions of literacy, we will continue to work at what Cindy Selfe calls “multilayered literacy” -- a literate practice where people “function literately within computer-supported communication environments” by layering “conventions of the page and conventions of the screen.”

However, conventions of page and screen certainly converge more smoothly in theory than they ever do in practice. Something as simple as deciding to create a Web page, choosing Web-weaving software, and learning it can be a huge step for most teachers. Both of us have worked through steep learning curves with new Web technologies. Four years ago, Chris moved from using raw HTML code to working with Adobe GoLive and Photoshop. At times, he wanted to shot-put his monitor through any open window. However, the end result was a personal Web page that looks better, contains more useful information for students, and is much easier to update. Right now Will is working through learning Dreamweaver, and he has already started to see new possibilities for his page.

Our point here is that even techie teachers get technological blues. However, once we begin to figure a few things out, then interesting and good things begin to happen. We learn a new skill, our students get better Web resources, and both teacher and students have yet another new technology to think through practically and critically.

Professors and students both need to think critically about technology. As a key part of that critical thinking, teachers need to focus on pedagogy and how it is affected and changed by computers. Maybe some things haven't changed -- traditional, academic literacy has always converged with new ways to use language -- though it's fair to say that computers certainly seem to speed up an exciting convergence of language uses. We educators are working at an exciting point in literacy development, and we can be more mindful of why and how we use our computers. As the traditional classroom adds cyberspace, we must work closely with students and teachers to ensure that we enter new learning spaces with critical awareness and pedagogical wisdom.

Author/s: 
Will Hochman and Chris Dean
Author's email: 
info@insidehighered.com

Will Hochman is an associate professor of English and Christopher Dean is an assistant professor of English at Southern Connecticut State University.

The Writer's Writer

Ask almost any American writer today for a list of his or her literary idols, and Frank Conroy’s name usually rises near the top.

The author of one of the best books of our age, Stop-Time, published in 1967, as well as the director of the greatest incubator of literary talent ever assembled, the Iowa Writers’ Workshop, Conroy was as close to legend as any living writer gets.

Not to mention a Grammy winner—for best liner notes.

Despite a rough beginning, he made the most of a life that ended last week, when he died at age 69 of colon cancer.

Stop-Time slays everyone who reads it.

The poignant, tough and lean prose is every bit as great as J.D. Salinger’s Catcher in the Rye or Jack Kerouac’s On the Road. The literary establishment, from Norman Mailer to William Styron, fell before Frank’s wobbly 31-year-old knees when the effervescent memoir was published. Every shimmering word in Stop-Time seemed to detonate as Frank, from a teenager’s perspective, detailed the pain and legacy of an abusive, manic-depressive father and an absentee mother.

The book was the best kind of fiction because it was numbingly true.

It’s not "Genesis," but to many writers, the opening paragraphs of Stop-Time are the bible of literary beginnings:
 
My father stopped living with us when I was three or four. Most of his adult life was spent as a patient in various expensive rest homes for dipsomaniacs and victims of nervous collapse. …

I try to think of him as sane, yet it must be admitted he did some odd things. Forced to attend a rest-home dance for its therapeutic value, he combed his hair with urine and otherwise played it out like the Southern gentleman he was. He had a tendency to take off his trousers and throw them out the window. (I harbor some secret admiration for this.) At a moment’s notice he could blow a thousand dollars at Abercrombie and Fitch and disappear into the Northwest to become an outdoorsman. He spent an anxious few weeks convinced that I was fated to become a homosexual. I was six months old. And I remember visiting him at one of the rest homes when I was eight. We walked across a sloping lawn and he told me a story, which even then I recognized as a lie, about a man who sat down on the open blade of a penknife embedded in a park bench. (Why, for God’s sake would he tell a story like that to his eight-year-old son?)

Absent any sentimentality, Frank had created an instant classic, ultimately changing how we think of memoir and American literature, as well as how we perceive the vulnerability of children and the passage each of us goes through to become an adult.

Premature adoration and fame can turn even the most humble of men and women into fools, but Frank seemed to manage. He used his writing to chronicle his personal struggles, publishing perfect-pitch short stories and novels, including Midair, Body and Soul and Dogs Bark but the Caravan Rolls On. His precision with language earned him the respect of legions of journalists, including David Halberstam and Russell Baker.

Unable to corral his prodigious creativity, Frank blossomed as a jazz pianist. He became director of the literature program at the National Endowment for the Arts in 1982. He arrived as director at the Iowa Writers' Workshop in 1987 and quickly developed a reputation as a no-nonsense teacher who lived and breathed writing.

Admiring From Afar

Frank was one of the most unpretentious writers I’ve ever known.

I came to the University of Iowa as an eager journalist wholly unfamiliar with the trappings of academic life. A great perk of my job as a journalism professor was living in the shadow of the Writers’ Workshop, known locally as "The Workshop."

Like many others in the business of putting words on paper for a living, I revered Frank from a distance.

I used to see him around town: nose in a book at Prairie Lights, the wonderful bookstore on Dubuque Street; hunched over a newspaper, his lanky legs and arms taking over a booth in the Chesapeake Bagel Company down the block; holding forth with Guinness in hand at The Mill on Burlington Street.

Frank and I shared at least one thing: The University of Iowa had hired us as full-time faculty members.

This was much less of an accomplishment for Frank than it was for me, but few universities, then or now, would consider hiring such undereducated writers. As far as I know, outside of an abstract painter in the Art School, Frank and I were the only full-time faculty members at the university with just bachelor's degrees.

Frank distrusted most academics, a healthy instinct for any writer. Many are long-winded and imprecise with language (a cardinal sin for Frank); they study memorable writing but seldom create writing that’s memorable.

When I got here in 1993, the Writers' Workshop was housed in the same dreary brick-and-concrete building as the English department. There was no love lost on either side when Frank was able to move the Workshop to a lovely renovated 19th-century home, high on a bluff overlooking the Iowa River.

He surrounded himself with wonderful writers who also were wonderful teachers of writing, including Marilynne Robinson (who just won the Pulitzer Prize in fiction), Jim McPherson (who won the Pulitzer in 1978) and Jorie Graham (who won for poetry in 1996). Flannery O'Connor, Wallace Stegner, W.P. Kinsella, John Irving, Raymond Carver, T.C. Boyle and Jane Smiley cut their teeth as young writers at the Workshop.

Each year, Frank enrolled students who would go on to change the way we look at the written word. The Workshop is probably harder to get into than Harvard Law School: 800 applicants vie for 25 slots. Since its inception in 1936, 26 Pulitzer Prizes have been awarded to former Workshop students.

For all his facility with words, Frank was an anachronism, a technophobic dinosaur.

He didn't do e-mail. He surrounded himself with felt-tipped pens, yellow pads and clipboards. He wrote lying in his bed, his back propped against pillows. Whenever he finished a draft, his wife (and best friend), Maggie, would type his longhand into a computer. Frank would then wildly mark up the printout and revise at a computer.

While a tough teacher, he also was a generous one.

The coveted blurb

When I wrote a nonfiction book in 2000, like all authors, I slogged through the merciless business of trawling for blurbers. Blurbs are the pithy endorsements on the backs of book jackets that publishers hope will persuade otherwise clueless browsers to plunk down cash or credit card. Frank’s policy was not to blurb. Period. I think he probably felt that if he started blurbing, he’d surely never have a free moment for anything else.

At that time, I had not yet met Frank. Personal idols, particularly of the literary variety, are usually best left undisturbed, and I was satisfied to admire Frank and his work from a distance. But someone had handed him an advance copy of my book. Frank packed away the manuscript in his suitcase and took it to his summer house in Nantucket. "Don’t expect anything," I was told.

I blocked out what this great writer and teacher of writing could possibly say about my prose -- until word got back to me that Frank loved the book and was willing to say so. As a blurb on the back cover, Frank's endorsement probably didn't carry much clout with the ordinary buyer at Barnes & Noble, but to me it meant the world.

We met finally at a reading shortly after the book came out. Frank had been playing piano for a local radio program that night and, just as the reading was winding down, this stranger/mentor arrived. He made a beeline for the podium and gave me a bear hug of congratulations.

Writers aren't like that. They are morose, moody, competitive gossips at heart who look askance instead of straight ahead.

After that evening, Frank and I often ran into each other in this literary town among the cornfields. We talked about writing and politics, especially about our fears that our teenage sons might eventually get pulled into the widening war in Iraq.

Raising some eyebrows

The last time I saw Frank was right after he had caused some eyebrows to arch by accepting the National Humanities Medal from President Bush on behalf of the Workshop. It was the first time a university program had ever received such an accolade.

Frank was at Prairie Lights, the bookstore, and as we were both flipping dust covers, checking out too-serious visages of up-and-coming authors, I asked him about his experience at the White House.

And Frank, always the writer, always working, always trying to make sense of the world, said he enjoyed meeting Bush, despite their profound differences. Bush, he said, was caught up in the gears of some grinding machinery that couldn’t be shut down. The president might be at the switch, but he wasn’t in control. Bush, Frank said, really was a likable fellow but had become a victim, a hapless innocent.

And then I saw it once again, Frank turning generous, even magnanimous. But I could also see he was working. There was a magnificent story brewing here, and Frank was mapping out its plot.

Author/s: 
Stephen G. Bloom
Author's email: 
stephen-g-bloom@uiowa.edu

Stephen G. Bloom, the author of "Postville: A Clash of Cultures in Heartland America," teaches narrative journalism at the University of Iowa.

Read This!

It isn't a prize or an award, exactly. But next month, the Litblog Co-op -- a consortium of 20 literary bloggers -- will announce the first novel it has selected for its quarterly "Read This" campaign. The participants will urge their audiences to buy the book, and will open discussions of it at their respective Web sites.

My impression from a conversation with Mark Sarvas in late December, when he was first rounding up collaborators on the project, is that the whole enterprise is a kind of laboratory experiment in literary sociology. Can a group of people frustrated with prevailing trends in the publishing industry (which is constantly on the lookout for the next Da Vinci Code, as if one weren't enough) and with mainstream media (where reviewing space shrinks constantly) win recognition for a worthy, but otherwise potentially overlooked, piece of fiction? Or, to put it another way: Do literary bloggers have any power? Considering  how many novels and short story collections they now publish, university presses may well want to monitor the results.

Members of the co-op will take turns serving on a five-person nominating committee, each member of which proposes a book. All members then read, debate and vote on the five titles. The winner will be announced on May 15. My efforts to get various people to leak the current slate of books under consideration have come to nothing.

But on Friday, Daniel Green, who was until recently an adjunct instructor in English at the University of Maine at Presque Isle, did agree to answer some questions by e-mail about the whole process. Green's blog The Reading Experience is part of the co-op. He also contributes to The Valve, sponsored by the Association of Literary Scholars and Critics. (For a critical take on the ALSC's sponsorship, check this out.)

(One passage in the transcript below perhaps requires clarification for readers not up to speed on contemporary cultural exotica. Green refers in passing to "ULA-type 'transgressive' fiction" -- an allusion to the Underground Literary Alliance, a group best known for denouncing all other writers as effete elitists who are terrified of the ULA's plebeian manliness. Those not persuaded by the polemics note that all ULA fiction tends to resemble an uninspired imitation of Charles Bukowski by some inebriated adolescent recently hit on the head with a bowling ball.)

Q:  I'm struck by the sense that the Litblog Co-op embodies a strong criticism of how mainstream book publishing and reviewing are organized, with a tiny percentage of new novels getting a strong push, and the rest being left to fend for themselves. At the same time, the fact that you will be urging readers to take a chance on a novel without much of a market presence seems to be a rejection of what we might call the Amazon algorithm -- the notion that if the reader enjoyed X and Y, then the next logical choice is Z. Was there a shared sense, implicit or explicit, of what is wrong with things as they now stand?

A:   I definitely see the whole enterprise as a repudiation of the status quo in book publishing. I've put up some posts on my blog expressing rather forcefully my dismay at the status quo. (For example, see this and this.) Most of the other members of the co-op are critical of contemporary publishing as well, but perhaps they're not as cynical as I am. I do think there was agreement that most of the "awards" being given out were designed to puff up the publishing industry, and had very little to do with identifying good books that serious readers might want to read. We wanted to fulfill this responsibility as much as we were able to, given that literary weblogs seem to be acquiring a little more "presence." There wasn't much talk about the Amazon syndrome, but obviously the spirit of the litblog co-op is opposed to the Amazon way of selling books.

Q: So what particular impact might this enterprise have? It seems that some care has been taken to define it as other than an award -- as if the intent is as much to influence readers as to recognize authors. But do the people involved have any larger goal, in terms of influencing the larger literary culture?

A: You're right that the intent is to influence readers. Thus the "selection" is simply called "Read This." I think that all of the participants believe that litblogs have reached an untapped, or at least undertapped, source of readers for both contemporary fiction and (in my case, at least) the critical discussion of literature more broadly. I also think that most of us hope that our quarterly selection and, if it catches on, the popularity of same, will serve notice to publishers and to the editors of book reviews and magazines that this audience exists. I myself don't have any illusions that serious fiction of the sort we're promoting will suddenly become very popular, or that the litblog co-op will begin to wield enormous influence, but I would hope that our selections would bring additional attention to worthy books from smaller or less well-endowed presses. Probably everyone would agree that that is the main goal.

If writers, readers, editors, book columnists, etc. would pay more attention to litblogs and to the tastes in fiction we're expressing, that would be nice. Not because of the attention per se but because we're illustrating that there are serious readers of fiction in this country.

Q: How much of your internal discussion has been on the merits of any particular title, and how much on the overall standards for what books are worth considering?

A:  To some extent, the deliberations on merit are just beginning. We've yet to make the selection, and I don't really know how much contention there will be. The bloggers involved do have diverse interests, but a surprising number of us really do like books that are "off the beaten track." The nominating process was free of contention. The standards/criteria for eligible books were worked out during the listserv discussions, but they have been left pretty wide open. The real goal is to focus during the nominating process on books that aren't being well-promoted in the mainstream press.

Q:  You just referred to books that are "off the beaten path," and not served well by the status quo. Can you characterize things more precisely than that? Do the books now under consideration have anything in common at a literary (rather than publishing) level?

A:   I'm looking for fiction that takes risks, but that also has a sense of literary craftsmanship. I'm not looking for ULA-type "transgressive" fiction that doesn't really transgress anything except aesthetic sensibility. Although I look first for fiction that is interesting on a formal level, I don't by any means rule out books that also "say something" -- as long as it's not just the same stuff everybody else is saying. Some of the nominated books do these things, while, as far as I'm concerned, the jury's still out (literally) on others. Speaking only for myself, I want to avoid a situation where we start looking for the "representative" -- so many from a certain gender, so many from a certain ethnicity, so many expressing a known point of view, etc. I want to identify books whose authors are committed first of all to extending what's possible in fiction as a literary mode. I think we will probably also be more open to genre fiction than other awards or book selections tend to be.

Q: Last week the Associated Writing Programs (the professional organization for creative writing professors) had its convention. That made me wonder about something. I don't know how many participants in the Litblog Co-op have graduated from MFA programs in creative writing, but at least a few did. Will that have any effect on the process?

A: My suspicion is that more than a few of the participating bloggers will actually look askance at books by authors from MFA programs -- or at least the high profile programs. A few others will consider such books as just as deserving of attention -- if they're good -- as any others. So far, MFAers have not been that welcoming to literary blogs, nor have many creative writing programs themselves taken much note of what's going on vis-a-vis blogs and their influence on reading tastes. To the extent that MFA programs are perceived as part of the "literary establishment," there will probably be some resistance among some co-opers to emphasizing writers who've come from them.

I just put up a post on creative writing, so most of what I think of it can be found there (also in an earlier essay). Some good writers go through creative writing programs, but there's an awful lot of stagnation as well.

Q:  You've seen the shaping of "Read This" from the inside. Would you comment on how difficult this sort of thing is to organize? How practical would it be for academic bloggers to unite to do something comparable -- say, for nonfiction from university presses that might have a potential audience among nonspecialist readers?

A: From my point of view it didn't seem that difficult. But Mark probably would disagree. He contacted a lot of people -- publishers, editors, publicists, etc. Even now he's busy getting some media notice for the project. He's pretty committed to this, so perhaps he would say it's an effort worth the labor involved. But I'm sure there is labor involved. The listserv discussions were mostly productive, and the logistics were all worked out gradually. Again, Mark drafted the "charter," and then the rest of us made revision suggestions. To me, it seemed to be a group of people who were excited about what they were doing and thus worked things out relatively harmoniously. If a comparable group of academic bloggers were able to approach things in a similar way, I'm sure it could be done. Academic egos being what they are, however....

I can't say I'm perfectly happy with every detail of the final product (probably no one else in the group could say so, either), but, all in all, it was a worthwhile endeavor that, as we were putting it together, seemed enjoyable as much as anything else.

Q: Any final thoughts?

A: Only that I was flattered to be invited to be part of the original group of co-op bloggers, and that the amount of interest our "selection" seems to be gathering suggests that in its very short existence litblogging has managed to establish itself as a medium of some influence and potential value.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on (usually) Tuesdays and Thursdays.

Write On

Later this year, I'll give a paper at the annual convention of the American Political Science Association. For someone who is not a political scientist, this is a bizarre prospect -- like one of those dreams in which you must take a final exam in a course you’ve never actually taken. My topic involves tracing one strand of neoconservative ideology back to its source in a far-flung mutation of Marxist theory. I’ve been doing the research for about 20 years, off and on, without ever quite supposing that it would culminate in a presentation in front of a bunch of professors.

Then again, the matter is sufficiently esoteric that "bunch" may not be the exact word. Chances are there will be more than enough chairs.

In any case, a mass of old books and photocopies is now stacked up, to an unstable height, on my desk. And on top of the pile there is a notebook. The reading notes, the rough outline, the first draft or two ... all will be written there, in longhand.

My friends and colleagues are occasionally nonplussed to learn that someone trying to make a living as a writer actually spends the better part of his workday with pen in hand. (It's probably comparable to finding out that your doctor grows blood-sucking leeches in the basement.) Like an interest in the fine distinctions made by the ancient Trotskyists, my writing habits are idiosyncratic, anachronistic, and more or less impossible to justify in terms that make any sense given the state of 21st-century American culture.

Yet the rut is now too deep to crawl out of. I have my reasons. Or perhaps, to be more precise, my rationalizations. Not that they persuade anybody else, of course. It's particularly awkward when an editor asks for a progress report. There is a certain uncomfortable silence when I say, "Well, the notebook is almost full...."

Nowadays, the word "text" connotes an artifact that is "always already" digitized -- something to be fed into a streamlined apparatus for circulating information. But the word itself comes from the Latin root texere, to weave, as in "textile."

In my own experience, though, writing is not so much the crafting of paragraphs as it is a matter of laboriously unknotting the thread of any given idea. And the only way to do that is by hand. The process is messy and not terribly efficient.

Writing this column twice a week, for example, is a matter of juggling two legal pads of different sizes, plus anywhere from one to three notebooks. It is easy to detect which parts were written with a cup of coffee in one hand: The sentences are long, the handwriting spiky, the parentheses nestled one inside the next. By its penultimate phase, the draft is a puzzling array of arrows, boxes, Venn diagrams, and Roman numerals. (Also, as the case may require, whatever lower-case letters of the Greek alphabet I can still remember.)

The effect resembles the flow chart for a primitive computer program to be run on a wheezy old tube-driven UNIVAC.

Only as the deadline approaches is anything actually typed up, in a kind of spastic marathon. By that point, a certain passage from Walter Benjamin always comes to mind: "The work is a death mask of its conception."
 
Actually, with hindsight, it’s easy to see that Benjamin got me started on this erratic and circuitous course. In a collection of essays and fragments called One Way Street, he offers a set of aphorisms on writing, including the one just quoted. (First published in 1926, it is now available in the first of a four-volume edition of his work in English published by Harvard University Press.)

"Let no thought pass incognito," Benjamin insisted, "and keep your notebook as strictly as the authorities keep their register of aliens." (A line that became more poignant after the Nazis came to power, forcing Benjamin to spend the rest of his life in exile.)

But one passage in particular made a huge impression on me. "Avoid haphazard writing materials," admonished Benjamin. "A pedantic adherence to certain papers, pens, inks is beneficial. No luxury, but an abundance of these utensils is indispensable."

As if to clinch it, there is an interview that Roland Barthes gave in 1973 that seems to ratify Benjamin’s point. Under the title "An Almost Obsessive Relation to Writing Instruments," it was  reprinted posthumously in a collection called The Grain of the Voice: Interviews 1962-1980, published by the University of California Press.  

In a gesture very typical of his structuralist penchant for creating categorical distinctions, Barthes notes that his own writing process goes through two stages: "First comes the moment when desire is invested in a graphic impulse," said Barthes. It was a phase of copying down "certain passages, moments, even words which have the power to move me," and of working out "the rhythm of a sentence" that gives shape to his own ideas. Only much later can the text be "prepared for the anonymous and collective consumption of others through its transformation into a typographical object" -- a moment, according to Barthes, when the writing "is already beginning its commercialization."

Clearly the important phase is the one in which "desire is invested in a graphic impulse." And for that, you need the right tools. "I often switch from one pen to another just for the pleasure of it," Barthes told the interviewer. "As soon as I see a new one, I start craving it. I cannot keep myself from buying them."

The one exception was the Bic, which Barthes found disgusting: "I would even say, a bit nastily, that there is a 'Bic style,' which is really just for churning out copy...."

So the penchant for haunting stationery stores (and otherwise indulging a fetish for writing supplies) has the endorsement of  distinguished authorities. But my efficiency-cramping distaste for the computer keyboard is somewhat more difficult to rationalize.

Walter Benjamin and Roland Barthes died long before word processors were available, of course. But a good excuse not to write first drafts that way comes from the poet Ted Hughes, in a passage quoted by Alice W. Flaherty in her fascinating book The Midnight Disease: The Drive to Write, Writer’s Block, and the Creative Brain.

In an account of judging a contest for children's writing, Hughes recalled that the entries once tended to be two or three pages long. "But in the early 1980s," he said, "we suddenly began to get seventy and eighty page works. These were usually space fiction, always very inventive and always extraordinarily fluent -- a definite impression of a command of words and prose, but without exception strangely boring...."

In each case, the kid had composed the miniature magnum opus on a word processor.

"What’s happening," according to Hughes, "is that as the actual tools for getting words onto the page became more flexible and externalized, the writer [could] get down almost every thought or extension of thought. That ought to be an advantage. But in fact, in all these cases, it just extends everything slightly too much. Every sentence is too long. Everything is taken a bit too far, too attenuated."

Which sounds, come to think of it, somewhat like what Barthes called "Bic style." And quite a bit like the output of various academic presses it would be discreet to leave unnamed.

Not that writers had to wait for the advent of the word processor to produce work that was (in Hughes’s terms) "extraordinarily fluent" yet "strangely boring."

Indeed, in the mid-1920s, Walter Benjamin gave practical tips to scholars who wanted to impress their readers by clobbering them into a stupor. In a satiric chapter of One Way Street called "Principles of the Weighty Tome, or How to Write Fat Books," he laid out the principles that many still follow today.

"The whole composition must be permeated with a protracted and wordy exposition of the initial plan," Benjaim wrote. "Conceptual distinctions laboriously arrived at in the text are to be obliterated again in the relevant notes....Everything that is known a priori about an object is to be consolidated by an abundance of examples.... Numerous opponents who all share the same argument should each be refuted individually."

Benjamin himself never got an academic position, of course. Even so, good advice is timeless.  

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Strangely Enough

Over the past few days, an essay by Paul Maliszewski in the latest issue of Bookforum has stirred up a discussion that has been sometimes passionate, if seldom particularly well-informed.

In it, Maliszewski, who teaches creative writing at George Washington University, takes a close look at a lecture that Michael Chabon has given several times in which the Pulitzer-winning novelist recounts his childhood friendship with C.B. Colby, the author of Strangely Enough! and similar works of paranormal hokum, and also (Chabon says) the author of a Holocaust memoir called The Book of Hell, published under his real name, Joseph Adler. Only that, too, was a pseudonym. In fact, "Adler" was Viktor Fischer -- a Nazi journalist who, after the war, concealed his identity, even to the extent of having a concentration-camp serial number tattooed on his arm.

Maliszewski, who heard Chabon give the lecture a few times, reports that the audience listened with fascination and horror. "The only problem was," he writes, "the personal story Chabon was telling, while he may have presented it as an authentic portrait of the artist, just wasn't true. There was no Adler; and no Fischer either, for that matter. Nor does there exist a Holocaust memoir called The Book of Hell, nor an investigation by The Washington Post. There is a young-adult book titled Strangely Enough!, which is pretty much as Chabon describes it; and it is written by a man named Colby -- though he wasn't, it must be said, a Nazi journalist who disguised himself as a Jewish survivor and holed up in the Maryland suburbs, but rather a real author, based in New York City and residing in Westchester County, who served in the US Air Force Auxiliary after World War II. . . .”

While the essay has been in print for a few weeks, only a portion of it is available online. It provoked little discussion until the appearance, on Monday, of a gossipy and curiously inane New York Times article suggesting that Maliszewski's article was itself something of a hoax.

The stakes of the discussion are high: they include the role of the Holocaust in American Jewish identity, the ethical dimension of storytelling, and the fine line between fantasy and the will to believe. But the terms of the argument have degenerated at impressive speed. People who haven’t bothered to read the essay are denouncing Bookforum for irresponsibility in publishing it, and attributing all sorts of interesting motives to Maliszewski. There is a certain vigor of hysteria that goes with confusing uninformed indignation with critical perspective.

The particulars of the case are not up for dispute. Maliszewski demonstrates that Chabon’s lecture is a fiction. A search of relevant databases confirms that no title called The Book of Hell by Joseph Adler exists, nor was there (as Chabon stated) an expose on the author’s true identity  in The Washington Post.

On Tuesday, I spoke by telephone with Eric Banks, the editor of Bookforum, who said, "We did due diligence with the article. There was no intention of scandalmongering. It’s not that kind of piece, in any way, shape, or form. You can argue about the extent to which Chabon drops hints that the lecture is meant to be understood as the product of an unreliable narrator. But our factchecker listened to a recording of Chabon’s lecture at least four times, and we felt like Maliszewski’s account of it is accurate."

The claim that Chabon meant his talk to be interpreted as a "tall tale" has been argued by Matthew Brogan,  the program director of the Jewish literary association Nextbook, which is making one version of Chabon’s lecture available online.

But what has really sent the dispute into the red zone of bitter conflict is Maliszewski's admission that he has more than a casual interest in the question of hoaxes. In a series of essays for The Baffler, he recounts working as a business reporter during the economic boom of the late 1990s. Writing under a number of pseudonyms, he began submitting letters to the editor that mimicked what he found particularly obnoxious about the market-worshipping mentality of The Business Journal of Syracuse, New York, the publication where he worked.

During the Teamsters strike of 1998, for example, one Maliszewski persona wrote in to wonder why it was such a big deal that UPS had been fined $4 million for more than 1,600 worker-safety violations. After all, the company was very profitable. And think how much more money it could make if it had twice as many violations. They should just consider the fines an operating expense.

His exercises in crackpot punditry started appearing in print, and Maliszewski entered the ethical grey zone where satire and hoaxery meet. He is now writing a book on the topic, with the essay on Chabon as part of it. (For now, you can read some of his Baffler confessions here.)

Because Maliszewski has first-hand experience of perpetrating a hoax, the Times article and its parrots in the blogosphere have suggested that he is disqualified from ever commenting on the phenomenon. That notion is every bit as rational as demanding that sex research be done by certified virgins.

But there is another, stronger, and even more dubious axiom just beneath the surface of the discussion. It is the implicit belief that (so to speak) all hoaxes are created equal. They are morally and epistemologically identical. In particular, it assumes that Stephen Glass or Jayson Blair provide a sort of key to everything -- that hoaxing is, in effect, just a way of parlaying the time-saving convenience of fabrication into social status and ready money.  

Well, things are not always quite that straightforward. Of course, some literary hoaxes are of purely mercenary inspiration, such as the phony Howard Hughes memoirs of the early 70s or the forged Hitler diaries 10 years later. But it can be a very different matter when the hoax is intended primarily to make a satirical point -- a category that includes the reductio ad absurdum of a pseudonymous letter to the editor suggesting that a company violate more OSHA regulations to increase its profitability.

Then there are the satirical hoaxes that go horribly wrong, as happened with Report From Iron Mountain, published at the height of the Vietnam War. A parody of what C. Wright Mills once called the “crackpot realism” of the Cold War era, the Report pretended to think the unthinkable -- namely, to imagine how society could still get the benefits of war preparations in the unfortunate event of world peace. The document suggested instituting “a modern, sophisticated form of slavery” and “socially oriented blood games,” as well as fabricating “an established and recognized extraterrestrial menace.” (All of this proposed in a near-perfect imitation of social-science and think-tank prose.)

Although prepared as a critique of the military-industrial complex by humor writer Leonard Lewin -- with advice from Nation editor Victor Navasky -- the bogus Report was rediscovered in the 1990s by the militia movement, whose leaders decided that it was the blueprint for the New World Order. It was reprinted by the Free Press in 1996, in an edition intended to debunk the militias' enthusiasm for it by revealing the hoax. It was an occasion for much chuckling at the pathetic yahoos. But during the 1960s and 70s, the hoax had been effective enough to fool at least some academic social scientists.

Finally, there are the hoaxes that seem to defy any simple explanation. A case very much in point (one that Maliszewski cites in his essay on Chabon) is the memoir Fragments, by Binjamin Wilkomirski, in which the author recounts his childhood in a Nazi death camp in Poland. It was published to favorable reviews and became an international best-seller. Researchers have shown that Wilkomirski actually grew up in Switzerland, where he was raised in a secure, middle-class home. But he appears to believe his own story. You can certainly call his book a hoax, though doing so raises far more questions than it answers.

Maliszewski's essay in Bookforum is alive to such problems -- which is what makes it particularly disgraceful that so few people have bothered to weigh its actual argument before denouncing its author. From the commentary, you might suppose that Maliszewski's purpose is to trash Michael Chabon -- to call him out as a fraud, at best, or perhaps someone with the mental problems implied by Binjamin Wilkomirski's fantasies of a childhood in hell.

If you take the time to read the essay, though, you find a nuanced and searching analysis of the relationships between author and audience, between memory and fantasy, between story-telling and truth-telling. Maliszewski’s point is less that Chabon intends to trick his audience than that (for a variety of reasons) his listeners want the story to be true.

Nor is the corrosive effect of that desire mitigated by dismissing Chabon's lecture as a "tall tale." There was actually someone named C.B. Colby who published a book called Strangely Enough! He wasn't a Holocaust survivor or a secret Nazi -- just a volunteer fireman, library-board member, and author of children's books. "Real life," writes Maliszewski, "apparently requires exaggerated stakes -- a few teaspoons of the Holocaust, say, or some other dramatic supplement to fortify the work's seriousness."

I’ve been trying to figure out why it doesn’t shock me so much to find this point being made by someone who has himself been on the other side of the process of fabrication. Perhaps it is the realization that there is a pretty good precedent for serving as both hoaxer and debunker.

Edgar Allan Poe perpetrated his share of raids on the public's gullibility, including a couple of bogus news stories involving balloon flight. But Poe also published essays deducing that a famous 19th century chess-playing robot actually had a midget inside. Either way, there is the same fascination with the mechanisms of the hoax -- with the game of revelation and concealment, the careful balance of plausibility and outrageousness.

Perhaps it’s a good thing Poe isn’t around now. The Times would denounce him as a fraud, and the bloggers would go nuts. On the other hand, it’s entirely possible that he might turn the whole situation to his own advantage. After all, when enough people feel entitled to form an opinion without the inconvenience of thinking very hard, the hoaxster’s work is already halfway done.

Full disclosure: I do not know Paul Maliszewski very well, but have met him on a couple of occasions, and have read some of his short stories and essays in various literary magazines. From a conversation, it appears that he shares my great fascination with the strange history of Report From Iron Mountain -- a factor which may have biased this column somewhat. His interest in literary hoaxes strikes me as, well, pluperfectly literary.

Be that as it may, it ought to be a prerequisite for any future commentary regarding his essay on Michael Chabon that the discussants have actually read it. It would also be useful to everyone if Chabon himself commented on the matter, which so far he has not done.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Impure Literature

The publication, 100 years ago, of  The Jungle, by Upton Sinclair, in the popular American socialist newspaper Appeal to Reason had an enormous effect -- if not quite the one that its author intended. "I aimed at the public’s heart," Sinclair later said, “and by accident I hit it in the stomach.”

Drawing on interviews with workers in Chicago and his own covert explorations of the city's meat-processing factories, Sinclair intended the novel to be an expose of brutal working conditions. By the time it appeared as a book the following year, The Jungle's nauseating revelations were the catalyst for a reform movement culminating in the Pure Food and Drug Act. In portraying the life and struggles of Jurgis Rudkus, a Lithuanian immigrant, Sinclair wanted to write (as he put it) "the Uncle Tom's Cabin of wage slavery," thereby ushering in an age of proletarian emancipation. Instead, he obliged the bourgeoisie to regulate itself -- if only to keep from feeling disgust at its breakfast sausages.

In his introduction to a new edition of The Jungle, just published by Bedford/St. Martin’s, Christopher Phelps traces the origins and effects of Sinclair’s novel. Phelps, an associate professor of history at Ohio State University in Mansfield, is currently on a Fulbright fellowship in Poland, where he occupies a distinguished chair in American studies and literature at the University of Lodz. The following is the transcript of an e-mail interview conducted this month.

Q: At one of the major chain bookstores the other day, I noticed at least four editions of The Jungle on the shelf.  Yours wasn’t one of them. Presumably it's just a matter of time. What’s the need, or the added value, of your edition? Some of the versions available are pretty cheap, after all. The book is now in the public domain.

A:  Yes, it’s even available for free online these days, if all you want is the text. This new edition is for readers seeking context. It has a number of unique aspects. I’m pleased about the appendix, a report written by the inspectors President Theodore Roosevelt dispatched to Chicago to investigate Upton Sinclair’s claims about the meatpacking industry. In one workplace, they watch as a pig slides off the line into a latrine, only to be returned to the hook, unwashed, for processing. No other version of The Jungle includes this report, which before now had lapsed into obscurity. The new edition also features an introduction in which I survey the scholarship on the novel and provide findings from my research in Sinclair’s papers held by the Lilly Library at Indiana University. Finally, there are a lot of features aimed at students, including a cartoon, a map, several photographs, a bibliography, a chronology of Sinclair’s life, and a list of questions for discussion. So it doubles as scholarly edition and teaching edition.

Q: Let me ask about teaching the book, then. How does The Jungle go over in the classroom?

A:  Extremely well. Students love it. The challenge of teaching history, especially the survey, is to get students who think history is boring to imagine the past so that it comes alive for them. The Jungle has a compelling story line that captures readers’ attention from its very first scene, a wedding celebration shaded in financial anxiety and doubts about whether Old World cultural traditions can survive in America. From then on, students just want to learn what will befall Jurgis and his family. Along the way, of course, Sinclair injects so much social commentary and description that teachers can easily use students’ interest in the narrative as a point of departure for raising a whole range of issues about the period historians call the Progressive Era.

Q:  As you've said, the new edition includes a government report that appeared in the wake of the novel, confirming the nauseating details. What are the grounds for reading and studying Sinclair's fiction, rather than the government report?

A:  Well, Teddy Roosevelt’s inspectors had the singular mission of determining whether the industry’s slaughtering and processing practices were wholesome. Sinclair, for his part, had many other concerns. What drew him to write about the meatpacking industry in the first place was the crushing of a massive strike of tens of thousands of workers led by the Amalgamated Meat Cutters and Butcher Workmen of North America in 1904. In other words, he wanted to advance the cause of labor by exposing the degradation of work and exploitation of the immigrant poor.

When The Jungle became a bestseller, Sinclair was frustrated that the public furor centered almost exclusively on whether the companies were grinding up rats into sausage or disguising malodorous tinned beef with dyes. These were real concerns, but Sinclair cared most of all about the grinding up of workers. I included this government report, therefore, not only because it confirms Sinclair’s portrait of unsanitary meat processing, but because it exemplifies the constriction of Sinclair’s panorama of concerns to the worries of the middle-class consumer.

It further shows how Sinclair’s socialist proposal of public ownership was set aside in favor of regulatory measures like the Pure Food and Drug Act and Meat Inspection Act of 1906. Of course, that did not surprise Sinclair. He was proud, rightly so, of having been a catalyst for reform. Now, just as the report must be read with this kind of critical eye, so too the novel ought not be taken literally.

Q:  Right. All kinds of problems come from taking any work of literature, even the most intentionally documentary, as giving the reader direct access to history.

A: Nowadays The Jungle is much more likely to be assigned in history courses than in literature courses, and yet it is a work of fiction. You point to a major problem, which we might call the construction of realism. I devote a good deal of attention to literary form and genre in my introduction, because I think they are crucial and should not be shunted aside. I note the influence upon The Jungle of the sentimentalism of Harriet Beecher Stowe, of naturalist and realist writers like William Dean Howells and Frank Norris, and of the popular dime novels of Horatio Alger. Sinclair was writing a novel, not a government report. He fancied himself an American Zola, the Stowe of wage slavery.

A good teacher ought to be able to take into account this status of the text as a work of creative literature while still drawing out its historical value. We might consider Jurgis, for example, as the personification of a class. He receives far more lumps in life than any single worker would in 1906, but the problems he encounters, such as on-the-job injury or the compulsion to make one’s children work, were in fact dilemmas for the working class of the time.

In my introduction, I contrast the novel with what historians now think about immigrant enclaves, the labor process, gender relations, and race. There is no determinate answer to the question of how well The Jungle represented such social realities. Many things it depicted extremely well, others abominably, race being in the latter category. If we keep in mind that realism is literary, fabricated, we can see that Sinclair's background afforded him a discerning view of many social developments, making him a visionary, even while he was blind in other ways. Those failings are themselves revelatory of phenomena of the period, such as the racism then commonplace among white liberals, socialists, and labor activists. It's important that we read the novel on all these levels.

Q: Sinclair wrote quite a few other novels, most of them less memorable than The Jungle. Well, OK, to be frank,  what I've heard is that they were, for the most part, awful. Is that an unfair judgment? Was The Jungle a case of the right author handling the right subject at the right time?

A:  That's precisely it, I think. Sinclair was uniquely inspired at the moment of writing The Jungle. I've been reading a lot of his other books, and although some have their moments, they sure can give you a headache. Many of them read like failed attempts to recapture that past moment of glory. He lived to be ninety and cranked out a book for every year of his life, so it's a cautionary tale about allowing prolixity to outpace quality. The book of his that I like best after The Jungle is his 1962 autobiography, a book that is wry and whimsical in a surprising and attractive, even disarming, way.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

The Death of English

This year, even The New York Times could not work up the enthusiasm to give a real pillorying to the annual meeting of the association for English and comparative literature professors, the Modern Language Association. Back in the day, the media would have a field day with the MLA panel listings and their wacky titles, and they would follow up their fish-in-a-barrel shooting by visiting an MLA cocktail party and commenting upon the sad efforts of earnest English profs to look hip, cool and vaguely relevant.

A velvet jacket, a matching tie, a paper title that managed to combine canonical figures like Jane Austen with trendy theoretical currents like deconstruction or “outrageous” sexual behavior like masturbation were all fair game. But this year, it seemed, all the sport had gone out of the hunt for the pretentiously self-important humanities professor, and we had to settle for a lame review with its requisite cringing at paper titles with the word “lesbian” in them or “black” or “postmodern,” or, God forbid, “postmodern black lesbian.” What’s gone wrong with our blatant attempts to convert the student population to communism by thinking up outrageous paper titles that combine sex and politics? Are we not owed at least a decent swipe, a casual cuff by the media that be to remind us that they and not we control the minds and hearts of America?

I guess the sad truth is that no one really thinks (if they ever did) that English as a discipline poses a real threat to the status quo. The Culture Wars of the 1980s and 1990s led some of us to believe that the end of the canon, the end of seemingly objective appraisals of “aesthetic complexity” through close readings, the end of the representation of the culture of white males as culture per se, meant that some major battles in the politics of representation had been won. Some scholars, however, suspected that the battle had simply shifted elsewhere, and so while the critiques of the canon held strong, and while courses on queer theory, visual culture, visual anthropology, feminist theory and literary theory began to nudge the survey courses, the single-author studies and the prosody classes aside, the discipline itself lost currency faster than the dollar. Nowadays, some English departments and most comparative literature departments are beset by massive declines in enrollment and petty squabbles within the ranks.

The Birmingham School in England in the 1970s probably brought an end to English as we know it by proposing that the study of a small selection of texts written in English by a small group of mostly male white writers served to legitimate certain class interests in the university and elsewhere. By recognizing the predictable and unpredictable effects of culture upon politics and by insisting that the university begin to reflect the new forms of class and racial community in a postcolonial Britain, Stuart Hall, Raymond Williams and others buried the notion of literary study as a review of a great tradition and a consideration of what made it great.

The work that emerged from the Birmingham School, and that came to be called cultural studies, has combined with postcolonial studies, black studies, queer studies, ethnic studies and women and gender studies to create the humanities as we know it. It has spawned the constellation of debates and arguments about empire, subjectivity, hegemony, resistance, subversion, imagination and representation that currently occupy contemporary academics and that briefly but powerfully shape the lives and consciousnesses of the students who pass through humanities classrooms, as well as of others who encounter versions of these conversations in the public sphere.

I do not want to be misinterpreted as saying that English, and all those who teach in the discipline, are redundant, or that the conflicts that made English such a great site for vigorous debate for so long are over. Rather, the study of culture, and of the function and meaning of culture, has moved far beyond the boundaries of the English department, and rather than respond by expanding, morphing, shifting and transforming into some other kind of discipline or inter-discipline, English professors have made and keep making the mistake of digging in. We in English need to update our field before it is updated for us by administrations wishing to downsize the humanities, and before student questions about the relevance of the 18th-century novel or Victorian poetry or Restoration drama become a referendum on the future of the field.

And it is not that the 18th-century novel or Victorian poetry or Restoration drama have become irrelevant as areas of study; it is that the way we pursue the teaching of genres and periods has not kept up with the way we study and write about culture and literature and history. The beauty of English as a discipline in the last decade has been how flexible the field became, how receptive to new scholarship, how hospitable to queer theory, feminist studies, the study of race and ethnicity, political economy, philosophy and so on. "English" is in fact the anachronistic name we give to a far more protean field of interests and animating concerns; and the fights that we now have over English, over its relationship to the interdisciplinary forms it has given rise to, are really the aftershocks of an event that is well past.

I propose that the discipline is dead, that we willingly killed it, and that we must now decide, as serious scholars and committed intellectuals, what should replace it in this new world of anti-intellectual backlash and religious fundamentalism. While we may all continue doing what we do — reading closely, looking for patterns and disturbances of patterns within cultural manifestations, determining the complex and fractal relations between cultural production and hegemonies — once we call it something other than "English" (cultural studies, critical theory, theory and culture, and so on), it will neither look the same nor mean the same thing, nor will it occupy the same place in relation to the humanities in general or within administrative plans for downsizing; it will also, I propose, be better equipped to meet the inevitable demands (which already began to surface after the last election) for an end to liberal bias on college campuses and so on.

Recent debates in women’s studies have led to the renaming of many of these departments. Some are now called women and gender studies, others have become critical gender studies or just gender studies. In the process of changing from women’s studies to critical gender studies, these programs have rearticulated their theoretical projects and shifted the emphasis away from reclamations of lost pasts and affirmation of neglected perspectives and towards the consideration of transnational feminisms, gender and globalization, gender and sexuality in relation to race and so on. In other words, a change of name changes everything and reflects everything that has already changed: it signals a re-conceptualization of the field, its foci and its methods and it notes an historic shift in the politics of knowledge.

English departments are now regularly supplemented in humanities divisions by interdisciplinary programs like American studies, Modern Thought & Literature (Stanford) and History of Consciousness (University of California at Santa Cruz). These interdisciplinary programs emerged as the result of shifts in the discipline that English could not accommodate and, in my opinion, they should be able to replace the traditional English department in the future by recognizing the impossibility of studying literature separate from other forms of cultural production and by exposing the counter-intuitive logic of building Humanities divisions around departments dedicated to the study of the literature and culture of the British Isles. American national culture, after all, does not derive in any obvious way from Britain and it certainly cannot any longer claim stronger links to British cultural history than to the cultural histories of the Americas or the Pacific Rim.

In a recent book titled The Death of a Discipline, Gayatri Spivak, one of the humanities’ most important and effective spokespeople, argues along slightly different lines that comparative literature as we know it needs a makeover to acknowledge the move that has already been made away from comparative studies of European literatures toward studies of the literatures of the Global South. Spivak argues that comparative literature and area studies, like certain forms of anthropology, constitute a colonial legacy in terms of the circulation of knowledge, and that in order to confront and replace such a legacy, we have to reconstitute the form and the content of knowledge production.

The argument is typically elliptical but powerful and timely. Surprisingly, however, Spivak does not see the reorganization of the humanities as part and parcel of the rise of cultural studies, queer studies and ethnic studies; indeed, she tends to cast these interdisciplinary rubrics as part of the problem. For example, in an unfortunate move designed to recognize and hold on to the importance of the "close reading," Spivak designates "close reading" as a usable skill in the new comparative field she envisions and she prefers it to another kind of intellectual labor that, in her opinion, has come to be associated with the entirely "unrigorous" fields of ethnic and cultural studies, namely "plot summary."

Spivak’s designation of the method of cultural studies and ethnic studies as “plot summary” is supposed to indicate to the reader how mired these fields have become in plodding, identitarian concerns; “plot summary,” in her usage, indicates a crude tendency to rehearse the “what” rather than the “why” or “how” of political process and cultural production.

I want to make common cause here with Spivak’s diagnosis of the problems of the discipline. But, while Spivak’s investment in the “close reading” and formalism betrays the elitist underpinnings of her proposals for reinvention, I urge a consideration of non-elitist forms of knowledge production upon the otherwise brilliant formulations of The Death of a Discipline. If the close reading represents a commitment to a set of interpretive skills associated with a very particular history of ideas and a very narrow set of literatures, the plot summary indicates a much wider commitment to knowledge production, high and low. As any freshman comp instructor knows, the plot summary is a skill rarely mastered by the freshman writer, and so even at the point when the neophyte English major is eating up metaphors and similes in gorgeous close readings of even the most banal passages, the plot summary, the skill of saying what has happened succinctly and engagingly while separating the relevant from the irrelevant, the meaningful detail from the misleading truism, remains elusive.

Clearly we need both close reading skills and plot summaries to grapple with the confusing political realities of our times: what is the plot summary of the last election, for example, or of the drama of the red versus the blue states? What is the plot summary for the narrative of gay marriage? How do we say what happens in a novel like Ulysses or Mrs. Dalloway? What is Michael Cunningham’s novel The Hours but a gorgeous summation of the plot of Mrs. Dalloway? What happens in Toni Morrison’s Beloved, in W.G. Sebald’s Austerlitz? What on earth is the plot outline of The SpongeBob SquarePants Movie?

Being able to say what happens ultimately encapsulates the ability to say why and how it happens, and for students and non-academics, plot summary may be a fruitful, relevant and crucial form of intellectual engagement. Still, Spivak’s obituary for a fallen discipline is timely and important, and it begins the hard work of mapping a future for the interdiscipline of literary and cultural studies in terms of the development of more non-European language skills, more engagement with non-European literatures and more recognition of what Dipesh Chakrabarty has termed the “provincialism of Europe.”

On the road to re-imagining our field or institutionally acknowledging how it has already changed, we have to make both practical and abstract changes. In addition to the proposals that Spivak and others have made for the future of the discipline, I would propose that we abandon the meat market hiring procedures of the MLA by breaking the discipline down into more manageable forms; we should also allow and encourage graduate students to write dissertations that do not nod to the canon or fall within the genre/period requirements of MLA hiring protocols.

We must imagine new categories of jobs: not Victorian Studies but studies of “Empire and Culture,” not 19th century American or English literature but “popular literatures of the Americas” or “modern print culture,” not romanticism but “the poetries of industrialization.” Or something. Let’s rename the interdisciplines within which we, and our students, work (Culture and Politics Program, World Literature, Global Cultures, Transnational American Culture) and let’s insist upon a wide range of language study at a moment when the United States is actively imposing monolingualism on an increasingly heterogeneous, multilingual population. Let’s serve up histories of English and American culture along with healthy doses of non U.S.-centric, non-contemporary cultural studies and let’s recognize that the university may be the last place in this increasingly conservative and religious country to invest in critical and counter-hegemonic discourse.

The end of English is not the wishing away of a traditional field, nor is it a fantasy of its replacement with something shiny, new and perfect; rather, it is the acknowledgement of the seismic shifts that have already changed the field and that have allowed for the rise of new forms of analysis and new areas of focus. In my career thus far, I have been in only two departments as a professor and each one, in its own way, has been committed to the transformation of the field.

At the University of California at San Diego, the literature department, which basically represents the new discipline that Spivak calls for, has never privileged the study of English and has always located the study of English in modest relation to the study of Spanish, French, German and Italian. And in more recent years, the UCSD literature department has recognized the Eurocentrism of focusing on those literatures to the exclusion of the literatures of Vietnam, Taiwan, China, India and so on. Hiring in that department has recently looked to Asia, to the Americas, and to a comparative version of Europe for the rubrics that organize the field.

At my new job at the University of Southern California, recent clusters of senior hires have allowed a fertile mix of the study of music, popular culture, sexuality and race to combust with the already impressive breadth of interests that characterized the department, and have allowed folks to contemplate the meaning of the field in new and exciting ways. The end of English is not the end of the relevance of the study of the literature of the British Isles; it is simply an opportunity, as I have found, to place that literature, English and others, in context in a rapidly changing world and on behalf of the invention of a new intellectual function in the humanities. Let’s hope that in another decade The New York Times does not have to attend one monstrous conference like the MLA to report on a bundle of provocative titles, but is instead forced to spend the entire year reporting in meaningful ways on the reinvigoration of the humanities after the death of English.

Author/s: 
Judith Halberstam
Author's email: 
info@insidehighered.com

Judith Halberstam is a professor of English and director of the Center for Feminist Research at the University of Southern California.

Finding Her Place

There’s nothing like a class reunion for putting you in your proper place.

Last weekend I went to my first one -- the 25th anniversary of my graduation from college. In years gone by, it never seemed like a good time to go back to my alma mater, La Salle University. First I wasn’t making much money. Then I didn’t have a kid, own a house, or have tenure. My classmates, to judge by the alumni publications, were all well into six-figure incomes and had at least three kids each by the time of our 10th or 15th anniversary gathering. I couldn’t bear to go.

But this year I ran out of excuses. I’d published a fair amount, including a book; I’d served as department chair; I’d been promoted to full professor. I had little or nothing, professionally or personally, to be embarrassed about anymore. I could hold my head high amongst my peers from the class of 1980.

So I set off for the five-hour ride south on I-95 to Philadelphia. I planned to stay at my mother’s house, to arrive two hours early, shower, fix my hair, and change into the fabulous new outfit I’d bought for the occasion -- the first new clothes I’d bought in ages. Five hours later I was still two hours away from my college, listening to a Harry Potter book for the fourth or ninth time and cursing myself for not having gone to the public library for a new book on tape.

The reception I’d been looking forward to, the one where I’d see all my old friends from the school newspaper, was fast approaching, and I was not. There was no time to drive to my mom’s to shower and change. I would have to go straight to the college in my jeans and sneakers and change in a bathroom.  

But then it hit me. I work at a college; I know the way alums are treated. So I called the alumni office and explained my plight. No problem, they assured me. They had a spare townhouse in which I could shower and change and still make it to the reception on time. I did so and arrived at the reception, clean, before any of my friends who actually live in Philadelphia.

I’d never been an alum before, not in person. It was all new to me -- the open-bar parties, the crab-cake hors d’oeuvres, the alumni office staff treating me like visiting royalty. I guess they never know who has money and who doesn’t, so they’re nice to everyone.  

I had a great time at the reception, which honored one of my favorite teachers, the economics professor who runs the college’s honors program. It was great to see him and to see him praised. In his speech, he even singled me out, which seemed to me to be patently unfair to the arguably much more successful alums in the room, including the many lawyers, one of whom is a state representative. They were old news, as they’d all been back before. I was the prodigal, back after 25 years.

The dynamics among my friends, the college newspaper set, had not changed a bit. One old sports editor still made fun of the counterculture choices and left-wing politics of another former sports editor; my old roommate laughed at both of them and did her best to keep the peace. The old photo editor, now a corporate lawyer, retained his photographer’s distance from the action, fond of all parties and unwilling to take sides. The state representative drifted in and out; I wondered whether she was saving me from the awkwardness sure to arise if we ended up in a political conversation.

The photo editor and I went off to tour the college’s excellent art museum, and I found myself in a different kind of conversation, one much more familiar in recent years. The museum’s curator, it turns out, is an alum of the institution where I teach. She and I talked about the college, its new president, the new strategic planning committee, and what the campus was like when she attended.  Now I was on safe ground, representing the life I currently lead without having to explain it. This was a persona I found easy to inhabit, and it was a bit of a relief after negotiating how to talk to people I hadn’t seen in more than 20 years.

The next night was the class of 1980 dinner. I’d looked at the RSVP list and had known almost no one except my roommate, so a lot was hinging on whether we could sustain a conversation through an entire dinner. We’d already exchanged photos of our children the night before; what if we had nothing left to say to each other?

As I approached the student union building tentatively, not sure where the dinner was, I stopped to chat with some dining services staff who were taking a break outside in the late-afternoon sunshine. One of them admired my new outfit, and I told her how excited I’d been to find it, in a little import shop not far from my house. We talked about the price (not bad at all, they commented) and the various accessories, and they envied me the little shop. I confessed to wanting to look good in front of a bunch of people I hadn’t seen in more than 20 years.  They told me not to worry: “You got it going on, girl!” I hoped they were right.

After chatting with the college’s president over drinks -- how much easier it is to talk to a college president now -- I sat with my old roommate at dinner and was relieved to find that we liked each other still, or was it again? She was working for a charitable foundation after years at big accounting firms, and I was amused to see the new schmoozing skills she’d acquired in her fund-raising work. I’d seen those skills before, in our own development office staff.

At yet another party after the dinner, I stopped to talk to an elderly woman who’d been at our class dinner but whom I didn’t remember from any of my classes. She told me that when the college first went coed, in the 1970s, some of the male students had suggested to her, a 55-year-old worker in the cafeteria, that she take some classes.  She enrolled in the evening division, tuition free for college employees, and finished up the year I did, with a degree in sociology. The college helped her get work at a women’s shelter, and she worked there until she retired 10 years later. She was so grateful to the college, she said: “They were the best years of my life, when I was taking those classes.”  

I think I eventually figured out where I fit in that funny anthropological experiment that was the reunion. Somewhere between the cafeteria workers who liked my outfit and the lawyers and corporate vice-presidents with whom I got re-acquainted at the parties, I found myself as an alum. No need to compete in terms of social class or income when you have a Ph.D. and an academic job. No need to be embarrassed (or proud) about driving the little Ford or not sending my daughter to private school. The class position of the academic had social capital enough, for better or worse, to pull me through. Talking to the curator, the athletics administrator, the college president -- there I was in familiar territory. Hearing from that retired alum about what her bachelor’s degree had meant to her -- the story was different from the ones I hear at my current institution’s reunions, but the genre was the same.  

That’s why my old professor was pleased to see me -- I had staked my claim in the same place he had, in higher education. He remembered me as a working-class kid from the suburbs, and he was happy to have helped me see my way to a career in academe. I’m happy about it too, and I’m glad I gave the reunion a chance. Maybe I’ll do it again in another 25 years.

Author/s: 
Paula Krebs
Author's email: 
info@insidehighered.com

Paula Krebs is professor of English at Wheaton College, in Massachusetts.

Of Chivalry and Convention Badges

At the recent International Congress on Medieval Studies, at Western Michigan University, I received a mild surprise when I opened my envelope of conference materials: My badge had my name, the Congress logo, and a large blank space below my name where I was accustomed to seeing my college’s name.  

That’s right; contrary to the norms of academic conferences, the badge said nothing of where I was from. Wondering if this were a mistake, I quickly glanced around the room where confreres came and went and saw that no one had an institutional identity on his or her badge. Mirabile Mirabilis! Was this a new custom of the castle?

Well, at least there was now something to talk about at lunch. Of course, instead of glancing at a lunch partner’s badge and asking, "So what do you do at …?" I would have to ask, "So where are you from?" after which I figured the conversation could slip into safe, familiar channels (I’m used to this: The badges at the Conference on College Composition and Communication carry the conventioneer’s hometown rather than institution, so conversations quickly go the same way). If nothing else, there would be the topic of the blank spot below our names.

I imagined there would be inconveniences. This is a conference where many foreign accents and languages are heard, and it helps to know if someone is from Groningen, Gdansk, or the Gutenberg Press (hey, some of us have book proposals to pitch). And sometimes one is happy to run across another who works with out-of-touch friends and schoolmates, something that can only be discerned from seeing a university name upon the badge.

These nuisances aside, the blank space below my name seemed downright chivalrous, as befits a medieval studies conference (and there were a few sessions on chivalry). It was one of the polite fictions of chivalry that all knights were fundamentally equal in their knighthood regardless of whether one was the Holy Roman Emperor or a pauper who couldn’t afford to keep his charger in oats. As in chivalry, so it was here:  we’re all medievalists. Does anything else matter?

Well, yes, it does, and the polite fiction that all professors are created equal runs about as far as the selfsame fiction about knights.  

At a scholarly conference, almost everything conspires to convey the notion that research is the privileged activity of our profession, and that, ergo, those whose badges say “I research” are the worthiest -- a good reason, I suppose, for the conference organizers to gamely try to suppress that signifier (and, to its credit, the medievalists’ group regularly offers a handful of sessions exclusively devoted to teaching -- more than any other research-oriented conference I’ve attended).

It isn’t just the fact that we’re all here to exchange scholarship. When I catch up with a former professor from my prestigious grad program and begin waxing effusive about what I’m teaching, she cuts me short with, “But you are still writing, I hope” (I must be, since you are reading, but perhaps this isn’t what she meant). When my dissertation director asks me what I’m working on now, I know instinctively he’s not all that interested in how I taught the freshman research writing course.

Even at a session on “Teaching the Middle Ages at the Small Liberal Arts College,” a pleasant 90 minutes in which a tiny band commiserated and exchanged tricks for Monday morning (and got in the castle gate by disguising those tricks as scholarship), I felt a sense that we were huddled together, putting up a brave front against the profession’s real priorities. Other sessions discussed the hermeneutic practices of Cistercian monks and the play of signifiers in Chaucer. We discussed how to get our students to use the dictionary.

While, in the end, all of us at that session have found contentment, identity, and even a sense of calling in teaching 12-credit loads at small liberal arts colleges or second-tier state institutions, nary one of us sits at table or in session across from a nametag advertising “Harvard” or “U. California” or “Carnegie Mellon” without a touch of envy and an anxious vacillation between self-affirmation and self-doubt. But this is what I wanted, I tell myself. And I’m good at it. Teaching -- that’s what matters, anyway, not those obscure articles almost no one reads. Then again … could I have done something differently? Was it bad luck, or bad timing? Did I compromise too readily? Was I really not good enough?

As I popped my badge into its plastic holder, I mused that pride was the deadliest of deadly sins (there were sessions on those, too, for anyone who hasn’t yet mastered them) and wondered: how long could this comity last? Not very, it turned out. I signed in Thursday morning, but by Thursday afternoon pens had been drawn and ink had been spilled. The first sighting was of a young lady who had written in large, legible letters under her name, “U. Toronto.” Then a few more popped up: Cornell. Yale. UCLA. All were sported by young adults who seemed to be grad students preening in their quality coats of arms, and I mused, "Flaunt it while you’ve got it. It won’t be long before you’ll be grateful to sign a contract to teach remedial writing at the Jonathan Edwards School for Sinners in the Hands of an Angry God." (I admit it. I’m prone to envy, with its accompanying bitterness and spite.)

And indeed, as a few more nametags popped up with handwritten university names upon them, I remarked that no one was advertising that he was from The Diminutive College of the Magna Mea Culpa, That Affordable Place across Town, or the Jim Bowie College of Cutlery Science. The handful who wrote in their colleges all touted names that suggested prestige, privilege, and class. And that’s chivalry. Where you’re from is who you are. Descent is destiny. Some knights are more equal than others.

On Friday I lunched with my dissertation director and some fellow medieval drama folk. One among us, this time a senior professor, had written “UCLA” under his name. He was engaged in an animated discussion about the trials of running department meetings with 65 members, when I interrupted: "My department has only six."

There was a moment of surprised silence and astonished looks, after which he asked me, "And where would that be?"

I told him.  He smiled and reached into his pocket, pulled out a nylon-tip pen, and offered it to me, saying, “Would you care to write that on your badge?” It was gracious. Courteous. Chivalrous.

I accepted, borrowed his pen, and wrote, “The College of St. Elizabeth.”  

"I’m leading a rebellion,” he said with a twinkle in his eye. Against what? I had to wonder, as I returned his pen. The new custom? The whole game of signifying prestige?

So for the rest of the conference, I happily bore my coat of arms, for which I received a few strange looks. Was it for violating the custom of the castle? For daring to advertise such a lineage? For mocking a prerogative of the prestigious? It may, of course, have merely been for my sloppy handwriting. Perhaps I was a curiosity of sorts as I entered the lists, my visor up and my heraldry fully visible, armed with nothing but a fresh bag of tricks for Monday morning.

And yet, revealing my origin was mostly to my benefit. It led to conversations with other small Roman Catholic college teachers who wanted to compare notes. One priest struck up a conversation about a nun of the order that founded my college whose spiritual conferences he had read. Monks and nuns looked kindly upon me. And conversations tended to begin, “So what do you do at the College of St. Elizabeth?” or “Where exactly is that?”  

Folks largely behaved as if seeing my college name was normal -- because it is, I suppose.

Author/s: 
John Marlin
Author's email: 
info@insidehighered.com

John Marlin is a professor of English at the small but spirited College of St. Elizabeth, in Morristown, New Jersey. He teaches writing, journalism and literature -- from Aeschylus to Austen.
