
Critical Condition

One of the sadder comic novels I’ve ever read (and the qualities of humor and melancholia do tend to go together) is Wilfrid Sheed’s Max Jamison, which appeared in Britain in 1970 as The Critic. The title character is a prominent cultural journalist, and sometime university lecturer, who is at the peak of his career -- meaning it’s all downhill from there. And he knows it. He’s becoming a parody of himself. In fact, the process is more or less complete. He imagines one of his old professors saying, “Jamison has this rigid quality, sometimes known as integrity, sometimes known simply as ‘this rigid quality.’ ”

The novel is set during the high tide of the 1960s counterculture. But Jamison is much too cerebral to go hippy, even for a little while. He’s read too much, seen too many plays, made too much a religion of the Higher Seriousness to tune in, turn on, or drop out.

“He was in love with the way his mind worked,” the narrator says, “and he was sick of the way his mind worked. The first thing that struck you about it, wasn’t it, was the blinding clarity, like a Spanish town at high noon. No shade anywhere. Yet not altogether lacking in subtlety. Very fine filigree work in the church. This was the mind they were asking him to blow.”

By the end of the novel, Jamison carves out a niche in academe and gets bogged down while writing a book called The Fallacy of the Post-Modern. (That would have been pretty avant garde in 1970, not like now.) I’m told that it was once common knowledge, in certain circles anyway, that the book is based on the career of Richard Gilman, a professor of drama at Yale who died last year.

If Sheed's novel holds up remarkably well after almost four decades, though, it's not for any roman à clef revelations about a specific person. Max Jamison is the intimate portrait of a mind at the end of its tether -- a mind not quite willing (or able) to cut that tether, and so condemned to circle around and around, at whatever limit it can reach, thereby digging itself into a rut. This is not an uncommon situation.

Rereading the novel this week, I winced at one line in particular about Jamison’s routine as a cultural journalist: “He doggedly went on reviewing, getting better, he thought, in a field where improvement is seldom noticed.”

Sheed himself has written eloquent and sharp-eyed commentary on books. His instinct as a satirist is not savage; he feels compassion, and even some indulgence, for Jamison’s self-pity. But the man does know how to land a dart beneath the skin.

Well, perhaps criticism is “a field where improvement is seldom noticed” – but seldom doesn’t mean never.

Earlier this month, a party was held at a bookstore in downtown New York to announce the finalists for the National Book Critics Circle awards. The winners in each category will be honored when the final decisions are revealed at the awards ceremony on March 3. The event a couple of weeks ago was kind of a warm-up -- part press conference, part excuse for New York literary folk to drink and mingle.

The selection of finalists by the NBCC board -- narrowed down from a list of titles nominated by the organization’s 500 or so members -- sounds like a grueling process, so there was a lot of steam to blow off.

The evening was also the occasion for announcing this year’s winner of the Nona Balakian Citation for Excellence in Reviewing, named after a longtime editor at The New York Times Book Review. I received the award a few years ago (one of those rare moments when the tether seems to stretch a little bit). And so, for continuity’s sake, they asked if I would make the announcement. A few minutes before going to the microphone, I was handed a folded piece of paper that identified the winner as Steven G. Kellman. The name seemed vaguely familiar, but it took me a minute to place it.

“He’s in San Antonio?” I asked an NBCC board member. “A professor of literature?”

Yes, and yes. Small world! In the late 1980s, Kellman had been a contributor to a little magazine in Texas with which a mutual friend was involved. (I think I still owe them a manuscript.) Kellman has also co-edited Magill’s Literary Annual -- a useful work found in the reference section of any good university library. He recently published Redemption: The Life of Henry Roth (Norton, 2005), an acclaimed biography of the author of Call It Sleep, one of the classic American novels of immigrant life.

The full bibliography of Kellman’s work runs to appalling length. The list of his scholarly works alone would be impressive. Once you count his pieces for newspapers and magazines, the question of whether he can somehow write in his sleep does come up. As another Balakian winner who saw the list told me, “He’s reviewed more books than I’ve ever read.”

Since that announcement, I’ve spoken with Kellman by phone. We’ve also exchanged some e-mail. Perhaps I was expecting to interview the critic in Sheed’s novel -- someone glad to be honored, yet also a little tired of playing the game. But Kellman doesn’t sound that way. The man's energy level is alarming.

“Pauline Kael, who was honored by the NBCC with a lifetime achievement award,” he said, “once said that she considered herself a writer whose subject happened to be movies.... What hooked me on bookery was the exhilaration of slinging words on the page and making them prance. The impulse is the same whether I am writing an academic monograph whose print run is in the low four figures or a guest column for Newsweek.”

He calls it “dispiriting” that “so many of those who profess literature, who have dedicated their lives to discovering and sharing the delicacies and intricacies of verbal art, display dull indifference to their own use of language.” The result, he says, is usually prose as succulent as a bowl of mashed turnips.

I asked him what models he followed in creating a style distinct from the monographic monotone.

One was Edmund Wilson, “the patron saint of public intellectuals, before that portentous term needed to be coined.” Others included Irving Howe, Susan Sontag, and George Orwell.

All of them being the usual suspects, of course. But one figure he mentioned as an inspiration did stand out: Leslie Fiedler, who was professor emeritus of English at the State University of New York at Buffalo when he died, four years ago this month. Fiedler won the lifetime achievement award of the National Book Critics Circle in 1998. (Again, small world.)

In his prime, Fiedler was an intellectual wild-man -- a critic who began writing about gay multicultural subtexts in classic American literature as early as 1948, when most of his readers thought he must be joking; who focused on the images and archetypes found in both serious literature and pop culture, as if both were necessary to understanding the human condition; and who started publishing work in the field known as disability studies before there was a field known as disability studies.

Kellman co-edited a festschrift called Leslie Fiedler and American Culture, published by the University of Delaware Press in 1999. As it happens, when that book appeared, I attempted to persuade the culture editor of an American magazine to let me review it. “Oh no,” she said, “we don’t want that. I mean, isn’t Leslie Fiedler crazy?”

Sure he was -- like a fox. Fiedler is one of those authors you can turn to for energy when your own is flagging. Kellman told me he was “inspired and humbled by the stunning example” of Fiedler’s criticism. He wrote “with zest, and not just about love and death in the American novel, as if that were not enough, but also about Dante, Shakespeare, science fiction, Siamese twins, and much else.” 

He also seems to have absorbed Fiedler’s willingness to address a large audience, whenever the chance presented itself. Aside from his literary journalism, Kellman has written a newspaper column (for which he received the H.L. Mencken Award in 1986) and reviewed more than a thousand films. I asked if such extracurricular efforts had caused him any trouble -- disapproving noises from colleagues, worries for the condition of his soul, etc.

“It doesn’t really come up,” he said. “Sometimes it’s as if each side doesn’t know about the other, since people in journalism don’t seem to pay attention to my academic work, either. I’d like to think that my journalism is made sharper and stronger by the discipline I come from. It seems to me like I benefit from being in both worlds. But they don’t really meet.”

Maybe that is for the best. It means nobody is asking him to make a choice between academe and the public sphere.

“I think living exclusively in either one,” he said, “would start to feel claustrophobic.”

Scott McLemee

A Dean's View of the MLA Report

In one of his Meditations, Marcus Aurelius, the Roman emperor and Stoic philosopher, wrote, “Time is like a river made up of events which happen, and a violent stream; for as soon as a thing has been seen, it is carried away, and another comes in its place, and this will be carried away too.” This sense of being part of a time of incessant change animates the 2006 report of the MLA Task Force on Evaluating Scholarship for Tenure and Promotion. Begun in 2004, it is a rich, important document for anyone who wishes to reflect upon the contemporary rivers and streams of change in the academy.

I come to the report as a dean, specifically a graduate dean of arts and science in a large research university. Unlike Marcus Aurelius, I am no emperor. I find it a privilege to be a dean, even though the job has tempered my habitual optimism with stoicism. To oversimplify, the report treats the theme of change in the profession of modern languages and literature in three ways: the structural changes in United States higher education since World War II and their consequences for the humanities, especially for humanities faculty members; changes in the granting of tenure and promotion that people feared might happen but that seem not to have happened, at least not yet; and changes that ought to happen if the profession is to be wise, academically and socially useful, and robust.

Among the most important changes that the report explores is the well-documented rise of positions, full-time and part-time, that are off the tenure ladder. Tenure is increasingly limited to research universities and more affluent liberal arts colleges. Yet again, the rich are getting richer.  As a dean, I miss in the report a passionate yet logical definition and defense of tenure that I might use for several audiences --- the tuition-paying students who quickly turn to instant messaging in a class taught by a member of the Dead Wood Society, the trustees who wonder why academics should have job security when almost no one else does.  I can make such a defense, and have, but if tenure matters -- and an implicit conviction of the MLA task force is that it does -- then the defense must emanate from all of us who believe in it.

One pervasive anxiety explored in the report concerns a student’s life after the doctorate. The MLA report estimates that of every 100 English and foreign language doctoral recipients, 60 will be hired to tenure-track positions within 5 years. Of them, 38 will be considered for tenure at the institution where they were hired. Of them, 34 will be awarded tenure. The report, unfortunately, cannot say what happens to the 22 -- the 60 hired minus the 38 who come up for tenure -- who leave the institution where they were hired before the tenure ordeal. In my experience, some get recruited to another institution. Some drop out because they believe they will not get tenure. Some take administrative jobs within higher education, and are judged as administrators, but still do vital scholarship and teaching. Some go on to non-academic careers, for which graduate school in the humanities still insufficiently prepares them.

As a graduate dean, even as I wonder about the 22 doctoral recipients who leave the institution that first offered them a tenure-track job and even as I celebrate the 34 Ph.D.s who do earn tenure, I feel that now-well-honed guilt, anger, and concern about the 40 who are not hired to tenure-track positions within 5 years. To be sure, some deliberately and happily choose not to go on in academic life, but others would prefer to become academics. Despite all the national studies, including this report, about the oversupply of doctorates in the humanities, self-interested, faculty-controlled graduate programs are still too reluctant to limit admissions, still suspicious of regional coordination of graduate curricula and courses, and still petitioning for more financial aid and more students to teach. It is vulgar to call this a case of “Bring in the clones,” but the phenomenon yet again reveals, I have sadly concluded, how much easier it is to act on behalf of oneself and one’s family, here the department or program, than on behalf of more abstract and psychologically distant goods, here the well-being of potential graduate students and of the profession as a whole.

The MLA report’s signal contribution is the call, by an impeccable committee of leading humanists, for a serious rethinking of scholarship and scholarly inquiry, which would then have ramifications for the conduct of academic institutions. I can see nothing but good coming out of such a rethinking, to be undertaken both nationally and locally, faculty member by faculty member, department by department, and institution by institution, as each articulates its particular role in the academic and social landscape. These roles will and should differ. Each will be important. The royal road to national prominence can take a number of routes and be paved with a variety of materials --- from yellow bricks to high-tech composites.

More specifically, the MLA report urges us to ask why the monograph has become the pinnacle of scholarly achievement, “the gold standard.” Why not the essay, or a series of linked essays? Why not other forms of scholarly achievement? And why must the dissertation be a “proto-book”? Why indeed? Is there any other form that the dissertation might take? I once had a conversation with a leading Renaissance scholar shortly after I became a graduate dean. “What is the most important reform in graduate education?” I asked. “Change the dissertation,” she said. Surely what matters about the dissertation is less the exact format than a form that displays what this capstone activity must display: respect for past work coupled with originality, independence of thought, and the capacity for sustained inquiry. Rhetorical flair would be nice, too. I have also argued for some years that the humanities graduate curriculum needs a vigorous overhaul, offering more common courses that programs share, including some introductory courses that would constitute a general education for graduate students. Among them could be, at long last, a required course in the ethics and history of scholarship.

Moreover, because of new communications technologies, much scholarly inquiry is now being done digitally. Some of the most important work about and in digital scholarship is appearing from university presses, an invaluable resource that the task force correctly praises and for which it seeks more institutional resources. Yet many departments are clueless, all thumbs in the old-fashioned sense of the phrase, when it comes to evaluating digital scholarship in ways that respect peer review. Of the departments in doctorate-granting institutions that responded to the MLA’s survey, 40.8 percent report no experience evaluating refereed articles in electronic format, and 65.7 percent have no experience evaluating monographs in electronic format. This finding is similar to that of another useful study, of five departments, including English-language literature, at the University of California at Berkeley. It concludes that what matters most in judging scholarship is peer review, but e-publishing is still tainted because peer review does not seem to have touched it sufficiently. Scholars are willing to experiment with digital communications. However, for nearly all, the “final, archival publication” must still appear in a traditional format. Only if faculty values change, the Berkeley report correctly suggests, will scholarly communications change. Deans may propose, but faculty actually dispose in questions of academic and curricular values.

The MLA report rightly argues that the academy tightly couples the canons of scholarly accomplishment with the awarding of tenure and promotion. In brief, a faculty member gets the latter if s/he respects the former. Even as the report asks for a re-evaluation of these canons, it offers a series of recommendations for administering a transparent, fair tenure and promotion process. For the most part, these are sensible, and indeed, I was surprised that they are not already installed as best practices at most institutions. Of course, if possible, institutions should give junior faculty members start-up packages if they are to require research and publication. Of course, “collegiality” should not be an explicit criterion for tenure, because it might reward the good child and punish the upstart. However, a dean cautions, because tenure is forever, at least on the part of the institution, it is legitimate to ask how a candidate will contribute to the institution’s long-term well-being.

From this admonitory dean’s perspective, the report strays into boggy ground in its brief analysis of appropriate relations between someone up for tenure and the external letters that a tenure dossier now requires. “Candidates,” it states, “should have the privilege and the responsibility of naming some of their potential reviewers (we recommend half).” Candidates, the report further argues, should be able to exclude one or two figures whom they believe might be prejudiced against them. This is a really bad idea. If tenure candidates were to have this power, the dispassionate and collective objectivity that is the putative value of peer review would be lost, and self-interest would fill the vacuum. Moreover, the temptation of cronyism, which external letters were meant to squash but which still flourishes among tenured faculty, might appear in a junior guise, accompanied by various modes of ingratiation with the powerful in a field who might then write a sweetly affirming letter.

Strangely, sensitive though the MLA report is to the growth in the number of non-tenure track jobs, and to the meaning of this growth, it is less radical than it might be in imagining the role of full-time, non-tenured scholars within an institution. The report argues, “The dramatic increase in the number of part-time non-tenure-track faculty members puts increased demands and pressure on all full-time tenure-track and tenured faculty members in many areas for which the casualized work force is not -- and should not be -- responsible: service on department committees and in departmental governance; student advising; teaching upper-level undergraduate and graduate courses; directing dissertations; and, less concretely but no less importantly, contributing to intellectual community building in the department and outside it, in the college and university….” But surely a qualified non-tenured faculty member should be able to be a significant academic citizen. Surely the report does not mean to construct such a hierarchy of faculty members with the tenure-track faculty as the philosopher kings and queens and the non-tenure-track professors as credentialed drones. If the report had more fully defined and defended tenure, it might have explored more adequately the distinctions and the overlap between not having and having tenure.

Let me not end with caviling and quibbling, but instead reiterate my respect for the conviction expressed by the task force about the profession’s relation to change.  It concludes, “It is up to us, then, the teacher-scholars of the MLA, to become agents in our academic systems and effect changes that reflect and instantiate appropriate standards of scholarly production and equity and transparency for our colleagues, our institutions, and our society.” Or, if a mere dean might revise the language of both a strong committee and an emperor, we neither helplessly observe nor flaccidly drift in the rivers of time. We shape their banks. We dam them or divert them or find new springs with which to refresh them. We build our rafts of thought and boats of words and navigate them. Bon voyage to us all.

Catharine R. Stimpson

Catharine R. Stimpson is dean of the Graduate School of Arts and Science at New York University and a past president of the Modern Language Association.

Remember Baudrillard

A few days ago, I tried the thought experiment of pretending never to have read anything by Jean Baudrillard – instead trying to form an impression based only on media coverage following his death last week. And there was a lot more of it than I might have expected. The gist being that, to begin with, he was a major postmodernist thinker. Everyone agrees about that much, usually without attempting to define the term, which is probably for the best. It also seems that he invented virtual reality, or at least predicted it. He may have had something to do with YouTube as well, though his role in that regard is more ambiguous. But the really important thing is that he inspired the "Matrix" movie franchise.

A segment on National Public Radio included a short clip from the soundtrack in which Laurence Fishburne’s character Morpheus intones the Baudrillard catchphrase, “Welcome to the desert of the real.” The cover of Simulacra and Simulation -- in some ways his quintessential theoretical text, first published in a complete English translation by the University of Michigan in 1994 -- is shown in the first film. Furthermore, the Wachowski brothers, who wrote and directed the trilogy, made the book required reading for all the actors, including Keanu Reeves. (It is tempting to make a joke at this point, but we will all be better people for it if I don’t.)

There was more to Baudrillard than his role as the Marshall McLuhan of cyberculture. And yet I can’t really blame harried reporters for emphasizing the most blockbuster-ish dimensions of his influence. "The Matrix" was entertainment, not an educational filmstrip, and Baudrillard himself said that its take on his work “stemmed mostly from misunderstandings.” But its computer-generated imagery and narrative convolutions actually did a pretty decent job of conveying the feel, if not the argument, of Baudrillard’s work.

As he put it in an essay included in The Illusion of the End (Stanford University Press, 1994): “The acceleration of modernity, of technology, events and media, of all exchanges – economic, political, sexual – has propelled us to ‘escape velocity,’ with the result that we have flown free of the referential sphere of the real and of history.” You used to need digitalized special effects to project that notion. But I get the feeling of being “flown free of the referential sphere of the real and of history” a lot nowadays, especially while watching certain cable news programs.

Some of the coverage of Baudrillard’s death was baffled but vaguely respectful. Other commentary has been more hostile – though not always that much more deeply informed. A case in point would be an article by Canadian pundit Robert Fulford that appeared in The National Post on Saturday. A lazy diatribe, it feels like something kept in a drawer for the occasion of any French thinker’s death – with a few spots left blank, for details to be filled in per Google.

A tip-off to the generic nature of the piece is the line: “Strange as it seems, in the 1970s much of the Western world was ready to embrace him.” Here, Fulford can count on the prefab implication of a reference to that decade as a time of New Left-over radicalism and  countercultural indulgence. In fact Baudrillard was little known outside France until the 1980s, and even then he had a very small audience until late in the decade. The strong mood coming from most of Baudrillard’s work is that of bitter disappointment that oppositional social movements of earlier years had been neutralized – absorbed into academic bureaucracy and consumer society, with no reason to think that they would revive.

And if we are going to play the game of periodization-by-decade, well, it is perhaps worth mentioning that “much of the Western world was ready to embrace him" only after several years of watching Ronald Reagan -- a man whose anecdotes routinely confused his roles in motion pictures with actual experiences from his own life -- in a position of great power. The distinction between reality and simulation had been worn away quite a bit, by that point. Some of Baudrillard’s crazier flights of rhetoric were starting to sound more and more like apt descriptions of the actual.

Even then, it was by no means a matter of his work persuading university professors “that novels and poems had become irrelevant as subject matter for teaching and research,” as the macro setting for culture-war boilerplate on Fulford’s computer puts it.

Enthusiasm for Baudrillard’s work initially came from artists, writers, and sundry ne’er-do-wells in the cultural underground. The post-apocalyptic tone of his sentences, the science-fictionish quality of his concepts, resonated in ways that at least some people found creatively stimulating, whether or not they grasped his theories. (True confession: While still in my teens, I started writing a novel that opened with an epigraph from one of his books, simply because it sounded cool.)

Baudrillard’s work played no role whatever in the debates over “the canon” to which Fulford alludes. But he was, in a different sense, the most literary of theorists. He translated Bertolt Brecht, among other German authors, into French. Some of his earliest writings were critical articles on the fiction of William Styron and Italo Calvino. In 1978, he published a volume of poems. And a large portion of his output clearly belongs to the literary tradition of the aphorism and the “fragment” (not an unfinished work, but a very dense and compact form of essay). These are things you notice if you actually read Baudrillard, rather than striking po-faced postures of concern about how literature should be “subject matter for teaching and research.”

Besides, it is simply untrue to say that Baudrillard’s reception among American academics was one of uncritical adulation. If there was a protracted lag between the appearance of his first books in the 1960s and the dawn of interest in his work among scholars here in the 1980s, that was not simply a matter of the delay in translation. For one thing, it was hard to know what to make of Baudrillard, and a lot of the initial reception was quite skeptical.

In the mid-1960s, he became a professor of sociology at the University of Paris at Nanterre, but the relationship of his work to the canon of social theory (let alone empirical research) is quite oblique. It’s also difficult to fit him into the history of philosophy as a discipline. Some of his work sounds like Marxist cultural theory, such as the material recently translated in Utopia Deferred: Writings for ‘Utopie’ 1967-1978 -- a collection distributed by MIT Press, a publisher known, not so coincidentally, for its books on avant-garde art. Still, there is plenty in Baudrillard’s work to irritate any Marxist (he grew profoundly cynical about the idea of social change, let alone socialism). And he delighted in baiting feminists with statements equating femininity with appearance, falsehood, and seduction.

Baudrillard was, in short, a provocateur. After a while that was all he was – or so it seemed to me, anyway. The rage of indignant editorialists notwithstanding, a lot of the response to Baudrillardisme amounted to treating him as a stimulating but dubious thinker: not so much a theorist as a prose-poet. A balanced and well-informed critical assessment of his work comes from Douglas Kellner, a professor of philosophy at UCLA, who wrote Jean Baudrillard: From Marxism to Postmodernism and Beyond (Stanford University Press, 1989), the first critical book on him in English. Kellner has provided me with the manuscript of a forthcoming essay on Baudrillard, which I quote here with permission.

“So far,” he writes, “no Baudrillardian school has emerged. His influence has been largely at the margins of a diverse number of disciplines ranging from social theory to philosophy to art history, thus it is difficult to gauge his impact on the mainstream of philosophy, or any specific academic discipline.”

At this point I’d interject that his questionable position within the disciplinary matrix (so to speak) tends to reinforce Baudrillard’s status as a minor literary figure, rather than an academic superstar. Kellner goes on to note that Baudrillard “ultimately goes beyond conventional philosophy and theory altogether into a new sphere and mode of writing that provides occasionally biting insights into contemporary social phenomena and provocative critiques of contemporary and classical thought. Yet he now appears in retrospect as a completely idiosyncratic thinker who went his own way....”

Not that Baudrillard exactly suffered for going his own way, however. A self-portrait of the postmodern intellectual as global jet-setter emerges in the five volumes of his notebook jottings published under the title “Cool Memories.” You get the sense that he spent a lot of time catching planes to far-flung speaking engagements – not to mention seeing various unnamed women out the door, once they had been given a practicum in the theories worked out in his book De la Séduction.

Many of the writings that appeared during the last two decades of his life simply recycled ideas from his early work. But celebrity is a full-time job.

One offer he did turn down was the chance to do a cameo in one of the Matrix sequels. (Instead, it was Cornel West who did his star turn onscreen as a gnomic philosophical figure.) Still, the appearance of Simulacra and Simulation in the first film greatly increased the book’s distribution, if not comprehension of its themes.

According to Mike Kehoe, the sales manager for the University of Michigan Press, which published the English translation, sales doubled in the year following “The Matrix.” The book had often been assigned in university courses. But those sales, too, jumped following the release of the film.

Rather than indulging my own half-baked quasi-Baudrillardian speculations about how his theories of media metaphysics were reabsorbed by the culture industry, I decided to bring the week’s musings to a close by finding out more about how the book itself ended up on screen.

“It wasn’t the usual sort of product placement,” LeAnn Fields, a senior executive editor for the press, told me by phone. “That is, we didn’t pay them. It was the other way around. The movie makers contacted us for permission. But they reserved the right to redesign the cover for it when it appeared onscreen.”

The familiar Michigan edition is a paperback with burgundy letters on a mostly white cover. “But in the film,” said Fields, “it became a dark green hardcover book. We were quite surprised by that, but I guess it’s understandable since it serves as a prop and a plot device, as much as anything.” (If memory serves, some kind of cyber-gizmo is concealed in it by Keanu Reeves.)

I asked Fields if the press had considered bringing out a special version of the book, simulating its simulation in a deluxe hardback edition. “No,” she said with a laugh, “I don’t think we ever considered that. Maybe we should have, though.”

Recommended Reading: Mark Poster's edition of Baudrillard's Selected Writings, originally published by Stanford University Press in 1988, is now available as a PDF document. The single best short overview of Baudrillard's work is Douglas Kellner's entry on him for the Stanford Encyclopedia of Philosophy. There is an International Journal of Baudrillard Studies that publishes both commentary on his work and translations of some of his shorter recent writings.

Scott McLemee

Beatles vs. Stones

This morning I received an e-mail from a new colleague of mine about some workshop topics on writing. I met her last week at the massive Conference on College Composition and Communication (CCCC), in New York City. We’re both new members of the executive board of the Assembly for Expanded Perspectives on Learning (AEPL), a National Council of Teachers of English (NCTE) affiliate organization that is interested in promoting teaching and learning beyond traditional disciplines and methodologies.

She’s the recently elected associate chair trying to brainstorm some ideas for upcoming workshops and conferences. I’m the new treasurer trying to get my Excel columns to add up right.

At our meeting in the escalatored bowels of the Manhattan Hilton, the board agreed that the 2008 workshop would be titled “The Rhetorical Art of Reflection,” but in her e-mail today to me and the other board members, she suggested that the 2009 workshop might be on a topic related to the connections between music and writing.

This e-mail popped up as I was sitting here at my laptop in my university office  listening to Van Morrison's album “Tupelo Honey” and writing copy for a Web site for our recently approved general education program and curriculum.

I wrote back to her wondering if anyone else like me had this kind of continual digital soundtrack running through their media players while tapping along on their keyboards and wristing red laser mouse pods. I thought it would be interesting to find out what other folks listen to when they write, headphoned or not. I also recommended a new income-generating idea for our little AEPL assemblage: a CD collection of greatest hits for writing, recommended by the usual galaxy of comp/rhet stars. Hey, Peter Elbow! What are you listening to? Cheryl Glenn? Raul Sanchez?

My preferences for writing, of course, are situational, just like they should be for any good rhetorician. As I’m writing this essay, I’m listening to “Ethiopiques, Vol. 4: Ethio Jazz & Musique Instrumentale, 1969-1974” by musician-arranger Mulatu Astatqe. My daughter sent it to me last year, and I ripped it immediately into my playlists. Other writing favorites in jazz include “Consummation” by the Thad Jones & Mel Lewis Orchestra, passed on to me by my neighbor Bill, Lionel Hampton’s “Mostly Ballads” and “Mostly Blues,” and some other favorites from the early ’70s: Keith Jarrett’s “The Köln Concert,” and “The Colours of Chloë” by Eberhard Weber.

Here at my desk with the tangle of wires running from the scanner, printer, PDA cradle, and leftover Gateway 2000 speakers, I start off the day usually with something to get the blood moving, like Los Pregoneros Del Puerto and their traditional music of Veracruz, Paco de Lucia’s “Anthologia Vol. 1,” or that dobro-infused live double play by Alison Krauss and Union Station.

Or if I’m particularly stressed out and need to write and relax, I click on “Union” or “Devotion” by Rasa, R. Carlos Nakai’s “Cycles. Vol. 2,” or Clannad’s “Landmarks.”

But if I’m just chugging along during the day, I go to the old faithfuls: the soundtrack from Ken Burns’ “Lewis and Clark: The Journey of the Corps of Discovery,” Mary Chapin Carpenter’s “Stones in the Road,” Dylan’s “Blood on the Tracks,” some Puccini or Neil Young’s “Comes a Time.”

Given the slice-and-dice, randomized nature of iTunes and Napster, I realize that speaking of music in terms of albums is very old school, but the extended play of 50 to 60 minutes of tune after tune fits my writing rhythm pretty well. Once a playlist is over, I know it’s time to take a break, push away from my desk, stand up and lean back to stretch out my stiff back, wander out into the hallway of that other world, or walk downstairs and check my campus mailbox to see what junk I can toss into the recycling bins nearby.

When I was a longhaired college kid, I had Crosby, Stills, and Nash, Marvin Gaye, Cat Stevens, and Joni Mitchell in pretty much constant rotation on my scratchy stereo, one skewered vinyl dropping down on the next until it was time to flip the stack over again. In those days, I was listening for lyrics and rhyme as much as anything, thinking I was a writer in the company of writers who also happen to play music. These days I’m listening for melody and rhythm as much as anything, thinking I’m a writer in the company of musicians who also happen to keep me writing.

I guess I don’t know if a workshop on music and writing is such a good idea after all. Right now I’m thinking it would be just about as useful as any other workshop on the preferences folks have about writing: pencil vs. pen, medium vs. fine tip, black vs. blue, laptop vs. desktop, blank pad vs. college-ruled vs. yellow legal pad, at the desk vs. in bed, PC vs. Mac, Bach vs. Mozart. Seems all too personal, finicky, and idiosyncratic to me. Kind of like writing, if you know what I mean.

Laurence Musgrove

Laurence Musgrove is an associate professor of English and foreign languages at Saint Xavier University, in Chicago.

Hard Wordes in Plaine English

Longtime readers of Intellectual Affairs may recall that this column occasionally indulges in reference-book nerdery. So it was a pleasant but appropriate surprise when the Bodleian Library of the University of Oxford  provided a copy of its new edition of the very first dictionary of the English language. It has been out of print for almost 400 years, and the Bodleian is now home to the one known copy of it to have survived.

Available now as The First English Dictionary, 1604 (distributed by the University of Chicago Press), the work was originally published under the title A Table Alphabeticall. It was compiled in the late 16th century by one Robert Cawdrey. The book did not bring him fame or fortune, but it went through at least two revised editions within a decade. That suggests there must have been a market for Cawdrey’s guide to what the title page called the “hard usuall English wordes” that readers sometimes encountered “in Scripture, Sermons, or elswhere.”

Cawdrey had the misfortune, unlike fellow lexicographer Samuel Johnson, of never meeting his Boswell. Yet he had an eventful career – enough to allow for a small field of Cawdrey studies. An interesting introduction by John Simpson, the chief editor of the Oxford English Dictionary, sums up what is known about Cawdrey and suggests ways in which his dictionary may contain echoes of his life and times.

At the risk of being overly present-minded, there’s a sense in which Cawdrey was a pioneer in dealing with the effects of his era’s information explosion. Thanks to the printing press, the English language was undergoing a kind of mutation in the 16th century.

New words began to circulate in the uncharted zone between common usage and the cosmopolitan lingo of sophisticated urbanites who traveled widely. Learned gentlemen were traveling to France and Italy and coming back “to powder their talk with over-sea language,” as Cawdrey noted. Some kinds of “academicke” language (glossed by Cawdrey as “of the sect of wise and learned men”) were gaining wider usage. And readers were encountering unfamiliar words like “crocodile” and “akekorn.” Cawdrey’s terse definitions of them as “beast” and “fruit,” respectively, suggest he probably had seen neither.

Booksellers had offered lexicons of ancient and foreign languages. And there were handbooks explaining the meaning of specialized jargon, such as that used by lawyers. But it was Cawdrey’s bright idea that you might need to be able to translate new-fangled English into a more familiar set of “plaine English words.”

Cawdrey also found himself in the position of needing to explain his operating system. “To profit by this Table,” as he informed the “gentle Reader” in a note, “thou must learn the Alphabet, to wit, the order of the Letters as they stand....and where every Letter standeth.” Furthermore, you really needed to have it down cold. A word beginning with the letters “ca,” he noted, would appear earlier than one starting with “cu.” After using the “Table” for a while, you probably got the hang of it.

Who was this orderly innovator? Cawdrey, born in the middle of England sometime in the final years of Henry VIII, seems not to have attended Oxford or Cambridge. But he was learned enough to teach and to preach, and came to enjoy the patronage of a minister to Queen Elizabeth. He married, and raised a brood of eight children. In a preface to the dictionary, Cawdrey acknowledges the assistance of “my sonne Thomas, who now is Schoolmaister in London.”

Cawdrey published volumes on religious instruction and on the proper way to run a household so that each person knew his or her proper place. He also compiled “A Treasurie or store-house of similies both pleasant, delightfull, and profitable, for all estates of men in generall.” (Such verbosity was quite typical of book titles at the time. The full title page for his dictionary runs to about two paragraphs.)

Whatever his chances for mobility and modest renown within the Elizabethan intelligentsia, however, they were severely limited, given his strong religious convictions. For Cawdrey was a Puritan – that is, someone convinced that too many of the old Roman Catholic ways still clung to the Church of England.

Curious whether "Puritan" (a neologism with controversial overtones) appeared in dictionary, I looked it up. It isn’t there. But Cawrey does have “purifie,” meaning “purge, scoure, or make cleane” -- which is soon followed by “putrifie, to waxe rotten, or corrupted as a sore.” By the 1580s, Cawdrey had both words very much in mind when he spoke from the pulpit. When he was called before church authorities, one of the complaints was that he had given a sermon in which he had “depraved the Book of Common Prayer, saying, That the same was a Vile Book and Fy upon it.” He was stripped of his position as minister.

But Cawdrey did not give up without a fight. He appealed the sentence, making almost two dozen trips to London to argue that it was invalid under church law. All to no avail. He ignored hints from well-placed friends that he might get his job back by at least seeming to go along with the authorities on some  points. For that matter, he continued to sign his letters as if he were the legitimate pastor of his town.

No doubt Cawdrey retained a following within the Puritan underground, but he presumably had to go back to teaching to earn a living. Details about his final years are few. It isn’t even clear when Cawdrey died. He would have been approaching 70 when his dictionary appeared, and references in reprints of his books a few years later imply that they were revised posthumously.

In his introductory essay, John Simpson points out that the OED now lists 60,000 words that are known to have been in use in English around the year 1600. Cawdrey defines about 2,500 of them. “We should probably assume that he was unable to include as many words as he would have liked,” writes Simpson, “in order to keep his book within bounds. It was, after all, an exploratory venture.”

But that makes the selection all the more interesting. It gives you a notion of what counted as a “hard word” at the time. Most of them are familiar now from ordinary usage, though not always in quite the sense that Cawdrey indicates. He gives the meaning of “decision” as “cutting away,” for example. Tones of the preacher can be heard in his slightly puzzling definition of “curiositie” as “picked diligence, greater carefulnes, then is seemly or necessarie.”

Given his Puritan leanings, it is interesting to see that the word “libertine” has no specifically erotic overtones for Cawdrey. He defines the word as applying to those “loose in religion, one that thinks he may doe as he listeth.” One of the longest entries is for “incest,” explained as “unlawfull copulation of man and woman within the degrees of kinred, or alliance, forbidden by Gods law, whether it be in marriage or otherwise.”

It is a commonplace of much recent scholarship that, prior to the mania for categorizing varieties of sexual desire that emerged in the 19th century, the word “sodomy” covered a wide range of non-procreative acts, heterosexual as well as homosexual. Cawdrey, it seems, didn’t get the memo. He defines “sodomitrie” as “when one man lyeth filthylie with another man.” Conversely, and rather more puzzling, is his definition of “buggerie” (which one might assume to be a slang term for a rather specific act) as “conjunction with one of the same kinde, or of men with beasts.”

In a few entries, one detects references to Cawdrey’s drawn-out legal struggle of the 1580s and ’90s. He explains that a “rejoinder” is “a thing added afterwards, or is when the defendant maketh answere to the replication of the plaintife.” So a rejoinder is a response, perhaps, to “sophistikation,” which Cawdrey defines as “a cavilling, deceitful speech.”

Especially pointed and poignant is the entry for “temporise,” meaning “to serve the time, or to follow the fashions and behaviour of the time.” Say what you will about Puritan crankiness, but Robert Cawdrey did not “temporise.”

Particularly interesting to note are entries hinting at how the “new information infrastructure” (circa 1600) was affecting language. The expense of producing and distributing literature was going down. “Literature,” by the way, is defined by Cawdrey here as “learning.” Cawdrey includes a bit of scholarly jargon, “abstract,” which he explains means “drawne away from another: a litbooke or volume prepared out of a greater.”

Some of the words starting to drift into the ken of ordinary readers were derived from Greek, such as “democracie, a common-wealth gouerned by the people” and “monopolie, a license that none shall buy and sell a thing, but one alone.” Likewise with terms from the learned art of rhetoric such as “metaphor,” defined as "similitude, or the putting over of a word from his proper and naturall signification, to a forraine or unproper signification.”

Cawdrey’s opening address “To the Reader” is a manifesto for the Puritan plain style. Anyone seeking “to speak publiquely before the ignorant people,” he insists, should “bee admonished that they never affect any strange inkhorne termes, but labour to speake so as is commonly received, and so as the most ignorant may well understand them.”

At the same time, some of the fancier words were catching on. The purpose of the dictionary was to fill in the gap between language that “Ladies, Gentlewomen, or any other unskilfull persons” might encounter in their reading and what they could readily understand. (At this point, one would certainly like to know whether Cawdrey taught his own three daughters how to read.) Apart from its importance to the history of lexicography, this pioneering reference work remains interesting as an early effort to strike a balance between innovation and accessibility in language use.

“Some men seek so far for outlandish English,” the old Puritan divine complains, “that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say.” Oh Robert Cawdrey, that thou shouldst be alive at this hour!

Scott McLemee

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

In Praise of Small Conferences

This last fall I attended the 2006 TYCA-West conference. It was held in beautiful Park City, Utah, in October. About 60 two-year college English faculty, graduate students, and even some university professors gathered to discuss the study and practice of teaching English. It’s hard to imagine a more beautiful setting than Park City in the fall. The crisp mountain air and the burnt orange and red scrub oak painting the surrounding mountains lend a….

What? You mean to tell me that you’ve never heard of TYCA-West? TYCA is the Two-Year College English Association, a group within the National Council of Teachers of English. It comprises seven regions, and each region holds an annual conference. TYCA-West is the regional organization that includes Utah, Idaho, Nevada, Arizona, Wyoming, and (who would have guessed) Hawaii.

Let’s be honest: as far as conferences go, it’s difficult to imagine a less prestigious conference than a regional two-year college English conference. You aren’t likely to rub shoulders with star scholars in the field. Nor will you encounter presentations that will help you sort through the talked-about new book or intellectual movement of the year. For that, go to MLA. I’m not against the big conference. But I’ve come to appreciate the strengths of the small conference, and for professors dedicated to teaching, regional conferences may in fact be more valuable and more rewarding than higher-profile conferences.

At the TYCA-West conference, we tend to focus on practical issues associated with teaching English. This last year, our keynote address was by Sharon Mitchler, the past TYCA-National chair. She addressed the larger economic and demographic trends associated with teaching English in the two-year college. I learned, for instance, that two-year colleges “teach an estimated 50 percent of all college-level composition and 70 percent of all developmental composition courses,” and I learned that “college participation rates among low-income students peaked in 1998 and have been falling since then.” Mitchler’s presentation had a refreshingly empirical cast, something I’m not accustomed to at humanities conferences. But she effectively embedded those facts within a larger argument about how these trends will ultimately determine what we do in the classroom, whether we realize it or not.

At the 2005 TYCA-West conference, we were treated to an excellent presentation by Kathleen Blake Yancey on the changing nature of literacy. It was followed by an engaging and pleasingly cant-free discussion about what we’re currently experiencing in the classroom. Many of the challenges associated with teaching writing persist. Instructors shared stories about how difficult it is to get students to become critical readers and writers. Many instructors, however, pointed to newer trends in writing instruction, like service learning, which offer students more authentic scenarios of composition.

I’ve also formed lasting friendships at TYCA-West.  Since becoming involved in TYCA-West, I now know and correspond with faculty members from each of the states within my region, from places like Yavapai College of Arizona, Community College of Southern Nevada, Western Wyoming Community College, and Dixie State College. We share an identity as two-year college English faculty, joined in a common enterprise. As faculty members who share similar economic and demographic challenges, we have also formed a regional identity, something not typically encouraged by the larger conferences. I feel like I have developed an authentic network through my experience at TYCA-West. From Jeff Andelora who teaches at Mesa Community College, I’ve learned about the history of community college English. From Bradley Waltman at the Community College of Southern Nevada, I’ve learned about the challenges associated with placing students in writing courses.

Here’s what you won’t find at TYCA-West or most other smaller, regional conferences. You won’t be subjected to the name-badge-glance-and-turn, a move I’ve always for some reason viewed as akin to a basketball player’s expert pivot. (If only the Utah Jazz center could pivot like that.) Instead, you will encounter colleagues at peer institutions genuinely interested to meet you and hear what you have to say.

Neither will you attend presentations obviously constructed solely as CV fodder. No counterintuitive readings of canonical texts that strain credulity. No impotent counter-hegemonic posturing. Presentations tilt toward the practical rather than the theoretical. Believe it or not, two-year college English professors are interested in theory, but we typically put it in the service of practical considerations. In my experience, you are more likely to hear what Joseph Williams called the “So what?” question at smaller conferences. Taken together, the presentations at our TYCA conferences soberly address the perennial challenge of how we get our students to become more effective writers and readers.

Finally, regional conferences are cheap. I briefly considered attending this year’s Conference on College Composition and Communication in New York City. But rooms at the conference hotel are $300 a night and the flight would have cost me about $500 round trip. The total cost of the conference would have easily exceeded $1,500 and, though I am lucky enough to get support from my college to attend conferences, I decided that it just wasn’t worth it. For those faculty members who receive little or no support from their institutions, this year’s 4Cs conference is probably out of reach.

In contrast, let me present, Thoreau-like, the costs of attending the 2005 TYCA-West conference in Prescott, Arizona:

  • Travel $230 (round trip to Phoenix plus a shuttle to and from Prescott).
  • Hotel (shared room with a colleague) $75.
  • Conference Registration $140 (included breakfast and lunch on both days of the conference).
  • Food $75 (including a beer and scrumptious burger at The Saloon, which has a wall-sized painting of Steve McQueen worth seeing).

For around $500 I enjoyed a conference where I connected with professors from the region, went to Prescott for the first time -- a beautiful little college town in the mountains northwest of Phoenix -- and learned a little more about how to become a more effective English teacher. The 2006 TYCA-West conference in Park City was a 30-minute drive from my house.

Regional organizations can languish, though. Anyone who has been involved in the organization and promotion of a regional conference can tell you that it’s sometimes difficult to generate interest and attendance. Because the large, national conferences exert such a big influence over the discipline, it is often a challenge to persuade professors that small conferences are worth their time. After all, what will a presentation at TYCA-West do for your CV? But I am excited about next year’s TYCA-West conference in Las Vegas. (I suggested we adopt the line, “What happens at TYCA-West stays at TYCA-West,” in order to generate greater participation.)

Large conferences will always be important, and I still plan on attending them. But the academic work done by many college professors happens primarily in the classroom. The small conference provides an ideal forum for them to share this important work.

Jason Pickavance

Jason Pickavance is an instructor in the English department at Salt Lake Community College, where he teaches courses in writing and American literature.

The Eternal Sophomore

“It has been my experience with literary critics and academics in this country,” wrote Kurt Vonnegut in an essay published in 1981, “that clarity looks a lot like laziness and childishness and cheapness to them. Any idea which can be grasped immediately is for them, by definition, something they knew all the time. So it is with literary experimentation, too. If a literary experiment works like a dream, is easy to read and enjoy, the experimenter is a hack. The only way to get full credit as a fearless experimenter is to fail and fail.”

The anger in that statement had been building up for at least a couple of decades. Much of Vonnegut’s early work was classified as science fiction – a filing-cabinet drawer that, as he once put it, academics tended to confuse with a urinal. He was later discovered by people who didn't read science fiction, and most of his books stayed in print. But that just meant he had failed to fail, so the charge of being a hack was still in the air.

In some respects, though, his complaints were already out of date when he made them; for by the early 1980s, there was already a scholarly industry in Vonnegut criticism. It now runs to some three dozen books, not to mention more journal articles than anyone would want to count.

During the original wave of speculation on postmodernism during the 1960s and early ‘70s – when that notion was relatively untheoretical, a label applied to emergent literary tendencies more than the name for some vast cultural problematic – it was very often the work of Kurt Vonnegut that people had in mind as an exemplary instance. Parataxis, metafiction, blurring of the distinction between mass-culture genres and modernistic formal experimentation -- all of this, you found in Vonnegut. His novels were chemically pure samples of the postmodern condition.

And then came the definitive moment documenting Vonnegut’s place in the literary curriculum: the film "Back to School" (1986), in which the author had a cameo role.

In that landmark work, as you may recall, Rodney Dangerfield played Thornton Melon, a millionaire who returns to college for the educational opportunities involved in partying with coeds in bikinis. When an English professor assigns a paper on Vonnegut’s fiction, Dangerfield hires the novelist himself to write the analysis. The paper receives a failing grade. (Someone in Hollywood must be a fan of Northrop Frye, who once said that whatever else one might say about Wordsworth’s preface to the Lyrical Ballads, as a piece of Wordsworth criticism it only merited a B plus.)

Given such clear evidence of canonization, it was a surprise to notice that a couple of friends responded to the news of Vonnegut’s death last week with slightly embarrassed sadness. Both are graduate students in the humanities. One called his novels a “guilty pleasure.” Another mentioned how much Vonnegut’s work had meant to him “even if he’s not considered that great or serious a writer.”

I suspect that such feelings about Vonnegut are pretty widespread -- that the shelves of secondary literature don’t really quell a certain ambivalence among readers who feel both deep affection for his work and a keen nervousness about his cultural status. Unfortunately, Vonnegut did not make things any easier by publishing so many novels that devolved into self-parody. If he had quit after Cat’s Cradle and Slaughterhouse Five, the ratio of wheat to chaff in his fiction would be much more favorable.

But the ambivalence itself is not, I think, a response to the uneven quality of his work -- nor even the product of some misguided notion that a funny author can’t be taken seriously. Rather, the problem may be that Vonnegut is an author one tends to discover in adolescence. Defensiveness about the attachment one feels to his work is, in part, a matter of wanting to protect the part of oneself that seemed to come into being upon first reading him. “I deal with sophomoric questions that full adults regard as settled,” he told an interviewer once.

He had, for example, a large capacity for facing brute contingency as part of human existence. A great deal of life is chance. (The fact that you were born, for example. Think how arbitrary that is.) And much of the rest of life consists of learning to evade that truth – walling it off, away from consciousness, because otherwise the reality of it would be too hard to fathom. Instead, we throw ourselves into fictions of power and belonging: nationalism, militarism, religion, the acquisition of cool stuff. These are ways to contain both the vulnerability before chance and the terrors of loneliness. In Vonnegut’s understanding of the world, loneliness is a fundamental part of human experience that became much, much worse in the United States, somehow, during the second half of the twentieth century – with no particular reason to think it will get better anytime soon.

As contributions to the cultural history of mankind, such thoughts are pretty small beer. On the other hand, just try to escape their implications. To call a point simple is the cheapest and least effective means of gainsaying it.

On Monday, at about the time I sat writing that paragraph about chance and terror and helplessness, someone was walking around a university campus shooting people at random. This was a coincidence. It was chance. That thought is no comfort. As one of the Tralfamadorians says in Slaughterhouse Five: “Well, here we are, Mr. Pilgrim, trapped in the amber of this moment. There is no why.” So it goes.

Vonnegut (who once called himself “a Christ-worshiping agnostic”) drew from the ground truth of existential terror a moral conclusion that it made sense to try to love your neighbor as yourself – or at least to treat other people with radical decency. This sounds simplistic until you actually try doing it.

He was a socialist in the old Midwestern tradition best expressed in a famous statement by Eugene Debs that went: "Years ago I recognized my kinship with all living beings, and I made up my mind that I was not one bit better than the meanest on earth. I said then, and I say now, that while there is a lower class, I am in it, and while there is a criminal element I am of it, and while there is a soul in prison, I am not free." Quoting that was about as close to a theoretical statement as Vonnegut ever got. The rest of his outlook he regarded as common sense.

“Everything I believe,” he said, “I was taught in junior civics during the Great Depression – at School 43 in Indianapolis, with full approval of the school board. School 43 wasn’t a radical school. America was an idealistic, pacifistic nation at that time. I was taught in the sixth grade to be proud that we had a standing Army of just over a hundred thousand men and that the generals had nothing to say about what was done in Washington. I was taught to be proud of that and to pity Europe for having more than a million men under arms and spending all their money on airplanes and tanks. I simply never unlearned junior civics. I still believe in it. I got a very good grade.”

Someone with such attitudes must necessarily be an anachronism, of course, and anachronisms tend to be either funny or sad. His books, at their best, were both. A few of them will survive because they hold those qualities in such beautiful proportion. “Laughter,” as Vonnegut once put it, “is a response to frustration, just as tears are, and it solves nothing, just as tears solve nothing. Laughter or crying is what a human being does when there’s nothing else he can do.”

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Poetry Can Be Dangerous

On April 19, after a day of teaching classes at Shippensburg University, I went out to my car and grabbed a box of old poetry manuscripts from the front seat of my little white Beetle and carried it across the street and put it next to the trashcan outside Wright Hall. The poems were from poetry contests I had been judging and the box was heavy. I had previously left my recycling boxes there and they were always picked up and taken away by the trash department.

A young man from ROTC was watching me as I got into my car and drove away. I thought he was looking at my car, which has black flower decals and sometimes inspires strange looks. I later discovered that I, in my dark skin, am sometimes not even a person to the people who look at me. Instead, in spite of my peacefulness, my committed opposition to all aggression and war, I am a threat by my very existence, a threat just living in the world as a Muslim body.

Upon my departure, he called the local police department and told them a man of Middle Eastern descent driving a heavily decaled white Beetle with out of state plates and no campus parking sticker had just placed a box next to the trash can. My car has New York State plates, but he got the rest of it wrong. I have two stickers on my car. One is my highly visible faculty parking sticker and the other, which I just don't have the heart to take off these days, says "Kerry/Edwards: For a Stronger America."

Because of my recycling the bomb squad came, the state police came. Because of my recycling buildings were evacuated, classes were canceled, campus was closed. No. Not because of my recycling. Because of my dark body. No. Not because of my dark body. Because of his fear. Because of the way he saw me. Because of the culture of fear, mistrust, hatred, and suspicion that is carefully cultivated in the media, by the government, by people who claim to want to keep us "safe."

These are the days of orange alert, school lock-downs, and endless war. We are preparing for it, training for it, looking for it, and so of course, in the most innocuous of places -- a professor wanting to hurry home, hefting his box of discarded poetry -- we find it.

That man in the parking lot didn't even see me. He saw my darkness. He saw my Middle Eastern descent. Ironic because though my grandfathers came from Egypt, I am Indian, a South Asian, and could never be mistaken for a Middle Eastern man by anyone who'd ever met one.

One of those in the gathering crowd, trying to figure out what had happened, heard my description -- a Middle Eastern man driving a white Beetle with out-of-state plates -- and knew immediately they were talking about me and realized that the box must have been manuscripts I was discarding. When the police were told I was a professor, immediately the question came back about where I was from.

At some length several of my faculty colleagues were able to get through to the police and get me on a cell phone where I explained to the university president and then to the state police that the box contained old poetry manuscripts that needed to be recycled. The police officer told me that in the current climate I needed to be more careful about how I behaved. "When I recycle?" I asked.

The university president appreciated my distress about the situation but denied that the call had anything to do with my race or ethnic background. The spokesman for the university called it an "honest mistake," not referring to the young man from ROTC giving in to his worst instincts and calling the police but referring to me, who made the mistake of being dark-skinned and putting my recycling next to the trashcan.

The university's bizarrely minimal statement lets everyone know that the "suspicious package" beside the trashcan ended up being, indeed, trash. It goes on to say, "We appreciate your cooperation during the incident and remind everyone that safety is a joint effort by all members of the campus community."

What does that community mean to me, a person who has to walk by the ROTC offices every day on my way to my own office just down the hall -- who was watched, noted, and reported, all in a day's work? Today we gave in willingly and whole-heartedly to a culture of fear and blaming and profiling. It is deemed perfectly appropriate behavior to spy on one another and police one another and report on one another. Such behaviors exist most strongly in closed and undemocratic and fascist societies.

The university report does not mention the root cause of the alarm. That package became "suspicious" because of who was holding it, who put it down, who drove away. Me.

It was poetry, I kept insisting to the state policeman who was questioning me on the phone. It was poetry I was putting out to be recycled.

My body exists politically in a way I can not prevent. For a moment today, without even knowing it, driving away from campus in my little Beetle, exhausted after a day of teaching, listening to Justin Timberlake on the radio, I ceased to be a person when a man I had never met looked straight through me and saw the violence in his own heart.

Author/s: 
Kazim Ali
Author's email: 
info@insidehighered.com

Kazim Ali is a poet and novelist. He teaches at Shippensburg University and at Stonecoast, the low-residency MFA program of the University of Southern Maine. Eyewitnesses confirmed his account of the scene after he left the university. A university spokesman declined to discuss specifics of the incident or who was involved, but told Inside Higher Ed that "the response was appropriate based on the circumstances," and that "just days after the [Virginia Tech] massacre, everybody is looking out for each other."

Digital Masonry

Jacques-Alain Miller has delivered unto us his thoughts on Google. In case the name does not signify, Jacques-Alain Miller is the son-in-law of the late Jacques Lacan and editor of his posthumously published works. He is not a Google enthusiast. The search engine follows “a totalitarian maxim,” he says. It is the new Big Brother. “It puts everything in its place,” Miller declares, “turning you into the sum of your clicks until the end of time.”

Powerful, then. And yet – hélas! – Google is also “stupid.” It can “scan all the books, plunder all the archives [of] cinema, television, the press, and beyond,” thereby subjecting the universe to “an omniscient gaze, traversing the world, lusting after every little last piece of information about everyone.” But it “is able to codify, but not to decode. It is the word in its brute materiality that it records.” (Read the whole thing here. And for another French complaint about Google, see this earlier column.)

When Miller pontificates, it is, verily, as a pontiff. Besides control of the enigmatic theorist’s literary estate, Miller has inherited Lacan’s mantle as leader of one international current in psychoanalysis. His influence spans several continents. Within the Lacanian movement, he is, so to speak, the analyst of the analysts’ analysts.

He was once also a student of Louis Althusser, whose seminar in Paris during the early 1960s taught apprentice Marxist philosophers not so much to analyze concepts as to “produce” them. Miller was the central figure in a moment of high drama during the era of high structuralism. During Althusser’s seminar, Miller complained that he had been busy producing something he called “metonymic causality” when another student stole it. He wanted his concept returned. (However this conflict was resolved, the real winner had to be any bemused bystander.)

Miller is, then, the past master of a certain mode of intellectual authority – one that has been deeply shaped by (and is ultimately inseparable from) tightly restricted fields of communication and exchange.

Someone once compared the Lacanian movement to a Masonic lodge. There were unpublished texts by the founder that remained more than usually esoteric: they were available in typescript editions of just a few copies, and then only to high-grade initiates.

It is hard to imagine a greater contrast to that digital flatland of relatively porous discursive borders about which Miller complains now. As well he might. (Resorting to Orwellian overkill is, in this context, probably a symptom of anxiety. There are plenty of reasons to worry and complain about Google, of course. But when you picture a cursor clicking a human face forever, it lacks something in the totalitarian-terror department.)

Yet closer examination of Miller’s pronouncement suggests another possibility. It isn’t just a document in which hierarchical intellectual authority comes to terms with the Web's numbskulled leveling. For the way Miller writes about the experience of using Google is quite revealing -- though not about the search engine itself.

“Our query is without syntax,” declares Miller, “minimal to the extreme; one click ... and bingo! It is a cascade -- the stark white of the query page is suddenly covered in words. The void flips into plenitude, concision to verbosity.... Finding the result that makes sense for you is therefore like looking for a needle in a haystack. Google would be intelligent if it could compute significations. But it can’t.”

In other words, Jacques-Alain Miller has no clue that algorithms determine the sequence of hits you get back from a search. (However intelligent Google might or might not be, the people behind it are quite clearly trying to “compute significations.”) He doesn’t grasp that you can shape a query – give it a syntax – to narrow its focus and heighten its precision. Miller’s complaints are a slightly more sophisticated version of someone typing “Whatever happened to Uncle Fred?” into Google and then feeling bewildered that the printout does not provide an answer.
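
To see what a query with syntax looks like, here is a minimal sketch -- it has nothing to do with Google's internal code, and the example phrases and the archive.org domain are purely illustrative -- of how a few standard search operators progressively narrow a request, using nothing but Python's standard library to build the URLs:

    from urllib.parse import quote_plus

    queries = [
        'uncle fred',                               # broad: either word, anywhere
        '"uncle fred" obituary',                    # quotation marks force the exact phrase
        '"uncle fred" obituary site:archive.org',   # restrict results to a single domain
        '"uncle fred" obituary -novel',             # exclude pages that mention "novel"
    ]

    for q in queries:
        # Each refinement shrinks the haystack before anyone goes looking for the needle.
        print("https://www.google.com/search?q=" + quote_plus(q))

Whether those particular operators suit a given search is beside the point; the point is that the query page will accept far more syntax than Miller gives it credit for.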

For an informed contrast to Jacques-Alain Miller’s befuddled indignation, you might turn to Digital History Hacks, a very smart and rewarding blog maintained by William J. Turkel, an assistant professor of history at the University of Western Ontario. (As it happens, I first read about Miller in Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution by one Sherry Turkle. The coincidence is marred by a slip of the signifier: they spell their names differently.)

The mandarin complaint about the new digital order is that it lacks history and substance, existing in a chaotic eternal present – one with no memory and precious little attention span. But a bibliographical guide that Turkel posted in January demonstrates that the literature is now extensive enough to speak of a field of digital history.

The term has a nice ambiguity to it – one that is worth thinking about. On the one hand, it can refer to the ways historians may use new media to do things they’ve always done – prepare archives, publish historiography, and so on. Daniel J. Cohen and Roy Rosenzweig’s Digital History: A Guide to Gathering, Preserving, and Presenting the Past on the Web (University of Pennsylvania Press, 2006) is the one handbook that ought to be known to scholars even outside the field of history itself. The full text of it is available for free online from the Center for History and New Media at George Mason University, which also hosts a useful selection of essays on digital history.

But as some of the material gathered there shows, digitalization itself creates opportunities for new kinds of history – and new problems, especially when documents exist in formats that have fallen out of use.

Furthermore, as various forms of information technology become more and more pervasive, it makes sense to begin thinking of another kind of digital history: the history of digitality.

Impressed by the bibliography that Turkel had prepared – and by the point that it now represented a body of work one would need to master in order to do graduate-level work in digital history – I contacted him by e-mail to get more of his thoughts on the field.

“Digital history begins,” he says, “with traditional historical sources represented in digital form on a computer, and with 'born-digital' sources like e-mail, text messages, computer code, video games and digital video. Once you have the proper equipment, these digital sources can be duplicated, stored, accessed, manipulated and transmitted at almost no cost. A box of archival documents can be stored in only one location, has to be consulted in person, can be used by only a few people at a time, and suffers wear as it is used. It is relatively vulnerable to various kinds of disaster. Digital copies of those documents, once created, aren't subject to any of those limitations. For some purposes you really need the originals (e.g., a chemical analysis of ink or paper). For many or most other purposes, you can use digital representations instead. And note that once the chemical analysis is completed, it too becomes a digital representation.”

But that’s just the initial phase, or foundation level, of digital history – the scanning substratum, in effect, in which documents become more readily available. A much more complex set of questions comes up as historians face the deeper changes in their work made possible by a wholly different sort of archival space – what Roy Rosenzweig calls the "culture of abundance" created by digitality.

“He asks us to consider what it would mean to try and write history with an essentially complete archival record,” Turkel told me. “I think that his question is quite deep because up until now we haven't really emphasized the degree to which our discipline has been shaped by information costs. It costs something (in terms of time, money, resources) to learn a language, read a book, visit an archive, take some notes, track down confirming evidence, etc. Not surprisingly, historians have tended to frame projects so that they could actually be completed in a reasonable amount of time, using the availability and accessibility of sources to set limits.”

Reducing information costs in turn changes the whole economy of research – especially during the first phase, when one is framing questions and trying to figure out if they are worth pursuing.

“If you're writing about a relatively famous person,” as Turkel put it, “other historians will expect you to be familiar with what that person wrote, and probably with their correspondence. Obviously, you should also know some of the secondary literature. But if you have access to a complete archival record, you can learn things that might have been almost impossible to discover before. How did your famous person figure in people's dreams, for example? People sometimes write about their dreams in diaries and letters, or even keep dream journals. But say you wanted to know how Darwin figured in the dreams of African people in the late 19th century. You couldn't read one diary at a time, hoping someone had had a dream about him and written it down. With a complete digital archive, you could easily do a keyword search for something like "Darwin NEAR dream" and then filter your results.”
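
What such a search might look like, mechanically, is easy enough to sketch. The following is only an illustration -- the "diaries" folder, the ten-word window, and the plain-text file format are assumptions of mine, not a description of any actual archive or of Turkel's own tools -- of a NEAR-style proximity query run over digitized documents:

    import re
    from pathlib import Path

    def near(text, word1, word2, window=10):
        """True if word1 and word2 occur within `window` words of each other."""
        tokens = re.findall(r"[a-z']+", text.lower())
        hits1 = [i for i, tok in enumerate(tokens) if tok == word1]
        hits2 = [i for i, tok in enumerate(tokens) if tok == word2]
        return any(abs(i - j) <= window for i in hits1 for j in hits2)

    archive = Path("diaries")  # hypothetical folder of digitized diary transcriptions
    for doc in archive.glob("*.txt"):
        if near(doc.read_text(encoding="utf-8"), "darwin", "dream"):
            print(doc.name)  # candidates to read closely, then filter by hand

The filtering Turkel mentions -- deciding which of those hits really are dreams about Darwin -- is where the historian's judgment comes back in.
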
As it happens, I conducted this interview a few weeks before coming across Jacques-Alain Miller’s comments on Google. It seems like synchronicity that Turkel would mention the possibility of digital historians getting involved in the interpretation of dreams (normally a psychoanalyst’s preserve). But for now, it sounds as if most historians are only slightly more savvy about digitality than the Lacanian Freemasons.

“All professional historians have a very clear idea about how to make use of archival and library sources,” Turkel says, “and many work with material culture, too. But I think far fewer have much sense of how search engines work or how to construct queries. Few are familiar with the range of online sources and tools. Very few are able to do things like write scrapers, parsers or spiders.”

(It pays to increase your word power. For a quick look at scraping and parsing, start here. For the role of spiders on the Web, have a look at this.)
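
A scraper, for that matter, need not be an exotic piece of software. Here is a bare-bones sketch -- the URL is a placeholder, and a real project would also respect robots.txt, throttle its requests, and handle errors -- that fetches a page and pulls out its links, which is the seed of what a spider does, using only Python's standard library:

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Parse an HTML page and collect the target of every <a href="..."> tag."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    page = urlopen("https://example.org/").read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(page)
    print(collector.links)  # a spider would queue these links and repeat the process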

“I believe that these kinds of techniques will be increasingly important,” says Turkel, “and someday will be taken for granted. I guess I would consider digital history to have arrived as a field when most departments have at least one person who can (and does) offer a course in the subject. Right now, many departments are lucky to have someone who knows how to digitize paper sources or put up web pages.”

Author/s: 
Scott McLemee
Author's email: 
info@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.

Speak, Memory

Last week, Intellectual Affairs took up the topic of what might be called scandal-mania -- the never-ending search for shock, controversy, and gratifying indignation regarding our “master thinkers.” Unfortunately there haven’t been enough “shocking revelations” recently to keep up with the demand. So the old ones are brought out of mothballs, from time to time.

A slightly different kind of case has come up recently involving Zygmunt Bauman, who is emeritus professor of sociology at the University of Leeds and the University of Warsaw. Bauman is a prolific author with a broad range of interests in social theory, but is probably best known for a series of books and essays analyzing the emergence of the new, increasingly fluid and unstable forms of cultural and social order sometimes called “postmodernism.”

No doubt that fact alone will suffice to convince a certain part of the public that he must be guilty of something. Be that as it may, Bauman is not actually a pomo enthusiast. While rejecting various strands of communitarianism, he is quite ambivalent about the fragmentation and confusion in the postmodern condition. His book Liquid Times: Living in an Age of Uncertainty, just issued by Polity, is quite typical of his work over the past few years -- a mixture of social theory and cultural criticism, sweeping in its generalizations but also alert to the anxieties one sees reflected in the newspaper and on CNN.

In March, a paragraph concerning Bauman appeared at Sign and Sight, a Web site providing capsule summaries in English of the Feuilletons (topical cultural articles) appearing in German newspapers and magazines. It noted the recent publication in the Frankfurter Allgemeine Zeitung of an article by a Polish historian named Bogdan Musial. The piece “uncovers the Stalinist past of the world famous sociologist,” as Sign and Sight put it.

It also quoted a bit of the article. "The fact is that Bauman was deeply involved with the violent communist regime in Poland for more than 20 years,” in Musial’s words, “fighting real and supposed enemies of Stalinism with a weapon in his hand, shooting them in the back. His activities can hardly be passed off as the youthful transgressions of an intellectual seduced and led astray by communist ideology. And it is astonishing that Bauman, who so loves to point the finger, does not reflect on his own deeds."

A few weeks later, another discussion of the matter appeared in The Irish Times -- this one by Andreas Hess, a senior lecturer in sociology at the University of Dublin. The piece bore what seems, with hindsight, the almost inevitable title of “Postmodernism Made Me Do It: A World Without Blame.” (The article is not available except to subscribers, but I’ll quote from a copy passed along by a friend.)

Summing up the charges in the German article, Hess said that secret files recently declassified in Poland revealed that Bauman “participated in operations of political cleansing of alleged political opponents in Poland between 1944 and 1954. The Polish files also show Bauman was praised by his superiors for having been quite successful in completing the tasks assigned, although he seems, as at least one note suggests, not to have taken any major part in direct military operations because of his ‘Semitic background.’ However, to be promoted to the rank of major at the youthful age of 23 was quite an achievement. As the author of the article [in the German newspaper] pointed out, Bauman remained a faithful member of the party apparatus.”

Hess goes on to suggest that “Bauman’s hidden past” is the key to his work as “one of the prophets of postmodernism.” This is not really argued so much as asserted -- and in a somewhat contradictory way.

On the one hand, it is implied that Bauman has used postmodern relativism as a way to excuse his earlier Stalinist crimes. Unfortunately for this argument, Bauman is actually a critic of postmodernism. And so, on the other hand, the sociologist is also guilty of attacking Western society by denouncing postmodernity. Whether or not this is a coherent claim, it points to some of what is at issue in the drama over “Bauman’s secret Stalinism,” as it’s called.

Now, I do not read German or Polish -- a decided disadvantage in coming to any sense of how the controversy has unfolded in Europe. Throughout the former Soviet sphere of influence, a vast and agonizingly complex set of problems has emerged surrounding “lustration” -- the process of "purifying" public life by legally disqualifying those who collaborated with the old Communist regimes from serving in positions of authority.

Debates over the politicized use of lustration in Poland have gone on for years. “What may look like an effort to reconcile with the Communist past,” wrote one Polish legal scholar not long ago, “is something else entirely. It is an assault on reconciliation and a generational bid for power.” There are bound to be implications to Bauman’s lustration that will be lost on those of us looking at it from a distance.

But let’s just look at the matter purely in terms of the academic scandal we’ve been offered. I have read some of Bauman’s work, but not a lot. Under the circumstances that may be an advantage. I am not a disciple – and by no means feel committed to defending him, come what may.

If he has hidden his past, then its revelation is a necessary thing. But then, that is the real issue at stake. Everything turns on that “if.”

What did we know about Zygmunt Bauman before the opening of his files? What could be surmised about his life based on interviews, his bibliographical record, and books about him readily available at a decent university library?

One soon discovers that “Bauman’s hidden past” was very badly hidden indeed. He has never published a memoir about being a Stalinist -- nor about anything else, so far as I know -- but he has never concealed that part of his life either. The facts can be pieced together readily.

He was born in Poland in 1925 and emigrated to the Soviet Union with his family at the start of World War II. This was an altogether understandable decision, questions of ideology aside. Stalin’s regime was not averse to the occasional half-disguised outburst of anti-Semitism, but that was not the central point of its entire agenda, at least; so it is hardly surprising that a Jewish family might respond to the partition of Poland in 1939 by heading East.

Bauman studied physics and dreamed, he says, of becoming a scientist. He served as a member of the Polish equivalent of the Red Army during the war. He returned to his native country as a fervent young Communist, eager, he says, to rebuild Poland as a modern, egalitarian society – a “people’s democracy” as the Stalinist lingo had it. His wife Janina Bauman, in her memoir A Dream of Belonging: My Years in Postwar Poland (Virago, 1988) portrays him as a true believer in the late 1940s and early 1950s.

But there is no sense in overstressing his idealism. To have been a member of the Polish United Workers Party was not a matter of teaching Sunday school classes on Lenin to happy peasant children. Bauman would have participated in the usual rounds of denunciation, purge, “thought reform,” and rationalized brutality. He was also an officer in the Polish army. The recent revelations specify that he belonged to the military intelligence division -- making him, in effect, part of the secret police.

But the latter counts as a “revelation” only to someone with no sense of party/military relations in the Eastern bloc. Not every member of the military was a Communist cadre -- and an officer who was also a member of the party had a role in intelligence-gathering, more or less by definition.

But a Jewish party member was in a precarious position – again, almost by definition. In 1953, he was forced out of the army during one of the regime’s campaigns against “Zionists” and “cosmopolitans.” He enrolled in the University of Warsaw and retrained as a social scientist. He began research on the history of the British Labour Party and the development of contemporary Polish society.

One ought not to read too much dissidence into the simple fact of doing empirical sociology. Bauman himself says he wanted to reform the regime, to bring it into line with its professed egalitarian values. And yet, under the circumstances, becoming a sociologist was at least a somewhat oppositional move. He published articles on alienation, the problems of the younger generation, and the challenge of fostering innovation in a planned economy.

And so he remained loyal to the regime -- in his moderately oppositional fashion -- until another wave of official anti-Semitism in 1968 made this impossible. In her memoir, Janina Bauman recalls their final weeks in Poland as a time of threatening phone calls, hulking strangers loitering outside their apartment, and TV broadcasts that repeated her husband’s name in hateful tones. “A scholarly article appeared in a respectable magazine,” she writes. “It attacked [Zygmunt] and others for their dangerous influence on Polish youth. It was signed by a close friend.”

Bauman and his family emigrated that year, eventually settling in Leeds. (He never faced a language barrier, having for some years been editor of a Polish sociological journal published in English.) His writings continued to be critical of both the Soviet system and capitalism, and to support the labor movement. When Solidarity emerged in 1980 to challenge the state, Bauman welcomed it as the force that would shape the future of Poland.

These facts are all part of the record -- put there, most of them, by Bauman himself. By no means is it a heroic tale. From time to time, he must have named names, and written things he didn’t believe, and forced himself to believe things that he knew, deep down, were not true.

And yet Bauman did not hide his past, either. It has always been available for anyone trying to come to some judgment of his work. He has been accused of failing to reflect upon his experience. But even that is a dubious reading of the evidence. A central point of his work on the “liquid” social structure of postmodernism is its contrast with the modernity that went before, which he says was “marked by the disciplinary power of the pastoral state.” He describes the Nazi and Stalinist regimes as the ultimate, extreme cases of that “disciplinary power.”

Let’s go out on a limb and at least consider the possibility that someone who admittedly spent years serving a social system that he now understands as issuing from the same matrix as Hitler’s regime may perhaps be telling us (in his own roundabout, sociologistic way) that he is morally culpable, no matter what his good intentions may have been.

Alas, this is not quite so exciting as “Postmodernist Conceals Sinister Past.” It doesn’t even have the satisfying denouement found in “The God That Failed,” that standard of ex-Communist disillusionment. Sorry about that.... It’s just a tale of a man getting older and – just possibly – wiser. I tend to think of that as a happy story, even so.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Scott McLemee writes Intellectual Affairs each week. Suggestions and ideas for future columns are welcome.
