In a recent New York Review article on Byron, Harold Bloom makes the following passing remark: “In the two centuries since Byron died in Greece [...] only Shakespeare has been translated and read more, first on the Continent and then worldwide.” Bloom does not cite any statistics, and one cannot help but wonder: Really? More than Homer and Dante, or, among the moderns, more than Sartre and Thomas Mann? Of course, what Bloom really means is that Byron was translated and read more than any other English writer, and he may well be correct on that count. Yet this omission is telling, as it highlights an unfortunate tendency (recently diagnosed by David Damrosch) among certain English professors to equate literature in general with literature written in English. This disciplinary bias, less prejudice than habit, can distort their scholarship – the authors that they admire tend to be far more catholic in their reading. But this pattern also raises a larger academic question: Why do we still partition the literary canon according to nationalist traditions? Is this really the most intellectually satisfying and authentic approach to literary studies?
For an example of how disciplinary blinders can affect scholars as well-read as Bloom, we need only turn back to his article, where we find Byron described as “the eternal archetype of the celebrity, the Napoleon of the realms of rhyme... the still unique celebrity of the modern world.” What such hyperbole masks is the fact that the model for such literary celebrity is in reality to be found in another author, who unfortunately did not have the good sense to be born in England. Indeed, anyone familiar with the inordinate fame of Jean-Jacques Rousseau knows that he was the first genuine literary celebrity, lionized and sought out across Europe, much to his growing despair and paranoia (as a brilliant study by the historian Antoine Lilti details). Byron himself was smitten by Rousseau, touring Lac Léman with his friend Shelley to visit the sites from Julie, ou la nouvelle Héloïse. Rousseau may not have provided his public with the same devilish scandals as the naughty Lord, but his Confessions, with their admission of a fondness for spankings and exhibitionism, were sultry enough.
Bloom is certainly no provincial, and his own published version of The Western Canon includes German, Spanish, French, and Italian works – although this canon, too, is heavily tilted toward English authors. But can this be avoided? No doubt French scholars would produce a version of the canon equally tilted toward the French, just as scholars from other nations would privilege their own authors. To an extent, this literary patriotism is normal and understandable: every culture values its heritage, and will expend more energy and resources promoting it.
From the viewpoint of literary history, however, such patriotism is also intellectually wrongheaded. To be sure, writers are often marked most strongly by their compatriots: one must read Dante to understand Boccaccio, Corneille to understand Racine, or, as Bloom would have us believe, Whitman to understand T. S. Eliot. But such a vertical reading of literature (which Bloom himself mapped out in The Anxiety of Influence) overlooks the equally – sometimes far more – important horizontal ties that connect authors across national borders. T. S. Eliot may have been “hopelessly evasive about Whitman while endlessly revising him in [his] own major poems,” yet by Eliot’s own admission, the French school of symbolist poetry had a far greater impact on his work. Some of Eliot’s first published poems, in fact, were written in French. Conversely, the French novelist Claude Simon may have endlessly revised Proust, but his own major novels – such as La route des Flandres and L’herbe – owe far more to William Faulkner. Such examples could be multiplied ad infinitum: they are, in fact, the stuff that literary history is made of.
To this criticism, English professors have a ready-made answer: Go study comparative literature! But they have only half a point. Comp lit programs are designed to give students a great deal of flexibility: their degrees may impose quotas for the number of courses taken in foreign language departments, but rarely, if ever, do comp lit programs build curricular requirements around literary history. Yet that is precisely what is needed: Students wishing to study English Romanticism ought to have more than Wikipedia-level knowledge about German Idealist philosophy and Romantic poetry; students interested in the 18th-century English novel should be familiar with the Spanish picaresque tradition; and so on and so forth. Comp lit alone cannot break down the walls of literary protectionism.
The fact that we even have comp lit departments reveals our ingrained belief that “comparing” literary works or traditions is merely optional. Despite Bloom’s own defense of a “Western canon,” such a thing no longer exists for most academics. This is not because the feminists, post-colonialists, or post-modernists managed to deconstruct it, but rather because our institutions for literary studies have gerrymandered the canon, department by department. Is it not shocking that students can major in English at many colleges without ever having read a single book written in a foreign language? Even in translation? (Consider, by contrast, that history majors, even those who wish to study only the American Revolution, are routinely required to take courses on Asian, African, and/or European history, in many different time periods, to boot.) Given that English is the natural home for literary-minded students who are not proficient in another language, it is depressing that they can graduate from college with the implicit assumption that literature is the prerogative of the English-speaking peoples, a habeas corpus of the arts.
But wait a minute: how dare I criticize English curriculums for not including foreign works, when the major granted by my own department, French, is not exactly brimming with German, Russian, or Arabic texts, either? To the extent that French (or any other foreign language) is a literature major, this point is well taken. But there are differences, too. First, it is far more likely that our students will have read and studied English literature at some point in high school and college. They will thus already have had some exposure, at least, to another national canon. Second, and more importantly, a French, Spanish, or Chinese major is more than a literature major: it is to no small degree a foreign language major, meaning that students must master an entirely different set of linguistic skills. Finally, language departments are increasingly headed toward area studies. German departments routinely offer classes on Marx, Nietzsche, and Freud, none of whom are technically literary authors. Foreign language departments are sometimes the only places in a university where once-important scholarly traditions can still be studied: Lévi-Strauss’s Tristes tropiques probably features on reading exam lists more often in French than in anthropology departments. A model for such an interdisciplinary department already exists in Classics.
I do not wish to suggest that English professors are to blame for the Anglicization of literature in American universities: they reside, after all, in English departments, and can hardly be expected to teach courses on Russian writers. The larger problem is institutional, as well as methodological. But it bears emphasizing that this problem does not only affect undergraduates, and can lead to serious provincialism in the realm of research, as well. An English doctoral student who works on the Enlightenment once openly confessed to me that she had not read a single French text from that period. No Montesquieu, no Voltaire, no Rousseau, no Diderot, rien. Sadly, this tendency does not seem restricted to graduate students, either.
Literary scholars are not blind to this problem: a decade ago, Franco Moretti challenged his colleagues to study “world literature” rather than local, national, or comparative literatures. He also outlined the obvious difficulty: “I work on West European narrative between 1790 and 1930, and already feel like a charlatan outside of Britain or France. World literature?” While the study of world literature presents an opportunity for innovative methodologies (some of which were surveyed in a recent issue of New Literary History), students already struggling to master a single national literary history will no doubt find such global ambitions overwhelming.
What, then, is to be done? Rearranging the academic order of knowledge can be a revolutionary undertaking, in which ideals get trampled in administrative terror. And prescribing a dose of world literature may ultimately be too strong a medicine for the malady that ails literary studies, particularly at the undergraduate level. In fact, a number of smaller measures might improve matters considerably. To begin with, literature professors could make a greater effort to incorporate works from other national literatures in their courses. Where the funds are available, professors from neighboring literature departments could team-teach such hybrid reading lists. Second, language and literature majors could also require that a number of courses be taken in two or three other literature departments. A model for this arrangement already exists at Stanford, where the English department recently launched an “English Literature and Foreign Language Literature” major, which includes “a coherent program of four courses in the foreign literature, read in the original.” To fulfill this last condition, of course, colleges would have to become more serious about their foreign-language requirements. Finally, literature students would be better served if colleges and universities offered a literature major, as is notably the case at Yale, UC San Diego, and UC Santa Cruz. Within this field of study, students could specialize in a particular period, genre, author, or even language, all the while taking into account the larger international or even global context.
Will such measures suffice to pull down the iron curtain dividing the literary past? Unless they manage to infiltrate the scholarly mindset of national-literature professors, probably not. Then again, as many of us know firsthand, teaching often does transform (or at least inform) our research interests. A case could of course be made for more radical measures, such as the fusion of English and foreign language departments into a single “Literature Department,” as exists at UC San Diego. But enacting this sort of bureaucratic coup carries a steep intellectual (not to mention political) price. It would be unfortunate, for instance, to inhibit foreign literature departments from developing their area-studies breadth, and from building bridges with philosophy, history, anthropology, sociology, religious studies, political science, and international relations. English departments, moreover, are developing in similar, centrifugal directions: in addition to teaching their own majors, English departments contribute more widely to the instruction of writing (including creative writing), and have their own ties with Linguistics and Communications departments. This existing segmentation of the university may appear messy, but has the benefit of preventing new walls from being erected, this time between neighboring disciplines.
Dan Edelstein is assistant professor of French at Stanford University.
Yogi Berra is supposed to have said that people shouldn't write their autobiographies while they're still alive. Anyone who reads very many academic autobiographies will appreciate the sentiment. We have enough accounts, thanks, of how the path to tenure in the English department at Duke University was lit up by certain profound early life experiences. (The route now seems exceptionally well-mapped for one that not many people get to travel.)
But an exception might be made for a recent volume called A Taste for Language: Literacy, Class, and English Studies by James Ray Watkins, Jr., published by Southern Illinois University Press. It is not the work of an academic celebrity. I doubt anyone will turn to it for career advice; it doesn't offer any. But as a study of the examined life, it has its lessons.
The author is an online educator for the Art Institute of Pittsburgh and the Center for Talented Youth at Johns Hopkins University. He runs a blog called Writing in the Wild. So one learns from the back cover. But the lesson really starts with a photograph across from the title page. It shows the author's father and was taken circa 1944. He is wearing a tie and his hair is well-combed. The pose suggests that the portrait might be of a young soldier, taken as a memento for his parents -- except that he looks as if he may not yet be old enough to shave.
And it turns out all of this is true. The son of a tenant farmer in Mississippi, Watkins Sr. enlisted in the army at the age of 16. He claimed to be older, of course, and to have graduated high school, although his formal education actually ended in the fourth grade.
Thanks to the GI Bill, the adolescent tank-commander in that photograph later went to night school to get his equivalency degree, then attended Louisiana State University. This prepared him for a successful career as a utilities analyst for the city of Houston. He died in the early 1980s, not long after the author began his own higher education.
"No one in his immediate family had attended school much beyond the middle- or high-school level," writes Watkins Jr.. "His family, my mother tells me, saw college as a kind of indulgence and thought that any young man could better spend his time earning a living. Before he entered LSU, then, it is likely that my father had only the roughest approximation of what a university education might entail.... My father left college with more than professional skills; he graduated with a larger sense of the purposes of education that made it imperative for his children as well."
This is a story of upward mobility, then, with economic security as its goal. But that is not all that was transmitted from father to son. To go from seeing education as a needless luxury to regarding it as an urgent necessity for one's children involves a deep change of ethos. Watkins tries to reconstruct this process through a close reading of any material he can find from his father's education -- in particular the textbooks for his courses on composition and literature at LSU during the late 1940s, which left him with the skills needed to produce the sort of expository prose required in the professional workplace.
As it happened, the English department at LSU was also then an epicenter of the New Criticism, whose practitioners tried to teach students to read literary works with an eye to how their language worked. "It seems reasonable to assume that my father's lack of previous education made the inculcation of this sensibility difficult at best.... The academic triumph of New Critical literary education in the English department had strict limits, clearly marked in my father's transcript."
But the effort had its effect, even so. It meant that Watkins Sr. could recognize that there might be something worthwhile about the ability to read for pleasure. And so it is that -- two generations after functional literacy was the family norm, but anything beyond it was regarded with misgivings -- the author could end up writing a master's thesis on Paul de Man, getting a Ph.D., and teaching at various institutions.
This, then, is not an academic autobiography so much as an educational genealogy. The author is tracing back to their sources the conditions of possibility for his own existence. But it is not particularly introspective. There are no prose-poetical arias. The writing is unsentimental.
Instruction in expository composition left Watkins Sr. in command of an efficient, objective, no-frills style: the equivalent of a professional demeanor that could zero in on facts, while keeping subjective expression to a bare minimum. The son honors that ability with a narrative voice that is so precise in conveying the man's likes and habits and expectations of life that you are left with a sense of having met him -- yet with only a hint at the depths of feeling that it must have stirred in him to tell the story.
In that sense, A Taste for Language is not memoiristic, either. Digging through his father's textbooks and situating them in the history of language study as a discipline, Watkins is doing scholarship. And his research has implications that are not strictly personal.
People who come from a long line of securely middle-class professionals can take a certain amount of inherited cultural capital for granted. In Watkins's case, that is not an option. He recognizes that he has been shaped, however indirectly, by educational influences that were being exercised on him before he was even born. His father's upward mobility was in part the product of the pedagogical labor of writing instructors. Talk about "the life of the mind" can get highfalutin and self-aggrandizing at times. There is something to say for grasping how much of it is the result of institutional processes that go largely unnoticed.
That, in turn, raises questions about how well the present arrangement works. "On the one hand," Watkins writes, "we must accept our students' vocational goals as legitimate expressions of their desire to maintain or strengthen their economic position; on the other, we must seek out ways to persuade them that the contemplative, reflective traditions of the academy are important to their professional and social futures. Indeed, our goals ought to be even larger: to convince students that in spite of their apparent impracticality, the critical methodologies of the school have immediate professional application. Alertness to injustice isn't simply helpful in 'society in general'; it is necessary in the immediate, specific context of the work site."
Whether this can be realized in practice is, of course, another matter. Back when Watkins's father went to college, composition and literature were part of the same discipline. But that has not been the case for some time. Most training in composition is done by part-time or adjunct instructors. That arrangement, in turn, reflects a set of priorities in which such training is treated as a necessary but (at best) secondary function of the university. Which, in turn, reinforces the tendency for the rewards of higher education to go to students who arrive with adequate stocks of inherited cultural capital. It is an arrangement that seems almost as if it were designed to sustain inequality, rather than narrow it.
"A two-tiered system of a few well-paid and independent literary teachers and researchers working side-by-side with poorly compensated part-time composition teachers would hardly support the interests of our profession, our students, or our society," writes Watkins. He calls for unionization of instructors in composition and literature as a first step towards mitigating this situation.
This reader, at least, wanted to applaud. Without the ability to bargain collectively, it's hard to see how the casualization of academic labor will ever end. But it does rather raise the question of whether the expectation of upward mobility is not so ingrained in the professionalized middle class as to make solidarity an almost unimaginable ideal.
Since 2000, I've been the host of the Wimba Distinguished Lecture Series, shouting from the rooftops (well, desktops) about how to use modern educational technologies to teach effectively online. But now, after evangelizing for the last decade, I'm switching sides. I am teaching creative writing online as an adjunct professor for Holmes Community College, in Goodman, Mississippi. How the tables have turned.
I've probably led more webcasts than anyone on the planet. Seriously. I've hosted webcasts at least once a week for 10 years and I've also given thousands of other online presentations. From presentations on educational technologies and policies to sessions on effective instructional techniques, I've done it all. But now I'm tasked with teaching – online – creative writing, a subject traditionally taught in a workshop format, one that is quite difficult to replicate in a virtual environment. Yet it's not the format that worries me.
You see, this is my first time teaching a college course. Though I've led writing workshops, collaborated with writers and journalists here in New York, contributed to numerous publications, and even penned my own book, I now fretfully ready myself to formally – and virtually – mold young (and a few moldier) minds at a college more than 1,000 miles away. But I can’t wait. I can't wait to familiarize my students with exemplary works of poetry, fiction and nonfiction. I can't wait to answer my students' questions and hear their insights. I can't wait for my students to learn from me and for me to learn from them. I'm nervous. But I'm ready. I think. So, in the immortal inquiry posed by David Byrne: Well, how did I get here?
Let’s start by looking at the ed-tech industry.
When wearing my Wimba hat, I often remind my audience that it’s only been about a decade since the modern format of online courses was put into place. The current configuration, which combines course management systems, web conferencing, instant messaging, message boards, and the like to teach students in a classroom and/or in their pajamas, barely existed in the 20th century. When one stops to consider that collegiate courses had been taught (more or less) in the same manner since ancient Egypt, Greece, and Mesopotamia, it’s startling how quickly this transformation has transpired.
Obviously this modern course format is still being tweaked, but it certainly appears that much of the technological and pedagogical foundation is firmly in place. As of today, the dawn of the ’10s, tens of thousands of postsecondary faculty, some eagerly and some grudgingly, have already taken the plunge and incorporated technologies into their courses – often with a great deal of success.
I’ve written numerous research documents touting both the tangible and intangible benefits of technology-enabled courses. Their pages are filled with examples of institutions around the globe that have seen increased student retention, higher enrollments, improved graduation rates, and money saved on time and travel, all thanks to technology in the classroom. In fact, I’ve seen so many positive examples of technology-enabled education over the years that I now have an extremely difficult time understanding why any institution wouldn’t beef up its current online offerings. The downside is just so negligible, while the upside is so great.
But I digress. After all, I’ve now got my own class to worry about.
A couple of months ago I left my comfy big-city confines and headed south to tiny Goodman for an on-site orientation for new faculty. I didn’t really know what to expect. I knew I’d have a big leg-up in terms of my knowledge of online course technologies, but I also knew I’d have a big leg-down in terms of my knowledge of classroom instruction. Turns out I was dead-on.
My two Holmes Community College trainers that day explained the ins and outs of being an online instructor to me and the approximately 10 others in the room, all of whom had collegiate teaching experience. At least my tech savviness made up for the in-front-of-a-class savviness I lacked. But even though I was already familiar with the Blackboards and SunGards of the world, I didn’t realize how much about them I didn’t know. As my girlfriend always says, it’s hard to know what you don’t know.
My HCC trainers spent hours teaching me about Bb’s enrollment tools, grading and assessment functions, and how to withdraw students who need to drop out. Despite being around instructors for so much of my life, I guess I never truly grasped how much of teaching is actually administering. After a full day of technology training I left the campus very excited, but also very nervous. I kept picturing myself pushing the wrong button and accidentally unenrolling an eager student and then having to sheepishly write an email to the Holmes IT staffers informing them of my blunder.
But on the flipside, my nervousness also translated to eagerness. As I learned more about my prospective students – fervent 18- to 21-year-olds as well as working adults from around the country – I plotted the numerous ways in which I could engage them online. While driving from my orientation back to the Jackson airport I thought of at least 20 assignments that would combine best practices of teaching creative writing face-to-face with best practices of teaching online. In fact, by the time I reached the rental car return desk I could envision the thank-you letters I hoped to receive from my happy students who affably learned a few tips and tricks about writing with some flair.
Which brings me to today.
My lesson plans are done. My syllabus is up. My books are in the bookstore. But my mind still bursts with uncertainties (after all, I am a writer).
How well can they write? What do they already know? What don’t they know? From what kinds of experiences will they draw when they put pen to paper? Have they been to William Faulkner’s house up the road in Oxford? Will they mind if I occasionally swear? Will I understand them if they speak with thick drawls? Will their writing be better than mine?
The waiting is the hardest part. I wish I could invent time travel and get the first class over with.
The funny part is that I’m never this nervous when preparing and/or waiting to give presentations for Wimba, but I guess that’s because of my experience at the company. Hopefully I’ll read this op-ed a few years from now and laugh at how nervous I was. Man, I can’t wait to be a veteran writing teacher brimming with the confidence only gained from years of experience! Oh, how worn will the elbow pads of my tweed jackets be. Some day.
I discussed my trepidations with my family over the holidays, and my dad, drawing upon his 30 years of teaching experience, asked, “Do you have your opening speech ready?” I told him I did, but I lied; I guess that’s because I don’t really need one. And that already demonstrates a difference between teaching online and teaching face-to-face.
When my class is ready to begin, someone from HCC’s technology department will simply hit a button in Blackboard, and then, in an instant, the class will be active. It won’t be the same as the first class of a face-to-face course. I won’t write my name in big letters on the chalkboard and won’t give a big dramatic speech about the wonders of writing creatively. Instead, my students will receive a message in their inboxes notifying them to watch the archive of a lecture I’ll record later this week. Sure, they’ll still see my talking head and hear the inflection of my nervous-yet-excited voice, but the impact might not be as great as watching me forcefully pace back and forth in front of a full lecture hall. Then again, perhaps the impact will be even greater because they’ll be equally nervous as they embark on a new class in a new medium.
Stay tuned for more as I tell my tales from the other side….
Matt Wasowski is senior director of customer programs at Wimba.
What 10 books have most influenced you? That question has launched many a discussion online in recent days. I’ve been scribbling down my own list while reading the replies – but also wondering just how we assess the presence of influence, let alone its relative intensity.
With some of the lists, it's hard to tell what the word means. If a person names J.R.R. Tolkien’s Lord of the Rings as an influence, what does that imply? Did he or she become a scholar of Anglo-Saxon literature? Go on an epic quest that saved the world? Write hobbit stories? Record a heavy-metal album with runes on the cover?
To cite something as an influence can be a way to emphasize that it yielded much satisfaction. But the term properly implies something more consequential than that. You didn't just consume and digest; you were consumed and digested in turn.
I greatly enjoy the TV series "Breaking Bad" yet do not feel that it is transforming my existence. It has not inspired me to cook and sell methamphetamines, or even to imagine this as a possible solution to midlife anomie. Hence I would not claim it as an influence, just yet.
What counts, then? Mulling this over, I realized that some authors were just too influential to claim as influences, if you will forgive the paradox. I read quite a lot of Marx, Freud, and Nietzsche at an impressionable age, and this certainly left its mark. But putting them on the list seemed unnecessary, for their power is pretty nearly inescapable. It would be like pointing out that I have breathed a lot of oxygen in my day.
Anyway, enough prolegomenous throat clearing. On to the list...
(1) Bertrand Russell, Why I Am Not a Christian
Until shortly before my 14th birthday, in the opening months of the Carter administration, I was a Christian fundamentalist who fully expected the apocalyptic scenario of the Book of Revelation to be worked out in world events during my lifetime. Please understand that I do not say that with even the slightest sense of irony.
At the time, I was also very keen on Blaise Pascal, who was definitely not a Southern Baptist but who had undergone a mystical experience giving him a deep conviction of the existence of a divine order. Bertrand Russell’s book must have been on a library shelf near Pascal. I started reading it to arm myself against the enemy.
Things did not work out that way. In his urbane and relentlessly logical manner, Russell broke down everything that I had taken to be axiomatic about the existence of God -- and about the terrible consequences of not believing. He seemed to anticipate every counterargument. I spent days -- and a few late nights -- running through it in my head.
The experience was painful and terrifying. It shook me to my core; even that seems like an understatement. Nothing remained the same afterward. To repeat: Influence and pleasure are entirely different things.
(2) Allen Ginsberg, Howl and Other Poems
At 14, I thought this was the greatest book of poetry ever. The length and the rhythm of Ginsberg’s lines, his juxtapositions of imagery (“the crack of doom on the hydrogen jukebox”), the way his diction shifted into the biblical or the street-level obscene ... all of this made the hair stand up on the back of my neck. Which, as someone once said, is how you can tell when poetry is working on you.
It also inspired many a page of my own literary efforts, now lost to posterity. Paper will burn, if you let it.
Today the Beat idea that suffering and madness and extremity bring wisdom does not strike me as quite so appealing and romantic. I have been exposed to quite enough miserable, crazy, extreme people for one lifetime. (They come to Washington a lot, especially these days.) But I still love this book. The shorter poems in the back – written when Ginsberg himself was under the influence of William Carlos Williams – still seem very moving.
(3) Jorge Luis Borges, Other Inquisitions
The best way to discover Borges is probably through the short stories in Ficciones, or the selection of prose and poetry in Labyrinths. As chance had it, I first came across him by way of this volume of essays, in which criticism becomes a form of imaginative writing. For Borges, all of literature forms one big interconnected structure in which the books are, in effect, reading you. Many an academic article on intertextuality consists of an unwitting and usually witless gloss on Other Inquisitions.
My favorite passage comes at the end of “Kafka and His Precursors,” an essay of three pages that subtly transforms the very idea of “influence” itself: “The fact is that every writer creates his own precursors. His work modifies our conception of the past, as it will modify the future.”
(4) Susan Sontag, Against Interpretation and Other Essays
Borges combined erudition with playfulness. Sontag, by contrast, was an erudite person who sometimes tried very hard to be playful, more or less out of a sense of duty. I don’t think this worked out very well, and certainly not over the long run.
But in the early 1960s, she wrote a series of essays on literature, film, art, and ideas that remain exceptional and definitive. In them you feel a mind trying to open itself to as many possibilities as it can, sort of like Matthew Arnold dealing with being trapped in Andy Warhol’s Factory for a while. This book was the syllabus for my own reading and moviegoing for a few years after I first discovered it, and I go back to visit it from time to time, like a favorite neighborhood.
(5-6) Jean-Paul Sartre, pretty much anything in English translation as of the early 1980s
OK, admittedly this is cheating, since it would include dozens of volumes of philosophy, fiction, plays, and journalism. I would need to wedge the pertinent volumes of Simone de Beauvoir’s memoirs in there, as well. You do what you have to do. I feel sufficiently uneasy about this to let it claim two spots on the list, rather than just one.
Sartre embodied the writer as intellectual and activist. There is nobody even remotely comparable these days; don't accept cheesy knock-offs. The question of Sartre's legacy is too complicated to go into here, and I am ambivalent about much of it, now, in any case. But his work still provokes me – through inspiration or irritation or both – in a way that no living author’s work does.
Narrowing things down a bit: Two volumes of interviews and articles from his final decade or so, Life/Situations and Between Marxism and Existentialism, seem like quintessential books. The latter has recently been reissued by Verso.
(7) Norman Podhoretz, Making It
Published in late 1967, while Podhoretz still considered himself a liberal (his transformation into a neoconservative ideologue would take a few more years), Making It is the story of one man’s relentless climb to eminence in the world of the New York literary-intellectual establishment. “One of the longest journeys in the world,” its opening sentence begins, “is the journey from Brooklyn to Manhattan....”
Reading this in Texas at the age of 19, I was not yet in a position to appreciate its full, rich ridiculousness, and instead studied the book as carefully as I once had any account of the act of love – preparing for the day when detailed information might prove useful, rather than just frustrating.
In a hurry to brush off the hayseeds, I managed to confuse cynicism with sophistication. Over time, this did a certain amount of damage -- some of it, fortunately, remediable. It is embarrassing to include this book on my list. That is why I am doing so.
(8) Richard Hofstadter, The Paranoid Style in American Politics
There are serious problems with Hofstadter’s analysis of the People’s Party of the 1890s. We can talk about the failings of the consensus school of U.S. historiography until the cows come home. I acknowledge these things without reservation. And yet this book is indispensable.
I first read it in the early 1980s and have revisited it at least once each decade since then. I know of no better description of the typical qualities and standard features of our public discourse in its barking-at-the-moon episodes. It reminds us that such upsurges do not come out of nowhere. This is not exactly a comfort, but it does help make the batshit insane seem at least somewhat intelligible.
One American television network has evidently adopted the book as the basis for its business model. But you can’t blame Hofstadter for that.
(9) Richard Wright, American Hunger
In the early 1940s, Richard Wright produced an autobiographical manuscript covering his life up to 1937. Most of it appeared in 1945 as Black Boy, but the final section, covering his years as a member of the Communist Party, was published as a separate book in 1977.
I was very taken with it not simply for its account of the radical movement during the Depression but for Wright's account of his own struggle to become a writer. And all the more so given something the author's estate included in the original printing of the book. It was a facsimile reproduction of one page of the typescript, covered with his handwritten revisions of the text -- lines crossed out, words changed, sentences rewritten, etc.
This came as a revelation. My assumption had been that once you learned how to write, well, you just wrote. (The struggle was just to get to that point.) I stared at the page for a long time, trying to figure out how Wright had known that a given phrase or sentence might be improved, especially since what he had set down often looked fine.
Another form of influence: When a book teaches you how much you don't know about how much you don't know, and how much you need to know it.
(10) Richard Lanham, Revising Prose
Finding this volume in a secondhand bookstore was not, perhaps, the answer to a prayer. But the deep perplexity left by American Hunger certainly left me ready for it.
Half of learning to write is knowing how to recognize when a sentence or paragraph is bad, and why, and what can be done about this. Lanham teaches a handful of very basic skills necessary to begin reworking a draft. His manual is now in its fifth edition. I have no idea what changes may have been introduced in the past 25 years or so. But if a textbook ever changed my life, this one did.
For a long while now I have planned to write an essay about the habit of keeping a notebook, and have even, from time to time, started to take notes on the topic. By now I have accumulated more passages hectoring myself to settle down to work on it than pages containing actual insights. It seems the project has a short circuit.
But it may be that this reflects a basic tension within the notebook itself, considered as a genre of writing. On the one hand, it is turned towards the outside world; it is absorptive and assimilative, a tool for recording information, ideas, impressions. On the other, it is the ideal venue for self-consciousness to run amok. Even when a notebook is integral to a specific project, the writing always seems to be lacking something. Thoughts remain unfinished or provisional. You are moving but you aren't there yet. This can be frustrating. But then a notebook can also be where you can dig in your heels -- summoning up the confidence, or the vital reserves of energy, needed to continue.
Sometimes the notebook provides escape from the work in progress, rather than contributing to it. This is not necessarily a matter of procrastination.
The best essay on the notebook as workshop is probably “On Intellectual Craftsmanship” by C. Wright Mills. (See this column on it.) But an important supplement comes from Elias Canetti, who won the Nobel Prize for Literature in 1981. After spending decades on his idiosyncratic and sui generis work of scholarship Crowds and Power (1960), Canetti published a mordant essay on notebook-keeping called “Dialogue with the Cruel Partner.”
“One cannot avoid the fact,” he writes, “that a work being continued daily through the years may occasionally strike one as clumsy, hopeless, or belated. One loathes it, one feels besieged by it, it cuts off one’s breath. Suddenly, everything in the world seems more important, and one feels like a bungler... Every outside sound seems to come from a forbidden paradise; whereas every word one joins to the labor one has been continuing for so long, every such word, in its pliant adjustment, its servility, has the color of a banal and permitted hell.”
From such dark moods, the notebook offers a reprieve. When the writer “views himself as the slave of his goal, only one thing can help: he has to yield to the diversity of his faculties and promiscuously record whatever comes to his mind.... The same writer, normally keeping a strict discipline, briefly becomes the voluntary plaything of his chance ideas. He writes down things that he would never have expected in himself, that go against his background, his convictions, his modesty, his pride, and even his otherwise stubbornly defended truth.”
There is a third modality of the notebook habit – a matter of treating it neither as the warehouse and workshop for a project nor as an escape from its demands, but as something like its own form of writing, imposing its own peculiar demands.
Joan Didion’s essay “On Keeping a Notebook” is astute on how this third mode is a function of temperament: “The impulse to write things down is a peculiarly compulsive one, inexplicable to those who do not share it, useful only accidentally, only secondarily, in the way that any compulsion tries to justify itself.” The fragments jotted down are “bits of the mind's string too short to use, an indiscriminate and erratic assemblage with meaning only for its own maker.”
The resulting collages of stray data and random insights are a way to keep track of one’s earlier incarnations, the personalities adopted and left behind in the course of a lifetime. “I think we are well advised to keep on nodding terms with people we used to be,” Didion writes, “whether we find them attractive company or not.”
As it happens, Canetti made much the same point. “The mechanisms one uses to make life easy are far too well-developed,” he writes. “First a man says, somewhat timidly: ‘I really couldn’t help it.’ And then, in the twinkling of an eye, the matter is forgotten. To escape this unworthiness, one ought to write the thing down, and then, much later, perhaps years later, when self-complacence is dripping out of all of one’s pores, when one least expects it, one is suddenly, and to one’s horror, confronted with it. ‘I was once capable of that, I did that.’ ”
On this account, then, notebooks are, in effect, an annex of the superego. My own notebooks play that role at times. They document opinions or enthusiasms that sometimes prove embarrassing, after a few years have passed. But they are also full of injunctions – usually to work harder, or to finish some project now gathering dust in one of the more workshop-like volumes, or to start studying X in a systematic fashion (and here’s the syllabus...).
Recently the text of Didion’s essay was posted at an online venue called The New Inquiry, which is something of a cross between a group blog and a salon (it sponsors face-to-face meetings in New York between readers and contributors) and seems to be in transition towards becoming a magazine. Its three founders are recent graduates of Columbia University and Barnard College.
The site itself is a kind of collective notebook. It made me wonder how the proprietors understood notebook-keeping – and whether digital technology influenced how they practiced it. My own habits are irremediably old-fashioned. A netbook is not a notebook, to my mind anyway, and I still do a lot of writing with pen in hand, even while exhorting myself to be more productive and efficient (a performative contradiction, if ever there were one). But being stuck in one’s own habits does not preempt curiosity about those of other people, so I asked the New Inquirists how they saw “notebooking.”
While she prefers to read from paper, Jennifer Bernstein, a New York-based writer, finds that reflecting on what she reads is another matter: “I often create a Word document in which to jot down the best ideas and quotations from a book. Then I end up reading commentary on the book and articles related to its theme, excerpts from which I also paste into the document, usually with my own thoughts. The document becomes a kind of mini-scrapbook, the record of my exploration of a concept (for example, one I did recently was conservatism in the 20th century). This isn’t a perfect method. It’s led to a proliferation of strangely titled documents on my hard drive that at some point I should probably sort through and systematize. On the other hand, the chaos reflects how my mind really works.”
Rachel Rosenfelt, a cultural critic living in Brooklyn, told me: “I've never been a paper-notebook keeper in the sense Didion means it. Or in most senses, really. When I moved out of my last apartment I unearthed a pocket notebook that I had bought years earlier to track my expenses. On the first page was written: ‘notebook- $3.14.’ That was the only entry.”
Instead, she uses whatever book she is reading as a recording surface. The books end up “profaned,” as she puts it, “filled with unrelated scribblings in the front and back pages, marked up with underlines, stars and notes.... The notes I take within the texts and margins of books work like a diary for me in that sense, and often have a second life online in the form of the ideas I formulate and write about on TNI and elsewhere.” One consequence is that Rosenfelt can never part with a book when she is done with it. After all, you don’t sell a diary.
The attitude of Mary Borkowski, an arts programmer for the Columbia University radio station WKCR, sounds closest to my own. “I'm a bit eccentric in that I rarely write anything initially on the computer,” she told me. “I compose most essays, letters, short stories, poems, even emails, in long hand and then transcribe them onto the screen. I do realize that writing in longhand is, well, time-consuming, but there is something about writing in longhand that is always more surreptitious, more crafty, almost silent -- the least painful way to wrench a thought from my mind.”
The exact format of “notebooking” matters less, Borkowski says, than the impulse to find “a canvas for the mind” – a place for “the spurts of thoughts and memes, blurps from the brain stems that have no order yet.” The notebook is “the outline before the outline.”
I sensed that The New Inquiry serves as a place to record (the preferred term now is “curate”) things its participants had read, and to gloss them if the spirit so moves. Jennifer Bernstein confirmed this: “I usually just post several cultural artifacts that I see as closely related, without comment (see this, for example). This format allows me to maintain the loose, associative connection between them (and to suggest that connection to others). Websites can accommodate all kinds of media, including audio and video, which allows juxtapositions that weren’t instinctive or even possible before.”
Besides “collective notebook-keeping in the form of group blogs,” Bernstein noted the potential of formats such as Google Documents, “where people can edit the very same text, or Wave, which supports all forms of media. Basic software innovations like Word’s Track Changes and Google Wave have multiplied the forms that commentary can take.”
But part of what I value about The New Inquiry is that its participants always seem at least somewhat ambivalent about the technologies they have grown up with – and this comes through in Mary Borkowski’s comments.
“We create tools for living,” she told me, “and they became objects that totally dominate us or we dominate them. Notebooking is then one of the last personal stands against the individual mind being dominated by outside forces, or having to 'think inside the box,' if you will. It's a 'secret,' 'private' outlet that used to exist in ledger or diary form but now, especially when we're so inundated by the busy-ness of technology, notebooking is a state of mind expressed in the time we are separated from our palm devices, or laptops, or phones. The notebooking state of mind comes up when we can think minus the chatter, when ideas clarify. Notebooking facilitates the spontaneity of creativity, thoughts that could occur at any moment, or random time -- the unaccounted for in our over-accounted for, micromanaged, lifehacking world. Notebooking is the place to process your thinking in a world that seems to only value the end product.”
About 20 years ago, while I was working in the Manuscript Division of the Library of Congress, one of my fellow archival technicians was a recently graduated Yalie who had been employed at one point by the Beinecke Rare Book and Manuscript Library. Yale University is home to, among other things, the Ezra Pound papers. "After a while,” my friend said, “you started to notice something about the Ezra Pound scholars. They looked like Ezra Pound. Not all of them, of course, but a lot of them did. You could tell when there was a conference because all these guys who looked like Ezra Pound were in the reading room.”
This raised questions, of course, about influence and causality: Did you start imitating Ezra Pound after studying him for a while, or was it that guys who already looked a little bit like the poet were more likely to specialize in him? Did people in other specialties or fields of study do this? We did not have people in powdered wigs showing up at the LC asking to see the papers of the Founding Fathers. Did they maybe have powdered wigs in their backpacks but thought better of it when they saw the security guards?
And so the conversation progressed after work, after beers. I forget what conclusions we reached, but that may be for the best.
Some of it came to mind a couple of weeks ago while I was back at my own alma mater, the University of Texas at Austin, standing in the lobby of the Harry Ransom Center, where there is a small display of a few items from the recently acquired papers of David Foster Wallace. I had made inquiries about having a look at the collection. It is still being processed, and I was told that doing so would only be possible during a return trip this fall. I imagined coming back in November to a reading room full of David Foster Wallace scholars -- unshaven guys in bandannas, presumably. So much for stealing a march on them....
The glass case in the Ransom Center lobby contains a few pages of the typescript of his novel Infinite Jest, and the page proofs of a biography of Borges that he wrote about for The New York Times (full of the marginal and inside-the-cover notes a reviewer makes along the way), and also a poem about Vikings that he had written at the age of 7.
The display was a modest concession to public curiosity. While no amount of staring at it could spark in my brain any new insight into DFW's work, it had the virtue of being unsensationalistic. A writer who kills himself runs the risk -- and he must have known this -- of having his life and work turned into one long suicide note. That is both ghoulish and dumb, but perhaps understandable, given that the act of writing itself tends to be lacking in overt drama. It is easier to focus on the big exit than the steady application of backside to chair.
One small element of the display did have an emotional charge, at least for this viewer. Inside the cover of the Borges biography (which he ended up finding disappointing) Wallace recorded the word count and deadline his editor had given him when assigning the piece. There is absolutely nothing remarkable about either the note or its location; it is the kind of thing a reasonably efficient working writer jots down as a matter of course.
But there is a complex double-take involved in seeing Wallace in those terms: a genius, yes, but also, among other things, a reasonably efficient working writer, immersed in the everyday routines of that particular mode of being in the world.
Returning last week to the familiar clutter of my Inside Higher Ed cubicle -- a scene less of reasonable efficiency than entropic squalor -- I found that Broadway Books had sent a copy of David Lipsky’s new volume Although Of Course You End Up Becoming Yourself: A Road Trip With David Foster Wallace. It consists of the transcript of five days’ worth of conversations that Lipsky, a novelist and Rolling Stone contributing editor, had with Wallace in early 1996, when Infinite Jest had just appeared. There are a few pages of introductory material by Lipsky himself. They overlap a bit with the memorable article Lipsky published following Wallace’s death, but not that much, and anyone who has read the one should also check out the other.
Becoming Yourself is not that long a book (just over 300 pages, most of them well-ventilated with white space), but I found it a slow read, because something about it felt disquieting. Lipsky was accompanying Wallace on part of his book tour. Their discussions, recorded on tape, were meant to be raw material for a Rolling Stone profile that, for one reason or another, never quite came together. Although a sort of intimacy emerges, the exchange is marked by the strained dynamic of self-consciousness squared -- for each of them is alert to the Goffmanian undercurrents of every step of the encounter, the way that each element of self-disclosure (whether by interviewer or by subject) is at least potentially a form of manipulation.
That tension will not come as a surprise to any reader of Wallace. The ratcheting-up of self-awareness, particularly as provoked and channeled by the mass media, is the vital pulse of his writing, whether fiction or non-. He never left you with the sense that he was exempt from it in its most inexorable and on-autopilot forms; on the contrary. But with pen in hand, he could, if not exactly regulate the pace and intensity of hyperlucidly self-conscious frames of mind, then at least do something with them, creatively.
Not so here. At times Wallace finds himself at sea, treading water, going in circles. His comments, made between stints of promoting Infinite Jest, are riddled with a sense of complicity in something he understands as both necessary and dubious. (Commercially necessary; existentially dubious.)
He has, he says, “written a book about how seductive image is, and how very many ways there are to get seduced off any kind of meaningful path, because of the way the culture is now. But what if I become this grotesque parody of just what the book is about? And of course, this stuff drives me nuts.... So the next level of complication is, do I congratulate myself on my worry and concern about all this stuff, because it is a sign that I’ve not been seduced about it? And then of course, if I get happy about that, then I’ve lost the edge – I mean, there’s just no end to the little French curls of craziness you can go through about it.”
True, that. But these are remarks, and remarks are not literature. Over time it becomes obvious that Wallace is in perfect earnest about the fear of distraction from work – from making literature, that is, rather than being part of the culture industry, with its ambient sound (as he puts it) of “this enormous hiss of egos at various stages of inflation and deflation.” Wallace is most eloquent in the passages where, make no mistake about it, you can hear his desire to stop talking.
Best to end, then, with one of them:
“What writers have is a license and also the freedom to sit – to sit, clench their fists, and make themselves excruciatingly aware of the stuff that we’re mostly aware of only on a certain level. And that if the writer does his job right, what he basically does is remind the reader of how smart the reader is. Is to wake the reader up to stuff that the reader’s been aware of all the time. And it’s not a question of the writer having more capacity than the average person. It’s that the writer is willing I think to cut off, cut himself off from certain stuff, and develop ... and just, and think really hard. Which not everybody has the luxury to do. But I gotta tell you, I just think to look across the room and automatically assume that somebody else is less aware than me, or that somehow their interior life is less rich, and complicated, and acutely perceived than mine, makes me not as good a writer. Because that means I’m going to be performing for a faceless audience, instead of trying to have a conversation with a person.”
Usually the reader imitates the author -- hoping to absorb that state of grace or genius, or at least to share in its aura. Here the roles have shifted, the polarities reversed. This is why his death seems such a loss. Reading him, there was the sense that he understood the way we live now. He would tell us what we knew about it. Almost knew, but not yet.
What's in a name? That which we call a rose
By any other name would smell as sweet.
These lines from Romeo and Juliet are often quoted to indicate the triviality of naming. But anyone who has read or seen the play through to its end knows that the names Montague and Capulet indicate a complex web of family relationships and enmities that end up bringing about the tragic deaths of our protagonists.
Lore also has it that Shakespeare's lines were a coy slam against the Rose Theatre, a rival of his own Globe Theatre, and that with them he was poking fun at the stench caused by less-than-sanitary arrangements at the Rose.
I write now in response to the naming of a newly created department at my large state university called "the Department of Writing and Rhetoric." This new department is being split off from the English department and given the mandate to install a new Writing Across the Curriculum program, convert adjunct positions to "permanent" instructor positions, and establish a related B.A. degree.
While the acronym WAR may seem appropriate to some of my colleagues, many of them think we have more important things to worry about than a name right now. We have also been repeatedly told in the face of previous protests that referring to Composition as Writing is a trend nationwide. Nonetheless, I believe that this title is an indication of bad faith and a negative harbinger for the work of the new department and programs like it elsewhere.
Since the announcement of this change, I have attended a tenure party for a colleague in another department. Every single person I spoke with at this party assumed from the title of the new department that "all" writing would be taught there, including my field of Creative Writing. People repeatedly asked me what I thought about being in a new department, and I repeatedly corrected them as confusion spread over their faces. They couldn't understand how the Department of Writing and Rhetoric would not include the writing of fiction, poetry, and so on. I repeatedly had to say that “Writing” in this usage means Composition. They repeatedly asked me why, then, the department would be using the title Writing.
That's a very good question, and one that indicates something disturbing, not just here, but in that nationwide naming trend mentioned above and so often cited. Referring to programs in Composition by the title "Writing" indicates that this field is the authority over all meaningful types of writing – in all other fields. It also implies that no other type of writing but what Composition Studies teaches is valid or important – or even exists. Both of these claims are demonstrably false, although they are the silent assumptions that often underlie Composition's use of the term Writing to describe itself.
Perhaps even more disturbing is that using the name Department of Writing and Rhetoric indicates a willingness to write badly in order to empire-build. Good writing is always about clarity and insight, precision and accuracy. Therefore, this confusing name calls into question the very quality of the writing instruction that will be given in the new department. If the department cannot and will not name itself accurately, then what does that bode for the students to be educated there?
Don't get me wrong. I also differ from some of my colleagues in that I am happy about the creation of the new department. Composition is an upstart field that, like my own of Creative Writing, has often not gotten its due. Partly this is because it stems from a remedial function -- Composition became necessary when the sons and daughters of the working class began attending colleges and universities and were not adequately prepared in the finer points of belles lettres.
Naturally, because the background -- and the goals -- of these individuals differed from those of the upper classes that had established belles lettres, Composition began to explore and defend less artistic, more practical forms of writing. This evolution differs from that of such programs in mathematics, for instance, where remedial algebra still focuses on the same formulas as those used in advanced courses. In Composition Studies and Writing Across the Curriculum programs, there has been a focus on supplanting the literary scholarly essay as the gold standard of writing. In the past few decades, Composition as a field has worked hard to establish the legitimacy and importance of other forms of writing and their teaching. Much of this effort I admire.
I am also happy that Composition will be given resources long absent. Having taught Composition courses myself for several years, I understand the need for acknowledgment and support, even if the specifics of the plan at my university have not been widely shared or discussed and seem to me based on suspect methods. I wish the new department nothing but the best in its attempts to improve basic writing instruction for our students.
However, many in the field of Composition have also brought the resentment of old wounds and insults to bear by claiming that their field is foundational and expert in all types of writing. Advocates for the field have accomplished this by theorizing what they do and by selling it to those in other fields as the answer to literacy. They have also tried to change the field's name to something less associated with its remedial roots and more grandiose in its scope. Yet it remains the case that Composition Studies does not represent a universal approach to literacy, critical thinking, or writing.
In my own field of Creative Writing, for instance, we have far different assumptions about what constitutes effective writing instruction. Admittedly, we have somewhat different purposes. But let me also point out that the rise of Composition Studies over the past 30 or 40 years does not seem to have led to a populace that writes better.
In fact, it has coincided with a time when literacy rates have dropped and complaints about the poor writing skills of college and university graduates (especially of large public universities) have continued to rise. Obviously many complex social factors contribute to this. It is also debatable whether universities have contributed to this state of affairs because the changing methods of teaching Composition are misguided or because there simply haven't been enough resources. I'm all for giving Composition the resources it needs, respecting its right to self-determination in its field, and seeing what happens. I am all for the general population writing better, even if it is in an instrumental and limited form disconnected from the literary traditions that have fed most love of and respect for the written word in our culture.
Beyond the details of these various professional debates, my negative reaction to the new departmental name stems from the corruption of language that is so prevalent in our society today, where advertisers and politicians and many others lie through exaggeration, omission and indirection. The best analysis of this is perhaps Toni Morrison's 1993 Nobel Lecture in Literature. In it she talks about uses of language that are destructive, about language that obscures rather than clarifies, and how so often such language "tucks its fascist boots under crinolines of respectability and patriotism as it moves relentlessly toward the bottom line and the bottomed-out mind."
If we put the writerly education of our students into the hands of people who insist on rejecting the accurate term Composition for the grandiose and unclear one Writing, what will they learn? They will learn, I am afraid, that they can say whatever they want, even if it is sloppy, confusing, manipulative, or a knowing lie.
Misnaming this department also evokes the negative definition of the title's other half: Rhetoric. In academe we know that rhetoric can be "the study of effective use of language," but most of the world is more familiar with rhetoric defined as "the undue use of exaggeration and display; bombast." This latter definition seems apt when combined with Writing in this name.
I, for one, will never call it the Department of Writing and Rhetoric. I will call it what it actually is: the Department of Composition and Rhetoric. If its practitioners truly respected their own history, they would call it that, too. A "rose" sometimes can smell not so sweet, especially if it turns out not to be a flower at all.
Lisa Roney is associate professor of English and coordinator for the undergraduate Creative Writing program at the University of Central Florida.
Many of us committed to the liberal arts have been defensive for as long as we can remember.
We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”
We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: they create a well-rounded person, are good for democracy, and develop the life of the mind. All these are certainly true, but somehow each misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial outcome from the large investment that is now American higher education. That doesn’t make them anti-intellectual; it makes them realists. Outside of home ownership, a college degree might be the largest single purchase for many Americans.
There is a disconnect: parents and students worry about economic outcomes while too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment for us to feel defensive. It is time, however, for those of us in the liberal arts to understand this new environment and, rather than merely react to it, to engage it proactively. To many Americans the liberal arts are a luxury they feel they need to give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.
The liberal arts have long set graduates on the road to success. More Fortune 500 CEOs have had liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know the road to research science most often runs through a liberal arts experience. Now more than ever, as employment patterns seem to be changing, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.
We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.
Sitting where we do, it is easy to dismiss this number because we assume it comes from day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing sector of this shift is in the formerly white-collar world our students aspire to. This number has been steadily rising and is projected to keep climbing. We are living in a world where 9-to-5 jobs are declining, careers with one company over a lifetime are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is much more unstable and fluid than the one of a mere generation ago.
We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job, and even career to career during their lifetimes. What we are seeing now, however, is different: more and more Americans are hustling from gig to gig, too. These workers, many of them our former students, may never know economic security, but they may know success. For many new-economy workers, success is measured by more than just money; freedom, flexibility, and creativity count, too.
If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.
In 1956, William Whyte, the noted sociologist, wrote The Organization Man to name the developing shift in work for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep set of narrow vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.
Today, no such career ladder exists. And narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working developing before our eyes. Today, breadth, cultural knowledge and sensitivity, flexibility, the ability to continually learn, grow and reinvent, technical skills, as well as drive and passion, define the road to success. And liberal arts institutions should take note, because this is exactly what we do best.
For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.
It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But we can make a better case about how a liberal arts education leads to both a meaningful life and a successful career.
In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes -- our alumni -- and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.
This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that is situating the discussion more squarely within the liberal arts orbit, and in the process blurring disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science and ethnography, as a way to understand the shift to the new economy. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research because increasingly we will need to teach it as the best way to ensure our students understand the moment.
We also need to be less territorial, and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, whether by providing space, wifi, or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.
Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.
Richard A. Greenwald
Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.
When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)
One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, religion, and enter only those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.
So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?
The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- increased from 115,400 conferred in 1970-71 to 335,250 conferred in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while the numbers are small, there has been exponential growth in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving” degrees. Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and dropped again in the last five years.
What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor's degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicators Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.
Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since then. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but then recovered nicely in the years since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicators Project leaves off and show that in 2008 and 2009, the number of students who graduated with bachelor's degrees in English, foreign languages and literatures, history, and philosophy and religion remained at the same level.
What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.
Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.
Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced 17 graduating seniors this past spring, roughly twice the number of three decades ago. And the strength is not just in elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.
If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- others less so -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent.) The study of foreign languages, in other words, seems not to be disappearing; the mix is simply changing.
Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The conversation was eloquent, the mood upbeat and innovative; no doom and gloom, even though many participants were from institutions where painful budget cuts had recently been made.
A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education, rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.
It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most are doing quite well, increasing enrollments and becoming more selective. And what about state universities and land-grant institutions, where most students go to college? Are they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”
Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.
One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment draws in part from the vital role they play in providing students with a robust liberal education, that is, a kind of education that develops their knowledge in a range of disciplinary fields and, importantly, their cognitive skills and personal competencies. The liberal arts continue -- and likely always will -- to give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.
But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, a point with which, as the statistics and anecdotes show, students agree.
W. Robert Connor and Cheryl Ching
W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.
Reflecting on the recent Humanities and Technology Camp (THATCamp) in San Francisco, what strikes me most is that digital humanities events consistently tip toward the logic-structured digital side of things; they are less balanced out by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now: what is the missing contribution from the humanities?
I think this digital dominance revolves around two problems.
The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.
The showy and often very visual products built on such data, and the ease with which the information contained within them is intuitively understood, appear at first blush to be a triumph of quantitative thinking. The pretty, animated graphs or fluid screen forms belie the fact that boring spreadsheets and databases contain the details. Humanities scholars, moreover, often recoil from the presumably shallow grasp of a subject that data visualization invites.
For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.
Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.
Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.
Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.
In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands must be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which itself is a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a world-wide conversation on just what is the digital humanities. As the folks at WordPress say, code is poetry.
I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC; I was willing to patiently decipher code only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. To avid readers of literature, the characters of a story come alive, laying bare our own feelings or moral inclinations in the process.
Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities is the ability to invest a generation of "users" in these techniques and in the practiced, meticulous attention to detail required to become a scholar.
Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.