Ideas have seldom been the currency of American politics. (Most of the time, currency is the currency of American politics.) But this seems like a moment in history when new thinking is a matter of some urgency.
Over the past few days, I've been conducting an utterly unscientific survey of academics, editors, and public intellectuals to find out how -- if given a chance -- they might try to influence the incoming occupant of the White House. The question was posed by e-mail as follows:
"Imagine you are invited to a sit-down with the president-elect and given the chance to suggest some recommended reading between now and the inauguration.Since we're trying to keep this fantasy of empowerment at least slightly plausible, I'd ask you to limit yourself to one book. (He will be busy.) Something not yet available in English is fine; we will assume a crack team of translators is standing by. Journal articles, historical documents, and dissertations also acceptable.
"What would you propose? Why? Is there a special urgency to recommending it to the attention of the next Chief Executive at this very moment? Remember, this is a chance to shape the course of history. Use your awesome power wisely...."
I tried to cast a wide net for potential respondents -- wider than my own political sympathies, in any case. Not all who were invited chose to participate. But everyone who did respond is included here. The suggestions were far-ranging, and the president-elect would no doubt benefit from time spent reading any of the nominated titles. (To make tracking things down easier on his staff, I have added the occasional clarifying note in brackets.)
In reality, of course, it's a long shot that the new president will take any of this advice. But the exercise is serious, even so -- for it is a matter of opening a wider discussion of what books and ideas should be brought to bear on public life at this pivotal instant. An election is a political process; but so, sometimes, is thinking.
Eric Rauchway is a professor of history at the University of California at Davis and author of The Great Depression and the New Deal: A Very Short Introduction, recently published by Oxford University Press.
If they were asking me I'd suppose they were familiar with my own modest works, so I'd try to point out a perhaps neglected or forgotten classic.
Suppose it's John McCain, who has often expressed admiration for Theodore Roosevelt. I'd humbly suggest President-elect McCain revisit the chapters in George Mowry's classic Era of Theodore Roosevelt dealing with Roosevelt's first full term of office (1905-1909), when he worked hard with Congress to craft landmark legislation regulating business, affording protection to consumers, and providing for workers' compensation.
Suppose, conversely, it's Barack Obama, who would be the first northern Democrat elected since the party sloughed off the South in the Civil Rights era (i.e., since John Kennedy) and who would, like the greatest northern Democrat and perhaps the greatest president of all, Franklin Roosevelt, take office in a time of profound crisis. I would humbly remind him of Isaiah Berlin's classic essay on Roosevelt, in which he describes how much could be accomplished by a deft politician, sensitive even to minute ebbs and flows in political opinion, who while not lacking vision or integrity nevertheless understood -- as Berlin wrote -- "what to do and when to do it."
[The essay on Roosevelt can be found in the Berlin omnibus collection The Proper Study of Mankind: An Anthology of Essays, published ten years ago by Farrar, Straus and Giroux. Or here, while the link lasts.-SM]
Elvin Lim is an assistant professor of government at Wesleyan University and author of The Anti-Intellectual Presidency: The Decline of Presidential Rhetoric from George Washington to George W. Bush, published by Oxford University Press and discussed recently in this column.
The president-elect should read Preparing to be President: The Memos of Richard E. Neustadt (AEI Press, 2000), edited by Charles O. Jones. Richard Neustadt was a scholar-practitioner who advised Presidents Truman, Kennedy, Johnson, and Clinton, and was, until his death in 2003, the dean of presidential studies. Most of the memos in this volume were written for president-elect John Kennedy, when the country was, as it is now, ready for change.
At the end of every election, "everywhere there is a sense of a page turning ... and with it, irresistibly, there comes the sense, 'they' couldn't, wouldn't, didn't, but 'we' will," Neustadt wrote years ago, reminding presidents-elect that it is difficult but imperative that they put the brake on a campaign while also starting the engine of a new administration. Campaigning and governing are two different things.
Buoyed by their recent victory, first-term presidents have often over-reached and under-performed, quickly turning hope into despair. If there is one common thread to Neustadt's memos, it is the reminder that there is no time for hubris or celebration. The entire superstructure of the executive branch -- the political appointees who direct the permanent civil service -- is about to be lopped off, and the first and most critical task of the president-elect is to surround himself with competent men and women he can work with and learn from.
In less than three months, the president-elect will no longer have the luxury of merely making promises on the campaign trail. Now he must get to work.
Jenny Attiyah is host and producer of Thoughtcast, an interview program devoted to writers and academics, and available via podcast.
We don't have to agree with everything we read in this country. Reading is not unpatriotic. So may I suggest that the future commander-in-chief actually read the speeches by Osama bin Laden? At a minimum, he can read between the lines. As Sun Tzu said, "know thine enemy". But we know so little about bin Laden. We don't even know where he lives. Supposedly, he "hates our freedoms" – but he would argue that what he hates is the freedom we take with our power.
After these videos were released, it usually took some effort to dig out a transcription. In the end, I had to go to Al Jazeera for a translation. What I remember most clearly is grainy video of the guy, holding his index finger aloft, but with the volume silenced, so our talking TV heads could impart their wisdom in peace. Let's hope the next president is willing to turn off the mute button on our enemy. Ignorance is no longer an excuse.
[Verso Press made this much easier three years ago with the collection Messages to the World: The Statements of Osama Bin Laden, which provides as much OBL as anyone should have to read.-SM]
Daniel Drezner is a professor of international relations at Tufts University. He also blogs.
I'd probably advise the president to read the uber-source for international relations, Thucydides' History of the Peloponnesian War. Too many people only read portions like the Melian Dialogue, which leads to a badly distorted view of world politics (the dialogue represents the high-water mark of Athenian power -- it all goes downhill after that). The entire text demonstrates the complex and tragic features of international politics, the folly of populism, the occasional necessity of forceful action, the temptations and dangers of empire, and, most importantly, the ways in which external wars can transform domestic politics in unhealthy ways.
Chris Matthew Sciabarra is a visiting scholar at New York University and a founding co-editor of The Journal of Ayn Rand Studies.
Given my own views of the corporatist state-generated roots of the financial crisis, I'd probably recommend The Theory of Money and Credit by Ludwig von Mises, so that he could get a quick education on how the credit policies of a central bank set the boom-bust cycle into motion. Perhaps this might shake the new President into a truly new course for US political economy.
Irving Louis Horowitz is professor emeritus of sociology and political science at Rutgers University and editorial director of Transaction Publishers.
While I seriously and categorically doubt that any one book will shape the course of history, and feel even less touched by a sense of "awesome power," much less preternatural wisdom, I will recommend a book that the next president of the United States would, or better should, avail himself of: On Thermonuclear War by Herman Kahn. First released by Princeton University Press in the dark days of the Cold War in 1960, and reissued by Transaction Publishers in 2007, it is a painful reminder that peace in our time is heavily dependent on the technology of war in our time. The howls of dismissal that greeted this book at first blush have been replaced by a sober appreciation that the global threat to our Earth is very much a man-made product.
Kahn's book can serve as a guide to the stages of diplomatic failure and the consequent turn to military action at maximum levels. Kahn does not presume pure rationality as a deterrent to war, and in light of the nuclear devices in the hands of dangerous nation states such as Iran and North Korea, where notions of life and death may give way to Götterdämmerung and a preference for destruction and self-immolation, such presumptions of rational behavior may prove dangerous and even delusionary. The unenviable task of the next president will be to avoid taking the world to the proverbial brink -- and to make sure others do not dare take the fatal step of doing likewise. Oddly, for all of its dire scenarios, Kahn's classic is a curiously optimistic reading, rooted in realistic policy options. It deserves to be on the shelf of the next head of the American nation.
Dick Howard is a professor of history at the State University of New York at Stony Brook and editor of the Columbia University Press series Columbia Studies in Political Thought/Political History.
I'd have him read Polanyi's The Great Transformation. Why? It's short, clearly argued, and makes a simple but fundamental point: capitalism is not the natural way that people relate to one another (including in their "economic" relations). It is the result of several political decisions that create the framework within which it can emerge. The next president will have to recognize that he too will make political decisions with economic consequences (and should not deceive himself into thinking that his decisions are simply a reaction to economic "necessities").
To be noted as well: Polanyi, a former banker in Austria, was writing in the wake of the Great Depression, whose causes he was trying to understand. It was the inability of "economics" to understand what had happened to the world economy that led Polanyi to his pathbreaking and brilliant study.
A hubristic final note: I of course recommend this only because my own study of the history of political thought from the Greeks to the American and French revolutions, titled The Primacy of Politics, will not yet be on the market.
[ Primacy will be published by Columbia University Press in late '09.-SM]
James Marcus is the book-review editor for The Columbia Journalism Review and has translated several books from Italian.
It's not often that the POTUS asks me what to read next, and at first I thought I should rise to the occasion with something suitably canonical. I considered Democracy in America, The Federalist Papers, maybe even The Education of Henry Adams (although I'd allow the leader of the free world to skip the virgin-and-dynamo stuff at the end). Then I decided that it made more sense to submit a narrow-gauge production: a book that grappled with public issues through the prism of personal experience, not unlike Barack Obama's Dreams from My Father or John McCain's Faith of My Fathers. If, like the two titles I just mentioned, it included a dash of Oedipal ambivalence, so much the better.
What I came up with was Tobias Wolff's In Pharaoh's Army: Memories of the Lost War. As the next president ponders the best way to extract the United States from its Iraqi quagmire, a memoir of Vietnam seems like a useful reality check. The author, a self-confessed screw-up, spent part of his enlistment in the Mekong Delta, advising a Vietnamese artillery battalion. There are very few heroics in his book, and no argumentation about the wisdom of being there in the first place. What we do get is the endless confusion of fighting a popular insurgency. And an insistence that even the survivors of such a conflict are permanently marked: "It's the close call you have to keep escaping from, the unending doubt that you have a right to your own life. It's the corruption suffered by everyone who lives on, that henceforth they must wonder at the reason, and probe its justice."
Over the next four years, the president will almost certainly order U.S. troops into battle. In its modest, personal, anti-rhetorical manner, this book reminds us of the price to be paid.
Claire Potter is a professor of history and American studies at Wesleyan University, and is also known as Tenured Radical. She contributes to the history blog Cliopatria.
My contribution to President Obama's reading list is Nancy Cott's Public Vows: A History of Marriage and the Nation (2000). While the history of marriage has since been augmented considerably -- with important volumes on the history of interracial marriage, the demand for gay marriage, and the fraught relationship between Christianity and marriage -- all subsequent scholars have relied, more or less, on Cott's argument that marriage is first and foremost a contract with the state.
It's not primarily a contract with another person – although it is that; it is not a contract with your local community – regardless of their approval and disapproval; and it is in no way a contract with any religious hierarchy – although it can be critical to the terms of inclusion in a religious community.
Marriage, President Obama, is about citizenship. You, along with nearly everyone who hedges his bets on gay marriage, reiterate that the most important fact about marriage is that it is between "one man and one woman." But that's not true. In the United States, as Cott shows, marriage has been primarily about the qualifications of a man "to be a participating member of a state."
While over time political authorities in the United States have allowed marriage to "bear the impress of the Christian religion," if marriage is a public institution at all, its function is to mirror the political community and to be the arm of the state that functions to "shape the gender order." In other words, Mr. President, the history of marriage is a political history, not a religious one; and it is a history of inclusion or exclusion from political power.
George Scialabba is the author of What Are Intellectuals Good For?, a collection of essays forthcoming from Pressed Wafer in March 2009. He was profiled in this column two years ago.
Dear Citizen Obama (I'm afraid the overly deferential "Mr. President" encourages the aggrandizement of the Executive Branch):
More than thirty years ago, your predecessor Jimmy Carter described America's tax system as "a national disgrace." Since then, it's gotten much, much worse. It is now so complex and irrational that only two groups of Americans understand it: tax lawyers and readers of David Cay Johnston, Pulitzer Prize-winning New York Times reporter and author of Perfectly Legal: The Covert Campaign to Rig Our Tax System to Benefit the Super-Rich -- and Cheat Everybody Else. The abuses and evasions detailed in Perfectly Legal (and its companion volume, Free Lunch: How the Wealthiest Americans Enrich Themselves at Government Expense -- and Stick You with the Bill) may raise your blood pressure dramatically. You should read them, but only under a doctor's supervision.
Continued tax avoidance at current staggering levels by the wealthy is your mortal enemy. Unless the tax code is drastically reformed -- and effectively enforced -- there will simply not be enough money to accomplish your goals. It will take courage, persistence, and all your celebrated rhetorical skills to vanquish this dragon in your path. But unless you do, your hopes will be thwarted and your administration will be no more than a ripple on the surface of American history.
Since I have more than once in the past few months mourned the unkind timing of Norman Mailer's death this year -- What might the author of one of our finest war novels have made of the trials of Senator McCain on the campaign trail? How would the instigatory commentator on so much of our nation's cultural, political, and existential foment make sense of the long and disciplined loneliness of Senator Obama? And, last but by no means least, how would an imagination precocious and peculiar enough to have set a novel called Why Are We in Vietnam? in Alaska have illuminated the passage of Sarah Palin through the national psyche? -- I'd recommend to the new chief executive Mailer's piece on the 1960 Democratic convention, "Superman Comes to the Supermarket."
Coming out of the exhaustions of electoral combat, I might even give him a pass and ask him only to read the first paragraph -- forgive me, Norman -- if he promised to spend some time thinking about it:
"For once let us try to think about a political convention without losing ourselves in housing projects of fact and issue. Politics has its virtues, all too many of them -- it would not rank with baseball as a topic of conversation if it did not satisfy a great many things -- but one can suspect that its secret appeal is close to nicotine. Smoking cigarettes insulates one from one's life, one does not feel as much, often happily so, and politics quarantines one from history; most of the people who nourish themselves in the political life are in the game not to make history but to be diverted from the history that is being made."
Jodi Dean is a professor of political science at Hobart and William Smith Colleges and author of Democracy and Other Neoliberal Fantasies, forthcoming from Duke University Press.
I would recommend that President Obama read Our American King by David Lozell Martin. First, Obama is already familiar with Marxist, feminist, structuralist, and post-colonial theory from his days as a student at Harvard. So there is already some coverage here. Second, Obama has lots of advisors providing lots of advice on policy matters. Anything added here would end up just another item in the mix. Third, the new President faces so many enormous challenges that it is highly unlikely he'll have much time to devote to pondering a complex text, no matter how important. So I recommend a novel published last year, bedside reading that will provide the new President with food for thought. It captures, I think, the fears of many of us for the future of democracy in a time of extreme inequality, the sense that our country is leaning heavily on the wrong side of a precipice.
Our American King depicts what remains of the United States after a great economic calamity: the top .1 percent of Americans have appropriated all the wealth and goods for themselves and left the rest of the country to fend for itself. As the super-rich live in heavily defended enclaves, the suburbs and cities descend into violence, starvation, and death. Social order collapses. The President and Vice President who oversaw the calamity, who presided over the great transfer of wealth from the many to the few, are hung upside down and backwards on the White House gates. The central drama of the novel involves the man who comes to power next. He is set up as a king, a uniter, the great hope of the people. Through him, they begin to work together, to imagine again the possibility of collective responsibility. The new king's authority draws from the people's fear and desperate longing for hope, a fear and a longing that, as Martin makes clear, may not always lead to the best outcomes.
My hope is that President Obama will read this book and recognize that people's longing for a leader, the One, is powerful but precisely because of that power should be redirected toward common projects, toward faith in each other and belief in equality, toward a renewed conviction that the conditions of the least well off--not the best--tell us who we are.
As a playwright, I want the next president to read a play. Plays are perfect fodder for the chief executive-to-be: they are short, can be digested in one sitting, and offer the advantage of distilling larger currents of thought into character, dialogue and action. And such an opportunity should not be wasted on agit-prop (Bertolt Brecht, Clifford Odets) or classics that should already have been imbibed by the civilized soul. (So let’s shelve Henry V and Major Barbara for now.) The play should talk to the president about the human cost of tough times, the dignities and foibles of ordinary citizens, and the dire alternatives to forceful and human courses of action.
For such times, the German playwright Ödön von Horváth is just the ticket. Before his tragic death on the cusp of World War II, Horváth offered a window on the brutalities of economic collapse and the roots of fascism in desperation and human folly. But which Horváth to select? Tales from the Vienna Woods is Horváth’s masterpiece, but I’d worry that its deep subtleties and epic canvas of pre-war Austria would confound a reader pressed for time. So I’d opt instead for Horváth’s tiny jewel of human desolation: Faith, Hope and Charity.
In a mere 52 pages, the play follows Elisabeth, an ordinary young woman down on her luck, as she is hounded to death by close encounters with unfeeling bureaucracy and casual cruelty. It is a succinct and powerful play with a simple lesson: if our political institutions are not suffused with the moral values of the play’s title, they can be perverted into engines of personal annihilation. It is a message the new president should consider as sweeping changes in government and its powers are proposed and enacted.
Every so often a word becomes charged with a certain magic. One example, in decades past, was “alienation.” A more recent instance is “exile,” which for a while now has enjoyed a degree of glamor wholly distinct from the normally miserable experience of real-life exile itself. Such has been the case in literary and cultural studies, anyway. Somewhere along the way, it ceased to refer to the circumstance of being expelled from one’s homeland, or otherwise obliged to flee and unable to return. It took on vaguer connotations, and ever more charm.
Reading books from university presses or listening to papers at MLA, one learned that almost any kind of relocation, displacement, or out-of-sorts experience counted as exile. It seemed, more and more, to be a state of mind, if not an existential condition. In that regard “exile” came to resemble “alienation,” except that the earlier term had grown passé.
Given what exile means in the more pedestrian, agonizing sense -- the loss of one's home, citizenship, and wherewithal -- it was hard not to cringe at the metaphorical and metaphysical bloating of this once-meaningful term. The best you could do was bite your tongue.
Either that, or push the exaggeration into overdrive. The fascination with exile is a kind of self-melodramatizing by proxy. We’re all exiles now – maybe especially the adjuncts. I am considering a memoir of life “in exile” from my home town in Texas, a tiny gemeinschaft that only got its second stop-light a few years ago. (Clearly some conditions of exile are easier to bear than others.)
No freeze-dried profundities about the exilic condition are to be found in Ha Jin’s book The Writer as Migrant, recently published by the University of Chicago Press. This seems all the more remarkable given that the author himself – a novelist who is now a professor of creative writing at Boston University – has been in exile, in the most exactingly literal sense, for almost two decades, ever since the Tiananmen Square massacre.
In fact, Ha Jin uses the word “exile” itself sparingly, instead preferring a more expansive category.
“My choice of the word ‘migrant,’” he explains, “is meant to be as inclusive as possible – it encompasses all kinds of people who move, or are forced to move, from one country to another, such as exiles, emigrants, immigrants, and refugees.” The migrant has not simply left a homeland, whether willingly or by necessity. He or she has arrived in a new place and must make a life there, with or without ties to the old country. And that entails a possibility that the old ties will be replaced or transformed. Ha Jin’s emphasis is on the potentials (and the dangers) this creates for a specific kind of migrant: the writer.
Obviously his own experience as a Chinese émigré who has written all of his fiction in the adopted language of English forms the backdrop to his reflections. “When I began to write,” he says, “I longed for a return to China, and I saw my stay in the United States as a sojourn, so it felt almost natural for me to claim to be something of a spokesman for the unfortunate Chinese. Little did I know that such a claim could be so groundless.”
For one thing, the day of return did not arrive. And sooner or later, there was a challenge to his moral claim, whether from other people or from his own conscience: “You sell your country and your people abroad.”
Creative labor in a new language, too, entails a sense of betrayal, even as it also opens up the possibility of making a new way in the world. “I have been asked why I write in English,” says Ha Jin. “I often reply, ‘For survival.’ People tend to equate ‘survival’ with ‘livelihood’ and praise my modest, also shabby, motivation. In fact, physical survival is just one side of the picture, and there is also the other side, namely, to exist – to live a meaningful life.” As he later puts it, “your homeland is where you build your home.”
But this book, which began life as a series of lectures at Rice University, is not a memoir. Ha Jin presents his reflections chiefly by way of short comments on the work of migrant writers, some of them exiles (Nabokov, Solzhenitsyn, Kundera), others expatriates by choice (Conrad, Lin Yutang, Naipaul). “My observations,” Ha Jin avers, “are merely that – my observations. Every individual has his peculiar circumstances and every writer has his own way of surviving and practicing his art. Yet I hope my work here can shed some light on the existence of the writer as migrant.”
Ha Jin’s approach is atheoretical and ahistorical, and so devoid of any tendency to ponderousness that the lack is quite conspicuous. It is a modest book; almost aggressively modest. But it leaves the non-migrant American reader (one whose travels have always been voluntary and round-trip) better prepared for the future, as more of our literature is written in English as a second language.
In Talking Out of School: Memoir of an Educated Woman (Dalkey Archive Press), Kass Fleisher reviews her education and her career in college teaching -- without holding back criticism of herself or academe. Sexual politics, class politics and academic politics all figure prominently. The following excerpt is from a section in which she recalls her years as an adjunct. Spoiler alert/warning: There is explicit language throughout the book -- including a few choice words in the excerpt that follows.
1998. I’m adjuncting, and have precisely one friend on the faculty, the guy who got me the gig. We have lunch together every week and we’ve had precisely one argument in the entire span of all our lunches, about a relationship he had with a student once. He tells it as a personal horror story. She’d been bright, talented, precocious -- and ultimately unstable. She filed harassment charges against him, he spent good money to hire a lawyer, he was forced to detail to his department chair some embarrassingly intimate details...
... and the chair let it go. When the university affirmative action officer agreed that the relationship had been consensual -- that the relationship made sense given consent -- the charges were dropped.
He comes to my office one day, disturbed. One of his older-women graduate students has written an angry letter, distributed to the chair, the vice-president and the president -- by way of demanding her tuition money back -- complaining that he swears too much in class shitpissfuckcuntcocksuckermotherfuckertits. The chair of the department does not inform him that he and the big boys have received said letter. The chair sits on the letter for a week or two and then, without conferring with the instructor, conducting a hearing, or even remembering (apparently) that grievance procedures have been established and printed in the faculty handbook, he writes my friend a formal letter of reprimand, stating that he’ll be subject to “disciplinary action” if ever another such complaint arises.
“She may, with some justification,” the chair writes, “formally bring a charge of harassment against you.
“Copies of this and the student’s letter will be placed in your personnel file.”
To sabotage your tenure review next year, the letter does not say.
Unlike the chair (apparently), my friend and I consult the faculty handbook and find that this letter indeed violates multiple personnel procedures...
... and further, that the only “disciplinary action” listed is termination.
Fuck, man. You mean you can lose your job for saying “fuck”? You call that fucking “harassment”?
A month later, when the instructor’s student evaluations come back from the students who remained in the tech writing course after the complaining student left -- 40 percent of whom are women -- he will get a solid 5.0 on a 5-point scale—unanimous enthusiasm.
The chair will never comment on this....
1998. “I will no longer tolerate,” the chair writes in his letter to my friend, “what can only be described as your insensitive, vulgar, and obscene language in the classroom.”
The colleague’s intent in a graduate-level, academic tech writing class (i.e., not a vocational training workshop) is not just to teach students how to type memos, but rather to challenge students to consider how they know what they know as tech writers. This can be achieved while they expand their knowledge of their field, which exists right in the oily hinge, right in the fishy craw of the intersection of higher education and the corporation. Given the mess such a collision must be, he and I agree, some form of institutional critique is vital, and this sort of three-dimensional, reflexive analysis can, over time, only make students better tech writers. To know your context is to know your work.
Like many of his grad students, the complainant is his age, and already works as a tech writer. For much more than his salary.
From the first class meeting, she’s been unwilling to question herself in this manner. She’s uninterested in engaging his “message.” She pronounces the first assignment “a waste of time.” She simply wants to be told what she needs to “know” in order to cough up a master’s degree and presto! get a still higher salary.
“Withdraw me from this class, and do not charge my account.”
My “vulgar, obscene” colleague has been working with a search committee all fall. The chair calls him the week of Thanksgiving break and tells him that he’s being removed from the committee.
When my friend asks why, the chair explains that it’s political. A colleague with opposing pedagogical values has demanded to be included -- equal time -- on the committee.
The work’s almost done.
“He’s making this demand three weeks before we interview candidates at the MLA convention?” my friend says.
The chair nods.
“He just up and got pissed off at this late date?”
The chair has no real answer for this.
“At this late date?”
“It’s about fairness,” the chair says. “It’s about making sure both sides are represented.”
The only added perk for taking on all the added work of reading 300 application files is that you get reimbursed for the trip to the hiring conference, the Modern Language Association convention that meets annually between Christmas and New Year’s. So aside from losing all the work he’s put into this search so far (no credit = no merit raise, teeny as that would be), the instructor will have to pay his own way to MLA, where he naturally has two interviews himself.
“Is this your way of punishing me for the problem with the student who doesn’t like fucking cursewords?” the instructor asks.
“Certainly not,” the chair says.
Students will blame the discomfort of a learning transition on anything they can find. My friend’s experience illustrates clearly that in academe, it’s OK for instructors to fuck students...
... you just can’t say “fuck.”
Kass Fleisher is an assistant professor of English and creative writing at Illinois State University.
In the early 1960s, Susan Sontag lived in that liminal condition known as ABD. She was in her late 20s. While working on her fiction and criticism, she held brief appointments in philosophy at Sarah Lawrence College and at City College in New York, followed by a few years teaching in the religion department at Columbia University. Something of her attitude towards academic life perhaps comes through in the novel Sontag was writing at the time, The Benefactor (1963). Its narrator says he made his reputation with a scholarly article presenting “important ideas on a topic of no great importance.”
This sounds less like a Wildean epigram than something muttered under the breath about a visiting lecturer. In Sontag’s case, the weariness ran much deeper; it was colored with both distaste and uncertainty about the texture and direction of her own life. So one gathers from Reborn: Journals and Notebooks, 1947-1963, the selection of personal writings edited by her son David Rieff and published, like all of Sontag's books, by Farrar Straus and Giroux.
In a third-person account of her life written in 1958 (shortly before her 25th birthday) Sontag complains about “the tense careerism of the academic world, the talkativeness of it. She felt sick of talk, of books, of intellectual industry, of the inhibited gait of the professor.” This passage stands out by contrast with the rest of her journal – filled with lists of books to buy, words to learn, ideas to analyze. The final lines of Reborn refer to “the intellectual ecstasy I’ve had access to since early childhood... Intellectual ‘wanting’ like sexual wanting.”
But passion is not always good for you. Another entry mentions her realization that reading could be an addiction: “I was like an alcoholic who nevertheless experiences a bad hangover after each binge. After an hour or two browsing in a bookstore, I felt numb, restless, depressed. But I didn’t know why. And I couldn’t keep away from the stuff.” She would go on benders, reading in a “greedy way” until she passed out – keeping “several books beside the bed at night, in order to fall asleep.”
Her cinephilia was dipsomaniacal as well: it takes Sontag four pages to list all the films she saw over two weeks in the spring of 1961. The desire to become a writer was there, inside her. But it had to fight to get out, and that struggle involved overcoming the temptations all around her: academic ambition, the distraction of bookstores, the pleasure of sitting in front of the screen and its secondhand dreams.
In one of her earliest essays -- a discussion of the then newly published diaries of the Italian novelist Cesare Pavese, published in 1962 -- Sontag describes the “peculiarly modern literary genre” embodied in the writer’s notebook or journal. “Here we read the writer in the first person,” she says; “we encounter the ego behind the masks of ego in an author’s works. No degree of intimacy in a novel can supply this, even when the author writes in the first person or uses a third person which transparently points to himself.”
She returned to the topic the following year in another essay, this time on Albert Camus. “The notebooks of a writer have a very special function: in them he builds up, piece by piece, the identity of a writer [for] himself. Typically, writers’ notebooks are crammed with statements about the will: the will to write, the will to love, the will to renounce love, the will to go on living. The journal is where a writer is heroic to himself.... Solitariness is the indispensable metaphor of the modern writer’s consciousness, not only to self-declared emotional misfits like Pavese, but even to as sociable and socially conscientious a man as Camus.”
These passages apply to Sontag’s own notebooks at least as much as they do those of the authors she is ostensibly discussing. They are a workshop in which she fashions a sense of identity. Passages about love and will abound.
At the age of 17, she met and married Philip Rieff, a professor 10 years her senior – this being a matter of will, one quickly surmises, for her erotic preference was for other women. The seven years of her life as faculty spouse roll through the pages of Sontag’s journal like a gray fog, clammy with resentment. A fellowship takes her to Oxford during the fall of 1957 – away from Rieff and their son. By the start of the next year, Sontag is in Paris, where she crosses paths with an old girlfriend; the renewal of their affair has a catalytic effect. “The thought of going back to my old life,” she writes, “it hardly seems like a dilemma any more.... I’m already on the other side from which it’s impossible to return.”
This is not quite the tale of self-creation through self-acceptance it may warm the liberal heart to imagine. The women Sontag then falls in love with prove to be neurotic and abusive (physically so, in one case) and she remains in the closet to some of her closest academic colleagues. Absorbing the “all-out assault on my personality” conducted by one of her lovers, she comes to identify with the aggressor. The degree of critical self-consciousness grows. Analysis of her experience begins to display an edge. The complaints she once directed at her spouse now become reflexive. "I have grown complacent in the years with Philip,” she writes. “I grew accustomed to his flabby adulation, I ceased to be tough with myself, and accepted my defects as loveable since they were loved.... Perhaps it was necessary, this turning inward and deadening of my sensibility, my acuteness. Otherwise I should not have survived. To remain sane, I became a little stolid. Now I must begin to risk my sanity, to re-open my nerves.”
No coincidence, then, that Hippolyte, the narrator of her first novel, would complain about professors who “raised problems only in order to solve them, and brought their lectures to a conclusion with maddening punctuality.” Instead, he goes to excesses that leave him half insane. For all her interest in the avant garde, Sontag never pursued any course quite so dramatic as that. Perhaps the stolidity won out in the end. She ended up, in her later years, much closer to Philip Rieff’s perspective on culture than either of them might have admitted.
“To write,” notes Sontag in 1961, “you have to allow yourself to be the person you don’t want to be (of all the people you are).” Self-creation, like self-consciousness itself, leads to all sorts of paradoxes. Reborn gives readers a glimpse of this process as it unfolded at the start of Sontag’s career. Two forthcoming volumes will carry the story to the close of her life. (David Rieff’s recent book on her final days, Swimming in a Sea of Death: A Son’s Memoir, offers a sort of preface to this body of posthumous writings by his mother.)
Having read this selection from her journals three times over the past couple of weeks, let me close on a note of impatient enthusiasm. And also, come to think of it, a passage from that essay on Camus, published when Sontag was 30:
"Great writers are either husbands or lovers. Some writers supply the solid virtues of a husband: reliability, intelligibility, generosity, decency. There are other writers in whom one prizes the gifts of a lover, gifts of temperament rather than of moral goodness. Notoriously, women tolerate qualities in a lover—moodiness, selfishness, unreliability, brutality—that they would never countenance in a husband, in return for excitement, an infusion of intense feeling. In the same way, readers put up with unintelligibility, obsessiveness, painful truths, lies, bad grammar — if, in compensation, the writer allows them to savor rare emotions and dangerous sensations. And, as in life, so in art both are necessary, husbands and lovers. It's a great pity when one is forced to choose between them."
This seems more complex, and less amusing, than it did on last reading.
"Say what you will about the tenets of National Socialism, dude, at least it’s an ethos." -- The Big Lebowski
After almost five years teaching writing, English, ESL, and humanities courses to high school students and undergraduates, I have come to the conclusion that it is a serious mistake to ground undergraduate instruction in writing in the basics of Aristotelian rhetoric. I believe doing so is increasingly common, as it becomes increasingly normal for universities to reframe composition jobs as positions in “rhetoric and composition.”
This is a discussion somewhat rooted in the practicalities of teaching first-year undergraduates to write, but it has much broader implications. It is part of a larger conversation about what, exactly, the humanities are supposed to mean at a historical moment when college-level reading and writing skills are quite valuable, yet also when the political and economic conditions put “anti-ideological” pressure on institutions of higher learning. In other words, universities increasingly see themselves as preparing students to write fluently on any topic, from any perspective.
This is not the “end” of ideological instruction, naturally, since its final consequence is to encourage students to write for the highest bidder, making every young writer into a copy writer. But it is worth examining how rhetorically themed instruction in writing -- especially in ethos, pathos, and logos -- arose as a natural way of resolving political conflicts between Western institutions, and to consider the consequences of this paradigm shift for our students. My objection is not merely political; it is also pedagogical, since "rhetoric and composition" forecloses many other valuable ways of teaching reading and writing.
How Critical Thinking Evolved Into Rhetoric
From the middle of the last century until fairly recently, the idea that the purpose of undergraduate education is to foster “critical thinking” held a virtual monopoly in both academic and popular circles. This goal has been institutionalized around the globe, wherever students are tested on "critical reasoning" skills.
It is an answer I myself have given on many occasions, and it holds up well for an old chestnut. But it is a difficult code to enforce in a humanities classroom, and a concept best suited to the inspection of evidence. Education researcher Lion Gardiner described critical reasoning as "the capacity to evaluate skillfully and fairly the quality of evidence and detect error, hypocrisy, manipulation, dissembling, and bias." Unfortunately, presented with something like a Max Ernst painting or a Martin Luther King speech, students will be hard-pressed to find error, hypocrisy, or bias. Critical reasoning will not help them to “unpack” the text, as we say in the humanities, though it may help when they are called upon to construct a rigorous argument.
Equally important, critical reasoning is pushed to its limits by contemporary culture and politics. Perhaps the greatest exemplar and champion of critical reasoning was Theodor Adorno, who was driven by his own feeling of integrity to extreme positions of dissent and hysterical rejections of popular culture. What are we to tell students about critical reasoning when the president and his cabinet simply lie about Iraq in order to drum up popular support for a war? If you watch one hour of television programming, you see about 20 minutes of advertising, all of which is likely contaminated with “error, hypocrisy, manipulation, dissembling, and bias.” While Westerners have invented all sorts of defenses against this assault on reason, they are leaky dams at best; most of us simply cannot keep track of every sort of irrational appeal we are simultaneously trying to ignore, or ridicule, or protest against, or embrace in the name of glamor or kitsch.
Teaching a class too much in this mode produces an unhappily smug series of field trips through “our stupid popular culture,” “our stupid political landscape,” and so on, along with the depressing feeling that nobody, the instructor included, will follow through in practice on the overwhelmingly negative evaluations of culture that the “critical thinking” method produces.
Rhetoric solved many of these problems by giving critical thinking a positive, broadly applicable core; rather than merely giving students a way to filter out misinformation, we were empowering them to persuade audiences. All of a sudden, a speech by Martin Luther King that had been almost unreadable (was King giving us good evidence about the lives of African-Americans or not?) became full of content, now that we were seeing it through Aristotle’s Big Three: ethos, pathos, and logos.
Furthermore, the rhetorical approach seemed to resolve the increasingly tense problem of what students ought to be reading or otherwise studying. There were visual and auditory rhetorics earning the attention of scholars in every field; in fact, anything that had an audience apparently had a rhetoric, so you could finally teach pop culture alongside canonical literature without drearily insisting that pop culture was lies, damn lies, and false advertising. You could seamlessly blend new media into traditional writing curricula, which was good since students had less stomach for reading, less training in it, and more of an appetite for mixed media or short pieces. Overall, the rhetorical approach tended to produce surprisingly positive evaluations of, well, just about everything, because rhetoric became a pleasure in and of itself: the film Thank You For Smoking is a product of the New Age of Rhetoric, where even a cigarette ad can be the object of much grudging classroom admiration. If an audience liked it or was influenced by it, you were hard-pressed to say, as a detached rhetorician, that the audience was wrong.
The Politics of Teaching Rhetoric
In addition to substituting something more agreeable for the relentlessly negative core of the “critical thinking” curriculum, rhetoric solved an urgent political problem: how institutions of higher education were supposed to weather the Bush years without being relentlessly punished for “extreme” political leanings. After 9/11, when David Horowitz’s star was on the rise, the Congress was majority Republican, governorships were going Republican all over the country, and Dubya had consolidated his popular base, there was a feeling among academics that blindly going forward with some version of leftist theory was simply irresponsible. Doing so created easy targets for Horowitz and his kind, and excluded professors from thrilling conversations about how the Internet could foster a better, more sustainable, more user-driven global market and global culture.
Many academics, abandoning the radical politics of theory, began to talk and write as though they were trying out for a new edition of The Best and the Brightest — as though they were the cabinet advisors to some non-existent moderate Democratic administration, presumably run by Martin Sheen. I remember being dumbfounded when a famous interpreter of the Frankfurt School (a group of philosophers that included Adorno), coming to give a talk at UC Irvine, chucked all that critical nonsense about dialectics to discuss how Bush could have done better at international diplomacy. This was also the period, you may remember, when the American right pushed the hardest for “balanced” course readers and syllabi. It was the second coming of the Intelligent Design movement. All across the country, TAs and adjuncts murmured to each other about how to teach critical thinking without “silencing” conservative perspectives.
Of course, looking back, the post-Clinton years seem like some kind of bad dream, an epiphenomenon that has now been brought to an end by Obama’s election. That may be true at the highest levels of American government, but institutional changes within the academy do not reverse themselves so quickly, particularly when a whole generation of graduate students is trained under a certain politically ambivalent model. Rhetoric, which was already prominent for the reasons I mentioned earlier, easily adapted itself to this environment by simultaneously avowing its neutrality (let’s analyze a speech by George W. Bush!) and promising a sort of sideways “rhetorical critique” that would lead students to the truth. In theory, you could show students that Bush’s speeches used all kinds of logical fallacies in order to divide the world along axes of good and evil, or that his rhetoric was inconsistent in its appeals and therefore untrustworthy.
In reality, however, teachers tended to fall back on dogma whenever they tried to perform a rhetorical critique of politically successful discourse. For example, if you wanted to prove that George Bush presented an overly polarized picture of nations and human beings, you had to invoke your own personal theory that out there, in the real world that transcends discourse, things weren’t so “black and white.” Or, in a different example, you might have to just announce that most scientists believe in evolution or global warming, thus giving your students the “right answer” independent of audience or Aristotle’s categories of appeals.
Students will, of course, dutifully reproduce this kind of information in the essays they submit, but the frame created by the focus on rhetoric makes such information look like bias. Hanging over every discussion is the idea that all perspectives contain bias, or the equivalent idea that everyone has a valid belief. This relativism is inherent to rhetoric itself, when it is isolated as a field of study. It is something that Aristotle narrowly avoided by simply announcing that his essentially technical discourses on rhetoric were subordinate to truth, and that only truthful orators could use rhetoric rightfully. His important corollary has been lost in the contemporary revival of ethos, pathos, and logos. If everyone is right, or everyone is biased, then alliances, not truths, are the highest values.
It may seem strange to talk about evolution or global warming or geopolitics, period, in this context. After all, our subject is writing courses, which are taught mostly by people with apolitical degrees (English, history, philosophy, etc.). In high school there is a much sharper delineation between English or language arts, which covers literature, expository writing, and creative writing, and other classes that cover recent history or introductory political science. Well, it is strange. The centrist politicizing of the writing classroom is not especially helpful to students, who are neither challenged politically nor pushed as hard as they could be as writers. The political focus is simply the result of the growing power of composition as a discipline, a discipline that blindly attempts to separate writing from literature, and that justifies itself intellectually by citing the supposed political value of rhetorical analysis.
Teaching Them What They Already Know: Composition and Literature
Most people have, within certain familiar realms, a very sophisticated, intuitive understanding of rhetorical strategy. Teenagers know how to shift from one vocabulary to another, depending on audience, and sound completely different in their essays than they do in casual conversation or on IM programs. They have different ways of speaking to parents and friends, and they work hard on crafting online and offline personas that others will find appealing. One of the gratifying things about teaching rhetoric is that students “get it” right away, because it relates to certain fundamental social skills. Thus, when a class works together on a rhetorical analysis, students often manage to rapidly produce useful observations. This is especially true when they are dealing with something comfortable, like a scene from a movie.
Less discussed, though, is the fact that students “get” rhetoric (and we find it easy to teach) because it follows an intersubjective logic similar to that of capital. Rhetoric goes hand-in-hand with advertising, the dominant language of contemporary desire. Students find themselves growing up in a world where demographics — audiences — are created out of thin air by advertising in its various forms, and where mass production aligns itself to the desires of a consumer audience. Furthermore, rhetorical analysis is dissociative: Anyone who has tried to teach ethos, pathos, and logos as operations to be performed on a text knows how students arbitrarily divide the text up into “emotional” sections and “argumentative” sections, even though such divisions are rarely defensible.
This is not the students’ fault, as we send them gunning for whatever holism a text possesses. The lysis of the text feels oddly familiar, though, because contemporary culture is similarly dissociative. Logic is the calculated process of competition and oppression, emotion is the catharsis of sentimentality, and personality is likability; to put the matter crudely, ethos, pathos, and logos correspond to the capitalist triptych of the advertiser (the “front man”), the consumer, and the accountant.
Holism is not always wanted. There are times when ad hoc writing is the most logical response to a particular situation, and there is also a place for the modest ambitions of, say, a humor column. Nonetheless, I believe that teachers of writing ought to see it as their particular mission to teach holism, particularly as it manifests in the peculiar written technologies of literature and longer creative nonfiction. In short, our mission is to teach English, not composition or writing, regardless of what our students choose for their major.
Literature tends to be de-emphasized in composition courses because it is hard to abstract arguments from it, impossible to put your finger on the “speaker’s ethos,” and tough to separate the emotional resonances from the ideas. Even earlier works of non-fiction are less invested in ethos: I taught both Joseph Mitchell and Chuck Klosterman this year, and found that Klosterman but not Mitchell can be easily analyzed for ethos. Klosterman is a 21st century writer, eager to tell you about what he bought at the Gap or how he seduced a woman in Michigan. Mitchell, on the other hand, writes “I caught up with Joe Gould…”, and then writes about Gould, not himself. Over the course of a whole book like Women In Love, we certainly get a sense of something like the breadth of D. H. Lawrence’s personality, but always indirectly, mediated as it is by plot, character, setting, and all the conventions of fiction.
The same problem recurs with studies of literature’s audiences; especially in 2008, trying to discuss the “audience” of Jane Austen is frequently unhelpful. The people Austen was ostensibly writing “for” did not include Edward Said, but by now Said is an important part of any discussion about Austen. There are texts that are heavily determined by (and determining of) audience, and others that are not. There are historical claims to be made about literature’s audiences, but these claims never exhaust the work itself.
There is a great deal of general anxiety among teachers that students will not read big books, particularly big books that aren’t anthologies. This premonition is very often correct; over the course of my life, I have been assigned a lot of big books that I didn’t finish. Nonetheless, by setting the bar high, we get more from students than we otherwise would. The big problem occurs when the alternative, having students write about short opinion pieces and pop culture, gets so entrenched that instruction in writing becomes completely generalized, indistinguishable from the incidental flow of words that fills up the day. It is true that other artistic forms are just as holistic as literature, but unfortunately they do not simultaneously teach writing. That is why writing curricula must emphasize longer texts, and why universities must take a more enlightened view of how undergraduate instruction in English will translate into real-world skills.
At first glance, it seems useless to have engineers or business majors practicing creative writing or analyzing literary form and content. Yet this training is exactly what will make them imaginative, subtle, and compassionate writers. Without such practice, they will be competent, but not compelling. A mixed approach, focusing on literature, serious creative non-fiction, and criticism, with rhetoric as a useful but limited subcategory, will give students the horizon they need to excel as writers, regardless of what kind of writing they eventually do. The field of rhetoric ought to remain a discipline in its own right, instead of becoming simply another word for using language, and as a discipline it is not broad enough to cover all the moments of aesthetic discovery and delight that initiate students into the writer’s world.
That kind of mixed curriculum in today’s academic environment requires immense dedication on the part of students, and it means leaving enough room in student schedules so that they can puzzle over long and unfamiliar texts. Out of discussions of character and circumstance, real conversations about situational ethics and diverse viewpoints can take place, on a far more sophisticated level than discussions of rhetorical efficacy that boil down to relativism. Society can be judged complexly; it does not need to automatically be scolded in the name of “critical reasoning,” or praised because it runs on rhetoric. Out of the intricacies of narration, criticism, and poetics, a conversation about style can take place that allows students to discover authorial voice and to take a writerly approach to individuality that goes infinitely beyond Bush’s “cowboy” schtick. Finally, the classroom can be a place where a felt response to imaginary circumstances prepares students for a world in which they will frequently have to make ethical decisions whose implications go far beyond anything they can directly see or experience.
Such courses seldom reflect what undergraduates “already do” every day, and success will be a struggle for them. It is probably not what they already know, but I fully believe it is what they hope to learn.
Joseph Kugelmass is a graduate student at the University of California, Irvine. During the summer, he teaches ESL and SAT Prep at Phillips Academy, in Andover, Mass. He is a co-editor and contributor for The Valve, and also blogs at The Kugelmass Episodes.
Just before heading to San Francisco for the annual convention of the Modern Language Association, I had a brilliant idea, or so it seemed. Between scholarly panels and face-to-face meetings, I would blog here at Inside Higher Ed. Instead of scribbling notes on a pad and then synthesizing out some kind of continuous text after the fact, this would mean recording the MLA in all its paratactic glory -- perhaps including links to YouTube videos of people saying interesting things in casual discussion after the panels. Your roving reporter would pause every so often to type up something on the laptop, or shoot it with the digital camera, and fire the resulting document out into the world via the wireless ether.
This was indeed, in its conception, a beautiful plan. Except that my laptop (which by now probably counts as vintage) is heavy, and the hotel wireless proved another sort of pain. The rooms where the panels were held tended to be badly lit, so that the video clips were all of a murkiness. Besides which, there were never any fireworks. If the days of Theory are over, so are the days of post-Theory, and "the rediscovery of aesthetics." I have attended six of the past seven MLA conventions. This was the first time when it really felt like a trade show in Detroit -- and not back in the day when next year's auto designs were a big deal, either. More like one right about now. Non-deflation counts as progress.
Be that as it may, I filed occasional causeries along the way, available here. And the blog will continue in the months ahead as an annex or supplement to Intellectual Affairs. In the four years that IA has been running, any number of books, papers, debates, etc. have fallen through the cracks. For whatever reason, I found it impossible to develop a full-length column around them. Given how few nonspecialist journals devote space to university-press books (apart from a handful of crossover titles per season, usually from the same four or five publishers), it might be helpful to offer quick or timely references to work that might otherwise be missed.
One possibility is that the new venue might include the occasional striking passage from my reading, in the manner of a commonplace book.
As blogged early in MLA, the organization has given its most recent lifetime achievement award for scholarship to René Girard. His newest book, Mimesis and Theory: Essays on Literature and Criticism, 1953-2005, is from Stanford University Press, which naturally had it prominently displayed at their booth. A couple of paragraphs in "Theory and Its Terrors" (first published in 1989) jumped out as worth quoting here.
"If you consider our numbers in the abstract, you might think we are about the right size for a harmonious and productive intellectual life. How many of us are there in the humanities? How many members does the Modern Language Association have? There must be at least twenty thousand active people. [Twenty years later, it is thirty thousand, according to the MLA website.-SM] We complain about the indifference of the outside world. The public pays no attention to us; it is not interested in criticism; yet our numbers correspond, more or less, to the actual audiences of Shakespeare or Racine at the time they were writing. Our sector of the academic world is as large as the entire cultivated public of Elizabethan England or the France of Louis XIV.
"And yet our cultural world is a far cry from Elizabethan England or la cour et la ville in seventeenth-century France. There is a reason for this, so simple and yet so obvious that no one ever mentions it. At the time of Elizabeth and Louis, one percent, perhaps, of the educated people were producers, and ninety-nine percent were consumers. With us, the proportion is curiously reversed. We are supposed to live in a world of consumerism, but in the university there are only producers. We are under a strict obligation to write, and therefore we hardly have the time to read one another's work. It is very nice, when you give a lecture, to encounter someone who is not publishing, because perhaps that person has not only enough curiosity but enough time to read your books."
It’s often said that one of the great failings of American higher education is that teaching fails to get the respect it deserves. It seems to me, however, that, especially in the humanities, the current academic generation is significantly more dedicated to teaching than most of us were when I started out in this profession in the early sixties. The real problem, as I see it, is that the way we think about teaching needs to change.
At a time when amazing new forms of connectivity are made possible by new digital technologies and when much of the best recent work in the humanities has made us more aware of the social and collective nature of intellectual work, we still think of teaching in ways that are narrowly private and individualistic, as something we do in isolated classrooms with little or no knowledge of what our colleagues are doing in the next classroom or the next building and little chance for each other’s courses to become reference points in our own. Indeed, we betray our assumption that teaching is by nature a solo act in our unreflecting use of “the classroom” as a synecdoche or shorthand for all teaching and learning, as if “the way we teach now” were reducible to “the way I teach now.”
The isolated, privatized classroom is itself a product of a more affluent era for American universities, a luxury made possible by the generous economic support they enjoyed during the first two-thirds of the 20th century. In this heady economic climate, a university could grow by expanding its playing field, proliferating new courses, fields, subfields, and scholarly perspectives while giving each enough separate space to ward off unproductive turf wars. To make a long story short, we became terrific at adding exciting new theories, fields, texts, cultures, and courses to the mix, but we’ve been challenged, to say the least, when it comes to connecting what we’ve added. Interdisciplinary programs have helped make some connections, but ultimately they have reproduced fragmentation rather than lessened it, since interdisciplinary programs tend to be disconnected from each other as well as from the disciplines. And now that we don’t have the financial luxury to keep adding on -- as is seen in our alarming overdependence on underpaid and overworked adjuncts -- we need to get a lot better at putting the components into dialogue, which means getting on the same page in our teaching in ways we lack practice at and may find uncomfortable.
Exhorting you to “get on the same page” may sound strange coming from me, since, if you know nothing else about me, you probably know that I’ve been arguing for years that we should “teach the conflicts,” putting our controversies at the center of our courses and programs, and I’ve often complained that we hide our disagreements from our students or reveal them only in fleeting glimpses. But I want to argue that, as much as we do conceal our disagreements from students, we also conceal our agreements from them as well as from ourselves. And teaching in non-communicating black boxes helps prevent us from discovering and taking advantage of the fact that in fundamental ways, as I will argue in a moment, we already are on the same page.
I believe that our experience of teaching in hermetically sealed classrooms makes us -- to coin a word -- “courseocentric.” Courseocentrism -- like its ethno-, ego-, and Euro- counterparts -- is a kind of tunnel vision in which our little part of the world becomes the whole. We get so used to the restricted confines of our own courses that we become oblivious to the fact -- or simply uninterested in it -- that students are enrolled in other courses whose teachers at any moment may be undercutting our most cherished beliefs. As my retired colleague Larry Poston recently observed, there is something remarkable about the “almost entire lack of interest we manifest as a profession in what is going on in our colleagues' classes.”
To get on the same page, of course, we would need to know something about each other’s teaching and the ways of thinking behind it, and such knowledge might lead to embarrassing disagreements. So perhaps the less we know about each other the better. It’s not surprising, then, that instead of asking us to try to get on the same page in our teaching, universities assume that each of us will figure out how to teach our subjects on our own.
This assumption is understandable, since many of us became academics in the first place because we liked figuring things out on our own and were good at it. I myself certainly appreciate my classroom freedom, and am not about to ask that I be made to submit a lesson plan to my department head, a curriculum committee, or a district supervisor, as many high school teachers must do. I think I understand why untenured and adjunct faculty members may feel that the classroom is a relatively safe zone that would be threatened if their colleagues knew more about their teaching. I know that on my own really bad days as a teacher I’m relieved that the train wreck has been witnessed only by my students and not my senior colleagues and deans.
Still, I can’t help wondering if our professionalism and our prestige would be fatally compromised if we had to coordinate our teaching the way high school faculties often do. I also suspect that we overestimate the safety our classroom privacy confers, and that more transparency and collaboration in our teaching would not only help students make better sense of us, but would ultimately be as safe for the most vulnerable among us as a curriculum that lets us hide out from each other.
The learning community model is one obvious way to go, but an excellent first step would be to pair first-year composition and general education courses, as many colleges and universities now do. A step beyond that would be to pair some science and humanities courses and courses in ancient and modern periods. If we don’t make such pairings, students will lose sight of the contrasts and continuities that define the sciences and humanities and differentiate the ancient from the modern. We’re also missing an opportunity every time a big period course isn’t co-taught by colleagues from several different disciplines. Then, too, the more we are part of a team, the less easily replaceable we become -- a fact that could provide more job security to adjunct faculty members.
The trouble with leaving it up to each of us to figure things out on our own is that it really means leaving it up to our students to figure us out on their own. The assumption is that if we all teach our courses conscientiously, each of us making sure our demands are as clear and transparent as possible, our students will make coherent sense of our diverse perspectives and will eventually be socialized into our intellectual community. The problem is that, no matter how transparent each course is, as long as we know little about our colleagues’ courses our students are likely to come away with confusingly mixed messages that will be hard to make sense of without more help than we are providing. As the educational thinker Joseph Tussman once put it, all the courses in a program may be admirably coherent, “but a collection of coherent courses may be simply an incoherent collection.”
It would take too much space to list all the confusingly mixed messages students get from an average set of humanities courses in an average academic day. College students have already coped with such mixed messages on making the transition from high school, when what had been called “Language Arts” mysteriously evaporated and morphed into foreign languages and “English” -- a word that is itself far from helpful or self-explanatory. Once in college, a student can go from one teacher who passionately believes that interpretations of literary texts are correct or incorrect -- or at least more correct or incorrect than other interpretations -- to another teacher who smiles or rolls his or her eyes at the naivete of such a belief; or from one teacher who expects undergraduates to analyze literature by using a rigorous methodology and terminology to another who thinks it sufficient if they learn to appreciate books in whatever way is comfortable to them; or from one teacher who discourages students from summarizing, telling them, “I’ve already read the text -- I want to know what you think,” to another who says, “I don’t care what you think, I want to see how carefully you’ve read the text.” No wonder students often come up and ask, “Do you want my ideas in this paper or just a summary of the reading?” And I do not even mention the discrepancies between the humanities and science and business.
Our classrooms allow us on the faculty to tune each other out, but our students don’t have that luxury. They consequently develop their own protective forms of courseocentrism, adapting to a compartmentalized curriculum by mentally compartmentalizing us. I’m thinking of the familiar student practice of “psyching out” successive teachers and giving each of us whatever we seem to want even if it flatly contradicts what the last teacher wanted. Students thus learn to become relativists at ten a.m. and objectivists after lunch. We often complain about the cynicism of this shape-shifting act, but arguably it is precisely the behavior our curricular mixed messages encourage. Since the disjunctions between courses prevent them from forming an intelligible collectivity, students end up concluding that the only way they can figure us out is one at a time. This virtually means starting over from scratch in every new course.
Some defend this mixed-message curriculum as a healthy cognitive workout regimen, an antidote to dogmatic certainty, or even as the perfect training for dealing with the ambiguities, instabilities, and unpredictable changes of life in the 21st century. And the high-achieving minority of students do flourish under this curriculum, since they are able to synthesize its disparate views or summarize the places where they conflict, creating on their own the connected conversation that the curriculum obscures and thus entering it as insiders. These high achievers detect the places where their diverse courses converge and therefore experience the redundancy and reinforcement our minds need, according to information scientists, to make sense of the world. But for the struggling majority, the discontinuities from one course to the next tend to erase this redundancy and reinforcement, leading them to come away with a greatly exaggerated picture of the differences between faculty members, disciplines, and fields while missing the common practices underneath. When taking courses becomes a process of serially giving your teachers whatever they seem to want -- assuming you can figure out what that is to begin with -- jumping through hoops replaces deep socialization into the intellectual community. In other words, the disconnect between courses and teachers ultimately reproduces itself in a disconnect between students and academic culture itself.
Courseocentrism thus goes far toward explaining the apathy and disengagement that educational researchers have found in reports like the National Survey of Student Engagement. It also helps explain the finding of less well publicized studies that students who learn a subject well enough to get a good grade in a course often fail when they are asked to apply what they learned to a context outside the course. In one study discussed by Howard Gardner in his book The Unschooled Mind, elementary school students who did well on tests that required them to know that the earth is spherical and revolves around the sun reverted to their earlier flat earth beliefs when tested after the course. Their learning was apparently so tied to the course in which they’d learned it that once the course was over they quickly forgot it and regressed to their pre-educated understandings. As my correspondent Jim Salvucci put it, “What you learn in a course stays in the course.”
Again, however, underlying the great diversity and difference in the substantive content of today’s academic intellectual culture lies an important area of common ground with respect to its fundamental practices, though this common ground is hidden both from students and teachers by the disconnection of courses. Whether we follow Lacan or Leavis, we would not have gotten very far in the university unless we had mastered the fundamentals of reading, analysis, and argument, of summarizing others and using them to define our own ideas, that comprise what we now call “critical thinking skills.” It is this implicit agreement on core practices -- as distinct from the content of our ideas -- that explains why colleagues who otherwise have little in common tend to agree overwhelmingly on who the good students are. But our separation from each other in the curriculum prevents us from discovering the existence of these practices and thus alerting students to their existence.
This failure to recognize our common ground has marked the culture war debates that have embroiled us since the mid-1980s. As I have argued elsewhere, we became so caught up in the conflicts over which books should be taught and how that we lost sight of the fact that for most American students -- again with the exception of the high-achieving few -- the great stumbling block has always been the culture of books and book discussion as such, regardless of which side gets to draw up the reading list. And today we are still so caught up in the battles between traditional and trendy versions of intellectual culture that we lose sight of the fact that to most students it is the nebulosity of intellectual culture itself that is the problem, whether the form this culture comes in is traditional or trendy.
I have elsewhere described coming up against this problem myself in a course in which I had juxtaposed assigned readings by the arch-traditionalist Allan Bloom and the radical African American feminist bell hooks. To any academic insider, Bloom and hooks are so far apart ideologically as to be on different planets, but I realized that for some of my students they were virtually indistinguishable, both using an obscure academic language to discuss problems the students had a hard time seeing as problems. In a succinct formulation of the point that Michael Bérubé offered me after hearing a talk in which I struggled to articulate it, any two eggheads, no matter how far apart ideologically, will always be far closer to each other than to non-eggheads. Again, the reason is that eggheads -- intellectuals -- whether they are on the Left or the Right, are defined and differentiated from outsiders by their membership in a common culture of ideas and arguments, a common culture that our curricular mixed messages hide from our students and our non-communicating courses hide from us.
I’m often told that I’m naïve in thinking that academics will ever willingly consent to coordinate their courses across their partisan divisions, much less argue with each other in the ways such coordination might require. I am told that, whether rightly or wrongly, arguing out our differences just isn’t the way the academic world works. Yet it’s striking to me that we argue out our differences all the time when we review each other’s books and articles and engage each other in our publications and professional conferences.
In fact, I’m always shocked by the contrast between the academic conference scene, with its intense and lively -- if often acrimonious -- debates, and our isolation from each other when we go back home. It’s not uncommon at a conference for me to run into a colleague from my own department whom I’ve passed in the hallways for years and discover that we have common interests we never suspected. I wonder why we had to travel hundreds of miles to have a conversation about the professional issues we care most about, but it’s apparently the fact that we care about them that makes them too risky for home consumption. It’s as if academic conference culture itself came into existence to satisfy a desire for intellectual community that wasn’t met by campus culture -- a fact that might guide us in changing campus culture. When I reflect that I’ve probably learned more about how to be an academic at conferences than I ever did in graduate school, I’m all the more convinced that there has to be a better way to organize intellectual life for educational purposes than dicing it into non-communicating courses.
I mentioned earlier that such courses are at odds with the new forms of connectivity enabled by our new electronic technology. They are also at odds with the most sophisticated and original work in the humanities during the last generation, which has taught us that what seem to be free-standing identities—whether they be texts or selves -- are produced by collective structures of discourse and representation. It seems we have deconstructed the autonomous, self-authorizing subject and the autonomous, self-authorizing literary work. It’s time we got around to deconstructing the autonomous, self-authorizing course.
Gerald Graff is professor of English and education at the University of Illinois at Chicago. He is the immediate past president of the Modern Language Association and this essay is adapted from the presidential address he gave in December at the association's annual meeting.
My assignment didn’t cause consternation, but it presented challenges. I told them to write a poem on their favorite fruit, bring it to next week’s class, read (perform) it, and then share the fruit with the class. By way of preparation, we studied a badly copied (by me) art-book reproduction of a painting by Zurbaran and lingered at length over Pablo Neruda’s blurb on the back cover of Julio Cortazar’s book, Hopscotch, where (I paraphrase), Neruda claims that having never read Cortazar is like never having tasted a peach; a man like that, who never tasted a peach, would become sadder and sadder until one day he’d die of sadness. I dwelt briefly on that ever-bothersome noun “man,” as being generic for Reader, but faintly scented in this case by Neruda’s well-known erotic appetite that set up a suspect dynamic between Man-Reader, Untasted Peach, and Terminal Sadness. The women snickered.
We discussed fruit in poetry throughout the ages, beginning with the plum flowers and, eventually, plums of the Japanese haiku poets, through the modernists, particularly W.C. Williams, whose refrigerated and missing plums are on the lips of every poetry student in America. We then went around the room to see what was everyone’s favorite fruit. Grapes came up first, plums in a close second, melons third, apples fourth, and oranges sixth. Peaches came in a distant tenth, a fact due possibly to our location in southern Louisiana, where peaches don’t grow. I ascertained also that cumquats, fresh figs, persimmons, guava, and star-fruit were unknown to students in Poetry Writing 4000, an intermediate poetry class.
An “intermediate poetry class” is the product of a decades-long elaboration of an absurdity that, once ensconced within the English Department and the Humanities, could only be dislodged by a major thought earthquake, equal in potency to the Dada revolution. No such earthquake-revolution has occurred in the teaching of the humanities since Dada itself became the predominant pedagogy of our “higher education” system in the post-modern Sixties.
Some scholars would trace the introduction of Dada teaching in the humanities to the beginning of American education, with its menu of “electives.” “Electives” are Dada by nature, a quality that did not escape Ezra Pound, who credited “electives” for the opening in his poetry to other languages, quotation, parody, ironies, essaying, verse free to dance on the page and out of prosodic strictures, and the introduction of elements hitherto alien to poetry, such as economic opinions.
Still, it was not until the mid-Sixties that the “teaching” of the creative arts became institutionalized. Not coincidentally, the Dada method became “natural” to working artists first, then to poets. After the Dada presence in New York facilitated abstract-expressionism and the poetry of Frank O’Hara, American artists and poets no longer felt provincial when they compared themselves to the Europeans. By the second generation of New York artists and poets, the Dada roots of the new art were starting to be forgotten, to make room by the third, fourth, and fifth generation to the “natural” sense of art-making and “poetry-writing” that then could, through such “normality,” become pedagogy.
Even the description of such an evolution can seem “normal,” if it were not for the stubbornness of Dada itself, a movement born during World War I out of disgust with all Western “civilized” institutions, including universities, especially the humanities, which the Dadas saw as particularly pernicious. The Dada generation of 1915-1927, led by the brilliant and insufficiently understood poets Tristan Tzara and Hugo Ball, and artists Marcel Duchamp, Marcel Janco, Jean Arp, and George Grosz, called for an artist-led revolution in society, a revolution conducted by means of chance, randomness (“electives”), denial of previous esthetic pieties, and the shakeup of traditional institutions, including private property and the family.
The Dada arsenal was vast: laughter, joy, absurdity, unpredictability; in other words, an entirely different sense of existence than that of Unamuno’s “tragic” sense, or the Russian Constructivists’ and Italian Futurists’ aggressive mechanical utopias. Dada made use of everything for the sole purpose of undoing the ideas of everything. It’s not hard to see the dubious, if not downright dangerous, consequences of the Dada method, especially in universities, where rebellion, hormones, and questioning are the very things the institution is charged with controlling.
The students were seated at the seminar table with a fruit or a bowl of fruit in front of them when I walked in the following week. I sat at the head of the table. On my left was Amy, with a large cluster of white grapes in a blue bowl before her; at my right was Melanie, facing a Cassaba melon with several circles of words magic-markered around it; Matt faced a grapefruit; Martin an apple. The 12 students in “intermediate poetry” stood before their inscribed fruits like figures in a tableau-vivant, waiting for the signal to begin the performance. Amy distributed one grape to each of us and asked us to write a word on it. I wrote “peach” on mine; Melanie wrote “love,” and others wrote whatever they wrote, and then some of them ate it, and some of them threw their grape back at Amy who ate them all; six students ate their own word written on Amy’s grape, and Amy ate six words others had written on her grapes, including “peach” and “love.”
Melanie stood holding her Cassaba melon like a globe or Yorick’s skull in her left hand and read it slowly, rotating it to see all the lines; she then passed the Cassaba around and everyone read a line; amazingly, there were exactly 13 circular lines on the melon; she then cut it open with a sharp folding knife of illegal dimensions (on an airplane, certainly) and passed slices that everyone ate like communion, there being present also an eerie, nearly sacerdotal silence. And so it went, fruit after fruit, read, performed, eaten, in an order that could not have been more perfect if Noah’s monitors had been there. We thus learned that: a) poetry can be edible (and perhaps it should be); b) fruit is a sexier medium than paper or pixels; c) school could be fun, d) “intermediate” could mean that even though the medium had not been quite reached (advanced), the closeness to experience itself (beginning), made it worthwhile, e) it’s not so easy to write on fruit without good magic markers, and f) T.S. Eliot need not be memorized.
Was Dada domesticated by this pedagogical demonstration? The Dadaists were prolific generators of forms: assemblage, collage, decoupage, simultaneous readings, collaboration (cadavre-exquis), noise making, tattooing innocents, placing people on bookshelves and books in spectators’ seats, wearing hats made from bird cages. Their fertility gave birth to the styles, looks, attitudes, and objects of the 20th century, but the best results were not the objects, but the process of making them.
The current thinking in the humanities is that creativity and artistic production are good things, so good, in fact, that their subversive qualities could be overlooked. After all, Dada, like other modern movements, has been studied to death; nothing alive could survive such exegesis. I am willing to bet, however, that 10 years hence, my fruit-writing students, now in advertising and new media, will look back on their school years and remember nothing except the Dada moment in their “intermediate” poetry class. Is Dada pedagogy useful in today’s classroom? There isn’t any other worth the absurd price of “higher education.” More Dada!
Submitted by Anna Leahy on January 29, 2009 - 4:00am
I know what you’re thinking: Why is a poet writing about assessment in higher education? Honestly, I wonder that myself. One day, when assessment came up in conversation, I commented that it could be useful to programs as they make curricular decisions. Within 48 hours, the dean placed me on the institution’s assessment committee. Suddenly, assessment is a hot topic and, of all people, I have some expertise.
My years on that committee convinced me that we must pay attention to the rise of assessment because it is required for accreditation, because demands have increased significantly, and because it might be useful in our professional lives. Accrediting bodies are rightly trying to stave off the No Child Left Behind accountability that the Spellings Commission proposes. Maybe the incoming secretary of education will consider how we might be better -- not more -- accountable. Perhaps, too, Wall Street should be held accountable before the Ivory Tower. But assessment for higher education will likely become more pressing in a weak economy.
One tool to which many institutions have turned is the National Survey of Student Engagement (NSSE, pronounced Nessie). NSSE was piloted in 1999 in approximately 70 institutions, and more institutions participate each year. This survey appeals especially to college and university presidents and trustees, perhaps because it’s one-stop, fixed-price assessment shopping. NSSE presents itself as an outside -- seemingly objective -- tool to glean inside information. Even more appealing, it provides feedback on a wide array of institutional issues, from course assignments to interpersonal relationships, in one well-organized document. Additionally, the report places an institution in a context, so that a college can compare itself both with its previous performance and with other colleges generally or those that share characteristics. And it doesn’t require extra work from faculty. NSSE seems a great answer.
Yet, NSSE does not directly measure student learning; the survey tracks students’ perceptions or satisfaction, not performance. Moreover, respondents appraise their perceptions very quickly. In the 2007 NSSE, students were informed, “Filling out the questionnaire takes about 15 minutes” to complete 28 pages, some of which included seven items to rate. So, as with its Scottish homonym, NSSE presents a snapshot of indicators, not the beast itself.
Importantly, NSSE is voluntary. A college or university can participate annually, intermittently, or never. If a college performs poorly, why would that college continue? If a university uses the report to, as they say in assessment lingo, close the loop, wouldn’t that university stagger participation to measure long-term improvements? Over its 10-year existence, more than 1,200 schools have participated in NSSE, and participation has increased every year, but only 774 schools were involved in 2008, which suggests intermittent use. In addition, some institutions use the paper version, while others use the Web version; each mode involves a different sample size based on total institutional enrollment. NSSE determines sample size and randomly selects respondents from the population file of first-years and seniors that an institution submits.
Perhaps, all these factors lead NSSE to make the following statement on its Web site: "Most year-to-year changes in benchmark scores are likely attributable to subtle changes in the characteristics of an institution’s respondents or are simply random fluctuations and should not be used to judge the effectiveness of the institution. The assessment of whether or not benchmark scores are increasing is best done over several years. If specific efforts were taken on a campus in a given year to increase student-faculty interaction, for example, then changes in a benchmark score can be an assessment of the effectiveness of those efforts."
This statement seems to claim that an increase in a score from one year to the next is random unless the institution was intentionally striving to improve, in which case, kudos. Yet, NSSE encourages parents to “interpret the results of the survey as standards for comparing how effectively colleges are contributing to learning” in five benchmark areas, including how academically challenging the institution is.
I have larger concerns, however, about assessment tools like NSSE, which are used for sociological research on human subjects. The humanities and arts are asked to use a methodology in which we have not been trained and for which our disciplines might not be an appropriate fit. NSSE is just one example of current practices that employ outcomes-based sociological research, rubric-dominated methodology, and other approaches unfamiliar in many disciplines.
Such assessment announces: anyone can do it. I’ve seen drafts of outcomes and rubrics, and that’s not true. Programs like education and psychology develop well-honed, measurable outcomes and rubrics that break those outcomes down into discernible criteria. Programs in the sciences do a less effective job; some science faculty assert that the endeavor is invalid without a control group, while admitting that a control group that denies students the environment in which they most likely learn would be unethical.
Those of us in the arts and the humanities want wide, lofty outcomes; we resist listing criteria because we disagree, often slightly or semantically, about what’s most important; we fear omission; and we want contingencies in our rubrics to account for unexpected — individual, creative, original — possibilities. Writing and visual art cannot easily be teased apart and measured. Critical thinking and creative thinking are habits of mind. How can NSSE or rubrics capture such characteristics?
Moreover, by practicing social science, often without reading a single text about those methods, arts and humanities faculty diminish the discipline we poach as well as lessen the value and integrity of our conclusions. If we don’t know what we’re doing — how many of us really understand the difference between direct and indirect measures or between outcomes, objectives, goals, and competencies — the results are questionable. To pretend otherwise is to thumb our noses at our social science colleagues.
Further, this one-size-fits-all, cookie-cutter mentality ignores that different disciplines have different priorities. Included in Thomas A. Angelo and K. Patricia Cross’s Classroom Assessment Techniques is a table of top-priority teaching goals by discipline. Priorities for English are Writing skills, Think for oneself, and Analytic skills, in that order. Arts, Humanities, and English have just one goal in common: Think for oneself. We can survey student perceptions of their thinking — an indirect measure — or maybe we know independent thinking when we see it, but how do we determine thinking for oneself in a data set? These priorities aren’t even grammatically parallel, which may not matter to social scientists, but it matters to this poet!
Other priorities for Arts — Aesthetic appreciation and Creativity — and Humanities — Value of subject and Openness to ideas — are difficult, if not impossible, to measure directly. The priorities of Business and Sciences are more easily measured: Apply principles, Terms and facts, Problem solving, and Concepts and theories. So, a key issue is to determine whether the arts and humanities can develop ways to assess characteristics that aren’t really measurable by current assessment methodology or whether we must relinquish the desire to assess important characteristics, instead focusing on easily measured outcomes.
Another table in Classroom Assessment Techniques lists perceived teaching roles. Humanities, English, and Social Sciences see Higher-order thinking skills as our most essential role, whereas Business and Medicine view Jobs/careers as most essential, Science and Math rank Facts and principles most highly, and Arts see Student development as primary. Both knowledge of Facts and principles and job placement can be directly measured more easily than Student development. For English, all other roles pale in comparison to Higher-order thinking skills, which 47 percent of respondents rated most essential; the next most important teaching role is Student development at 19 percent. No other discipline is close to this wide a gap between its first- and second-ranked roles. Surely, that’s what we should assess. If each discipline has different values and also differently weighted values, do we not deserve a variety of assessment methodologies?
Lest I seem to bash assessment altogether, let me say that I do advocate documenting what we do in the arts and humanities. Knowing what and how our students are learning can help us make wise curricular and pedagogical decisions. So, let’s see what we might glean from NSSE.
Here are items from the first page of the 2007 NSSE:
Asked questions in class or contributed to class discussions
Made a class presentation
Prepared two or more drafts of a paper or assignment before turning it in
Worked on a paper or project that required integrating ideas or information from various sources
Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments
Students were asked to rate these and other items as Very often, Often, Sometimes, or Never, based on experience at that institution during the current year. These intellectual tasks are common in humanities courses.
In another section, students were questioned about the number of books they had been assigned and the number they had read that weren’t assigned. They also reported how many 20+-page papers they’d written, as well as how many of 5-19 pages and how many of fewer than five pages. We can quibble about these lengths, but, as an English professor, I agree with NSSE that putting their ideas into writing engages students and that longer papers allow for research that integrates texts, synthesizes ideas, and encourages application of concepts. And reading books is good, too.
Another relevant NSSE question is “To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas?” Included in the areas rated are the following:
Acquiring a broad general education
Writing clearly and effectively
Speaking clearly and effectively
Thinking critically and analytically
Working effectively with others
The English curriculum contributes to these areas, and we are often blamed for perceived shortcomings here. While NSSE measures perceptions, not learning, this list offers a simple overview of some established values for higher education. If we are at a loss for learning outcomes or struggle to be clear and concise, we have existing expectations from NSSE that we could adapt as outcomes.
In fact, we can reap rewards both in assessment and in our classrooms when students become more aware of their learning. To do this, we need some common language — perhaps phrases like writing clearly and effectively or integrating ideas or information from various sources — to talk about our courses and assignments. Professional organizations, such as the Modern Language Association in English or the College Art Association in the visual arts, could take the lead. Indeed, this article is adapted from a paper delivered at an MLA convention session on assessment, and the Education Committee of CAA has a session entitled “Pedagogy Not Politics: Faculty-Driven Assessment Strategies and Tools” at their 2009 conference.
We needn’t reorganize our classes through meta-teaching. Using some student-learning lingo, however, helps students connect their efforts across texts, assignments, and courses. Increasingly, my students reveal, for instance, that they use the writerly reading they develop in my creative writing courses to improve their critical writing in other courses. I have not much altered my assignments, but I now talk about assignments, including the reflective essay in their portfolios, so that students understand the skills they hone through practice and what they’ve accomplished. Perhaps, I’m teaching to the test — to NSSE — because I attempt to shift student perceptions as well as the work they produce. But awareness makes for ambitious, engaged, thoughtful writers and readers.
Good teachers appraise their courses, adapt to new situations and information, and strive to improve. As Ken Bain points out in What the Best College Teachers Do, “a teacher should think about teaching (in a single session or an entire course) as a serious intellectual act, a kind of scholarship, a creation.” We are committed to teaching and learning, to developing appropriate programs and courses, and to expectations for student achievement that the Western Association of Schools and Colleges asks of us. We can’t reasonably fight the North Central Association of Colleges and Schools mandate: “The organization provides evidence of student learning and teaching effectiveness that demonstrates it is fulfilling its educational mission.” Assessment is about providing evidence of what we do and its effects on our students. Our task in the arts and humanities is to determine what concepts like evidence, effects, and student learning mean for us. If NSSE helps us achieve that at the individual, program, or institutional level, great. But NSSE is best used, not as an answer, but as one way to frame our questions.
In a lecture, the novelist Robertson Davies once gave a wry characterization of the life of a full-time writer. It is, he said, every bit as gratifying as non-writers usually imagine it to be -- “except for the occasional complete collapse of the will to go on.”
This is quoted from memory (my effort to relocate the passage has not gone well) so the wording may be inexact. But the turn of phrase certainly rhymes with experience -- in particular, the tension of emphasis in “occasional complete collapse,” with its mix of casual surprise and total devastation. I feel less certain that Davies used the expression “the will to go on.” It sounds a bit melodramatic. But then, he was a satirist, and he might well have been making fun of the impulse to indulge in overacted displays of artistic temperament. (Making fun of this need not preclude indulging it.)
Anyone who spends much time trying to put the right words in the right order will accumulate a private anthology of passages like this one: quotations that map the high and the low points on the interior landscape of the writing life. Knowing that others have been there before you is reassuring – if only just so much help.
For Robert D. Richardson – the author of, among other things, William James: In the Maelstrom of American Modernism (Houghton Mifflin Harcourt), which won the Bancroft Prize for 2007 – one such landmark passage appears in “The American Scholar.” There, Ralph Waldo Emerson writes: “Meek young men grow up in libraries believing it their duty to accept the views that Cicero, Locke, and Bacon have given, forgetful that Cicero, Locke, and Bacon were only young men in libraries when they wrote those books.”
In First We Read, Then We Write: Emerson on the Creative Process (to be published in March by University of Iowa Press), Richardson says the line “still jolts me every time I run into it.” I think I know what he means, but the quality and intensity of the jolt varies over time. Reading “The American Scholar” as a meek young man, I just found it irritating – as if Emerson were translating the anti-intellectualism of my small town into something more refined and elegant, if scarcely less blockheaded.
This was a naive reading of a remarkable and (at times) very weird essay. "The American Scholar" is actually something like a Yankee anticipation of Nietzsche’s “On the Use and Abuse of History for Life” – with the added strangeness that, when Emerson gets around to pointing out a prototype of the new-model American scholar, the example he gives is ... Emanuel Swedenborg, the 18th century Swedish polymath. Who, when not writing huge works on the natural sciences, spent his time talking to angels and devils and the inhabitants of other planets. WTF?
Rereading Emerson a couple of decades beyond adolescence, I saw that the target of his scorn was meekness -- not bookishness, as such. He was in any case not so genteel as he first appeared. There was a wild streak. There were depths beneath the oracular sentences that made him a kind of cultural revolutionary. You are not necessarily prepared to detect this when reading Emerson as a teenager. Like Bob Dylan says, "Ah, but I was so much older then, I'm younger than that now."
Richardson’s award-winning Emerson: The Mind on Fire (University of California Press, 1996) retraced his subject’s voracious and encyclopedic reading regimen, which seems to have been tinged with the urgency of addiction. That book was intellectual biography. The new one, which is far shorter, is something else again -- a synthesis of all the moments when Emerson muses over his own process, a distillation of his ethos as a reader and (especially) as writer.
“A good head cannot read amiss,” says Emerson. “In every book he finds passages which seem confidences or asides, hidden from all else, and unmistakeably meant for his ear.” Full attention and active engagement are always, by Emerson's lights, present-minded: “I read [something] until it is pertinent to me and mine, to nature and to the hour that now passes. A good scholar will find Aristophanes and Hafiz and Rabelais full of American History.”
Not being prone to foolish consistency, Emerson also maintains that some academic works are incapable of coming to life themselves, let alone revitalizing anyone else. “A vast number of books are written in quiet imitation of the old civil, ecclesiastical, and literary history,” he says. (One may quietly update this by thinking of comparable 21st century tomes.) “Of these we need take no account. They are written by the dead to be read by the dead.”
By contrast, meaningful writing is an effort “to drop every dead word.” Emerson rules out any effort to rub pieces of jargon together in hopes they will generate sparks. “Scholars are found to make very shabby sentences out of the weakest words because of exclusive attention to the word,” he notes. You don’t say.
The struggle to connect with living currents of thought and meaning should begin with a notebook -- the place to cultivate, as Emerson puts it, “the habit of rendering account to yourself of yourself in some rigorous manner and at more certain intervals than mere conversation.” The important thing is to keep at it: “There is no way to learn to write except by writing.”
This may sound like generic advice, and to some degree it is. But from long years of scholarly attention to the daily progress of the essayist’s labors, Richardson hears the anxious undercurrents in Emerson’s reflections on writing. “There is a strangely appealing air of desperation, finality, of terminal urgency,” he writes, “to many of Emerson’s observations.... In every admonition we hear his willingness to confront his own failures; indeed, he never seems more than a few inches from total calamity. He urges us to try anything – strategies, tricks, makeshifts. And he always seems to be speaking not only of the nuts and bolts of writing, but of the grain and sinew of his – and our – determination.”
When necessary, Richardson points out, Emerson would “just sit down and start writing – anything – to see whether something would happen. He was quick to spot the same trick in others. ‘I have read,’ he noted, ‘that [Richard Brinsley] Sheridan made a good deal of experimental writing with a view to take what might fall, if any wit should transpire in all the waste of pages.’”
Kenneth Burke once described Emerson’s prose as a “happiness pill” – that being a common enough assessment, though there is more to the sage than his role as dispenser of transcendental Prozac. It makes some difference to know that the pharmacist also had to heal himself. He cannot have been free from all of the worldly desires felt by lesser writers. The same wishes mean the same frustrations. The challenge is to keep faith with the rest of one’s reasons for writing – the motivations that break through the rubble.
First We Read, Then We Write is worth keeping at hand for moments of occasional complete collapse. I'll end with a passage that now belongs in the anthology, for emergency use:
“Happy is he who looks only into his work to know if it will succeed, never to the times or to the public opinion; and who writes from the love of imparting certain thoughts and not from the necessity of sale – who writes always to the unknown friend.”