Cultural studies

In the American Grain

Howard Zinn -- whose A People’s History of the United States, first published by Harper & Row in 1980, has sold some two million copies -- died last week at the age of 87. His passing has inspired numerous tributes to his role in bringing a radical, pacifist perspective on American history to a wide audience.

It has also provoked denunciations of Zinn as “un-American,” which seems both predictable and entirely to his credit. One of Zinn’s lessons was that protest is a deeply American inclination. The thought is unbearable in some quarters.

One of the most affectionate tributes came from the sports writer Dave Zirin. Like many other readers, he found that reading Zinn changed his whole sense of why you would even want to study the past. “When I was 17 and picked up a dog-eared copy of Zinn's book,” he writes, “I thought history was about learning that the Magna Carta was signed in 1215. I couldn't tell you what the Magna Carta was, but I knew it was signed in 1215. Howard took this history of great men in powdered wigs and turned it on its pompous head.” Zirin went on to write A People’s History of Sports (New Press, 2008), which is Zinnian down to its cells.

Another noteworthy commentary comes from Christopher Phelps, an intellectual historian now in the American and Canadian studies program at the University of Nottingham. He assesses Zinn as a kind of existentialist whose perspective was shaped by the experience of the civil rights struggle. (He had joined the movement in the 1950s as a young professor at Spelman College, a historically black institution in Atlanta.)

An existentialist sensibility -- the tendency to think in terms of radical commitment, of decision making as a matter of courage in the face of the Absurd -- was common to activists of his generation. That Phelps can hear the lingering accent in Zinn’s later work is evidence of a good ear.

Zinn “challenged national pieties and encouraged critical reflection on received wisdom,” writes Phelps. “He understood that America’s various radicalisms, far from being ‘un-American,’ have propelled the nation toward more humane and democratic arrangements.... He urged others to seek in the past the inspiration to dispel resignation, demoralization, and deference, the foundations of inertia. The past meant nothing, he argued, if severed from present and future.”

I've spent less time reading the fulminations against Zinn, but they seem like backhanded honors. When a historian known for saying good things about the Fascists who won the Spanish Civil War considers it necessary to denounce somebody, that person’s life has been well spent.

Others have claimed that Zinn did not sufficiently denounce Stalinism and its ilk. The earliest example of the complaint that I know came in a review of People’s History that appeared in The American Scholar in 1980, when that magazine was a cultural suburb of the neoconservative movement. The charge has been recycled since Zinn’s death.

This is thrifty. It is also intellectually dishonest. For what is most offensive about Zinn (to those who find him so) is that he held both the United States and the Soviet Union to the same standard. He even dared to suggest that they were in the grip of a similar dynamic.

“Expansionism,” he wrote in an essay from 1970, “with its accompanying excuses, seems to be a constant characteristic of the nation-state, whether liberal or conservative, socialist or capitalist. I am not trying to argue that the liberal-democratic state is especially culpable, only that it is not less so than other nations. Russian expansionism into Eastern Europe, the Chinese moving into Tibet and battling with India over border territories -- seem as belligerent as the pushings of that earlier revolutionary upstart, the United States.... Socialism and liberalism both have advantages over feudal monarchies in their ability to throw a benign light over vicious actions.”

Given certain cretinizing trends in recent American political discourse, it bears stressing that Zinn here uses “liberalism” and “socialism” as antonyms. A liberal supports individual rights in a market economy. By any rigorous definition, Sarah Palin is a liberal. And so, of course, is Barack Obama, who can only be called a “socialist” by an abuse of language. (But such abuse is an industry now, and I feel like Sisyphus just for complaining about it.)

The most substantial critique of A People’s History remains the review by Michael Kazin that appeared in Dissent in 2004. Kazin’s polemic seems to me too stringent by half. Zinn's book is not offered as the last word on the history of the United States, but as a corrective to dominant trends. It is meant to be part of an education, rather than the totality of it.

But Kazin does make points sometimes acknowledged even by the book’s admirers: “Zinn reduces the past to a Manichean fable and makes no serious attempt to address the biggest question a leftist can ask about U.S. history: why have most Americans accepted the legitimacy of the capitalist republic in which they live?”

That is indeed the elephant in the room. Coercion has certainly been a factor in preserving the established order, but persuasion and consent have usually played the greater part. Any American leftist who came of age after Antonio Gramsci’s work began to be assimilated is bound to consider hegemony a starting point for discussion, rather than an afterthought.

But Zinn was the product of an earlier moment -- one for which the stark question of commitment had priority. A strategic map of the political landscape was less urgent than knowing that you stood at a crossroads. You either joined the civil rights struggle or you didn’t. You were fighting against nuclear proliferation or the Vietnam War, or you were going along with them. It is possible to avoid recognizing such alternatives -- though you do end up making the choice between them, one way or the other.

There were subtler interpretations of American history than Howard Zinn’s. Anyone whose understanding of the past begins and ends with it has mistaken a vitamin for a meal. But that does not make the book worthless. The appreciation of complexity is a virtue, but there are times when a moment of clarity is worth something, too.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Mood Is the Message

Certain research topics seem destined to inspire the question, “Seriously, you study that?” So it is with the field of Twitter scholarship. Which -- just to get this out of the way -- is not actually published in 140 characters or less. (The average “tweet” is the equivalent of two fairly terse sentences. It is like haiku, only more self-involved.)

The Library of Congress announced in April that it was acquiring the complete digital archives of the “microblogging” service, beginning with the very first tweet, from ancient times. At present, the Twitter archive consists of 5 terabytes of data. If all of the printed holdings of the LC were digitized, they would come to 10 to 20 terabytes (this figure does not include manuscripts, photographs, films, or audio recordings).

Some 50 million new messages are sent on Twitter each day, although one recent discussion at the LC suggested that the rate is much higher -- at least when the site is not shutting down from sheer traffic volume, which seems to be happening a lot lately. A new video on YouTube shows a few seconds from the "garden hose" of incoming Twitter content.

When word of this acquisition was posted to the Library of Congress news blog two months ago, it elicited comments from people who could not believe that anything so casual and hyper-ephemeral as the average tweet was worth preserving for posterity -- let alone analyzing. Thanks to the Twitter archive, historians will know that someone ate a sandwich. Why would they care?

Other citizens became agitated at the thought that “private” communications posted to Twitter were being stored and made available to a vast public. Which really does seem rather unclear on the concept. I’m as prone to dire mutterings about the panopticon as anybody -- but come on, folks. The era of digital media reinforces the basic principle that privacy is at least in part a matter of impulse control. Keeping something to yourself is not compatible with posting it to a public forum. Evidently this is not as obvious as it should be. Things you send directly to friends on Twitter won't be part of the Library's holdings, but if you celebrated a hook-up by announcing it to all and sundry, it now belongs to the ages.

A working group of librarians is figuring out how to “process” this material (to adapt the lingo we used when I worked as an archival technician in the Library's manuscript division) before making the collection available to researchers. But it’s not as if scholars have been waiting around until the collection is ready. Public response to the notion of “Twitter studies” might be incredulous, but the existing literature gives you some idea of what can be done with this giant pulsing mass of random discursive particles.

A reading of the scholarship suggests that individual tweets, as such, are not the focus of very much attention. I suppose the presidential papers of Barack Obama will one day include an annotated edition of postings to his Twitter feed. But that is the exception and not the rule.

Instead, the research, so far, tends to fall into two broad categories. One body focuses on the properties of Twitter as a medium. (Or, what amounts to a variation on the same thing, as one part of an emerging new-media ecosystem.) The other approach involves analyzing gigantic masses of Twitter data to find evidence concerning public opinion or mood.

Before giving a thumbnail account of some of this work -- which, as the bibliography I’ve consulted suggests, seems intrinsically interdisciplinary -- it may be worth pointing out something mildly paradoxical: the very qualities that make Twitter seem unworthy of study are precisely what render it potentially quite interesting. The spontaneity and impulsiveness of expression it encourages, and the fact that millions of people use it to communicate in ways that often blur the distinction between public and private space, mean that Twitter has generated an almost real-time documentary record of ordinary existence over the past four years.

There may be some value to developing tools for understanding ordinary existence. It is, after all, where we spend most of our time.

Twitter shares properties found in numerous other new-media formats. The term “information stream” is sometimes used to characterize digital communication, of whatever sort. Inside Higher Ed “flows” at the rate of a certain number of articles per day during the workweek. An online scholarly journal, by contrast, will probably trickle. A television network’s website -- or the more manic sort of Twitter feed -- will tend to gush. But the “streaming” principle is the same in any case, and you never step into the same river twice.

A recent paper by Mor Naaman and others from the School of Communication and Information at Rutgers University uses a significant variation on this concept, the “social awareness stream,” to label Twitter and Facebook, among other formats. Social awareness streams, according to Naaman et al., “are typified by three factors distinguishing them from other communication: a) the public (or personal-public) nature of the communication and conversation; b) the brevity of posted content; and, c) a highly connected social space, where most of the information consumption is enabled and driven by articulated online contact networks.”

Understanding those “articulated online contact networks” involves, for one thing, mapping them. And such mapping efforts have been underway since well before Twitter came on the scene. What makes the Twitter “stream” particularly interesting is that -- unlike Facebook and other social-network services -- the design of the service permits both reciprocal connections (person A “follows” person B, and vice versa) and one-sided ones (A follows B, but not the reverse). This makes for both strong and weak communicative bonds within networks -- but also among them. And various conventions have emerged to allow Twitter users to signal one another or to urge attention to a particular topic or comment. Besides “retweeting” someone’s message, you can address a particular person (using the @ symbol, like so: @JohnDoe) or index a message by topic (noted with the hashtag, thusly: #topicdujour).
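
For readers who want to see how such conventions get operationalized, here is a minimal sketch in Python -- my own toy illustration, not code from any of the studies discussed here -- of how mentions, hashtags, and retweets can be pulled out of raw tweet text. The patterns and the sample tweet are assumptions for demonstration only; research-grade tokenizers are considerably more careful.

    import re

    # Simple patterns for the conventions described above. Real Twitter
    # research uses far more careful tokenization; these are illustrative.
    MENTION = re.compile(r"@(\w+)")
    HASHTAG = re.compile(r"#(\w+)")
    RETWEET = re.compile(r"^RT\s+@(\w+)")

    def parse_tweet(text):
        """Return the retweeted user (if any), plus mentions and hashtags."""
        rt = RETWEET.match(text)
        return {
            "retweet_of": rt.group(1) if rt else None,
            "mentions": MENTION.findall(text),
            "hashtags": HASHTAG.findall(text),
        }

    # Hypothetical example:
    print(parse_tweet("RT @JohnDoe: reading Zinn again #topicdujour"))
    # {'retweet_of': 'JohnDoe', 'mentions': ['JohnDoe'], 'hashtags': ['topicdujour']}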

All of this is, of course, familiar enough to anyone who uses Twitter. But it has important implications for just what kind of communication system Twitter fosters. To quote the title of an impressive paper by Haewoon Kwak and three other researchers from the department of computer science at the Korea Advanced Institute of Science and Technology: “What is Twitter, a Social Network or a News Media?” (No sense protesting that “media” is not a singular noun. Best to grind one’s teeth quietly.)

Analyzing almost 42 million user profiles and 106 million tweets, Kwak and colleagues find that Twitter occupies a strange niche that combines elements of both mass media and homophilous social groups. (Homophily is defined as the tendency of people to sustain more contact with those they judge to be similar to themselves than with those whom they perceive to be dissimilar.) "Twitter shows a low level of reciprocity," they write. "77.9 percent of user pairs with any link between them are connected one-way, and only 22.1 percent have reciprocal relationships between them.... Previous studies have reported much higher reciprocity on other social networking services: 68 percent on Flickr and 84 percent on Yahoo."
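
To make that reciprocity figure concrete, here is a toy calculation in Python over an invented directed "follows" edge list. It is my own sketch of the arithmetic involved, not the procedure Kwak's team ran on their full data set.

    # Toy reciprocity calculation over a directed "follows" edge list.
    # The edges below are invented for illustration.
    follows = {("A", "B"), ("B", "A"), ("A", "C"), ("D", "A")}

    # Every pair of users with at least one link between them, in either direction.
    pairs = {frozenset(edge) for edge in follows}

    # A pair is reciprocal only if both directions are present.
    reciprocal = sum(1 for p in pairs
                     if all((u, v) in follows for u in p for v in p if u != v))

    print(f"{reciprocal / len(pairs):.1%} of connected pairs are reciprocal")
    # With the toy data above: 33.3% (A-B is mutual; A-C and D-A are one-way)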

In part, this reflects the presence on Twitter of already established mass-media outlets -- not to mention already-famous people who have millions of “followers” without reciprocating. But the researchers find that a system of informal but efficient “retweet trees” also functions “as communication channels of information diffusion.” Interest in a given Twitter post can rapidly spread across otherwise disconnected social networks. Kwak’s team found that any retweeted item would “reach an average of 1,000 users no matter what the number of followers is of the original tweet. Once retweeted, a tweet gets retweeted [again] almost instantly on the second, third, and fourth hops away from the source, signifying fast diffusion of information after the first retweet.”

Eventually someone will synthesize these and other analyses of Twitter’s functioning -- along with studies of other institutional and mass-media networks -- and give us some way to understand this post-McLuhanesque cultural system. In the meantime, research is being done on how to use the constant landslide of Twitter messages to gauge public attitudes and mood.

As Brendan O’Connor and his co-authors from Carnegie Mellon University note in a paper published last month, the usual method of conducting a public-opinion poll by telephone can cost tens of thousands of dollars. (Besides, lots of us hang up immediately on the suspicion that it will turn into a telemarketing call.)

Using one billion Twitter messages from 2008 and ’09 as a database, O’Connor and colleagues ran searches for keywords related to politics and the economy, then generated a “sentiment score” based on lists of 1,600 “positive” and 1,200 “negative” words. They then compared these “text sentiment” findings to the results of more traditional public opinion polls concerning consumer confidence, the election of 2008, and the new president’s approval ratings. They found sufficiently strong correlation to be encouraging -- and noted that by the summer of 2009, when many more people were on Twitter than had been the case in 2008, the text-sentiment results proved a good predictor of consumer confidence levels.
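
The arithmetic behind such a “sentiment score” is simple enough to sketch. The toy Python version below counts tweets that mention a topic keyword and contain “positive” versus “negative” words, then reports the ratio. The word lists, sample tweets, and exact scoring rule here are stand-ins of my own devising, not the lexicon or formula O’Connor’s group actually used.

    # Toy keyword-counting sentiment score. Word lists and tweets are invented.
    POSITIVE = {"good", "confident", "hope", "great"}
    NEGATIVE = {"bad", "worried", "fear", "broke"}

    def sentiment_score(tweets, topic):
        """Ratio of positive to negative mentions among tweets about a topic."""
        pos = neg = 0
        for text in tweets:
            words = set(text.lower().split())
            if topic not in words:
                continue
            pos += bool(words & POSITIVE)
            neg += bool(words & NEGATIVE)
        # Higher ratio suggests a sunnier mood; infinity if nothing negative turned up.
        return pos / neg if neg else float("inf")

    sample = ["feeling good about the economy today",
              "worried the economy is still bad",
              "the economy looks confident and strong"]
    print(sentiment_score(sample, "economy"))   # 2.0 with this made-up sample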

A different methodology was used in “Modeling Public Mood and Emotion: Twitter Sentiment and Socio-Economic Phenomena” by Johan Bollen of Indiana University and two other authors. They collected all public tweets from August 1 to December 20, 2008, and extracted from their content data that could be plugged into “a well-established psychometric instrument, the Profile of Mood States,” which “measures six individual dimensions of mood, namely Tension, Depression, Anger, Vigor, Fatigue, and Confusion.” This sounds like something from one of Woody Allen’s better movies.

The data crunching yielded “a six dimensional mood vector” covering the months in question. Which, as luck would have it, coincided with both the financial meltdown and the presidential election of 2008. The resulting graphs are intriguing.

Following the election, the negative moods (Tension, Depression, etc.) fell off. There was “a significant spike in Vigor.” Examination of samples of Twitter traffic showed “a preponderance of tweets expressing high levels of energy and positive sentiments.”

But by December 2008, as the Dow Jones Industrial Average fell to below 9000 points, the charts show a conspicuous rise in Anger -- and an even stronger one for Depression. The researchers write that this may have been an early signal of “what appears to be a populist movement in opposition to the new Obama administration.”

“Tweets may be regarded,” write Bollen and colleagues, “as microscopic instantiations of mood.” And they speculate that the microblogging system may do more than reflect shifts of public temper: “The social network of Twitter may highly affect the dynamics of public sentiment…[O]ur results are suggestive of escalating bursts of mood activity, suggesting that sentiment spreads across network ties.”

As good a reason as any to put this archive of the everyday into the time capsule. And while my perspective on this may be a little off-center, I think it is fair that the Twitter record should be stored at the Library of Congress, which also houses the papers of the American presidents up through Theodore Roosevelt.

Almost 20 years ago, I started to work there just around the corner from the bound volumes containing, among other things, the diaries of George Washington. The experience of taking a quick look at them was something like a rite of passage for people working in the manuscript division. And to judge by later conversations among colleagues, the experience was usually slightly bewildering.

You would open the volume and gaze at the very page where his hand had guided the quill. You would start to read, expecting deep thoughts, or historical-seeming ones, at any rate. And this, more or less, is what you found on every page:

"Rained today. Three goats died. Need to buy a new plow."

He had another 85 characters to spare.

P.S. Follow me on Twitter here, and keep up with news on scholarly publishing here.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Lifestyles of Mad Men

The first three seasons of "Mad Men" (the fourth begins on Sunday) were set in a world recognizable from The Hidden Persuaders, Vance Packard’s landmark work of pop sociology from 1957. Reviving the spirit of muckraking to probe the inner workings of postwar affluence, Packard reported on how the ad agencies on Madison Avenue used psychological research to boost the manipulative power of their imagery and catchphrases.

To prime the consumer market, habits and attitudes left over from the Great Depression had to be liquidated. Desire must be set free -- or at least educated into enough confidence to be assertive. Advertising meant selling not just a product but a dream. There was, for example, the famous ad campaign portraying women who found themselves in public, in interesting situations, while wearing little more than their Maidenform undergarments. The idea was to lodge the product in the potential consumer’s unconscious by associating it with a common dream situation.

But my sense is that "Mad Men" is poised to enter a new, post-Packardian phase. At the end of the third season, several characters left the established firm of Sterling Cooper and set out to create their own advertising “shop” -- all of this not very long after the Kennedy assassination. Trauma seldom stalls the wheels of commerce for long. And we know, with hindsight, that American mass culture was just about to undergo a sudden, swift de-massification -- the proliferation, over the next few years, of ever more sharply defined consumer niches and episodic subcultures.

Stimulating consumer desire by making an end run around the superego was no longer the name of the game. The new emphasis took a different form. It is best expressed by the term “lifestyle” -- which, as far as I can tell, was seldom used before the mid-60s, except as a piece of jargon from the Adlerian school of psychoanalytic revisionism.

Alfred Adler had coined the term to describe the functioning of the inferiority complex. (“Inferiority complex” was another Adler-ism; this was the concept that precipitated his break with Freud in the 1910s.) The neurotic, according to Adler, transformed his inferiority complex into a comprehensive structure of psychic defense -- a whole pattern of life, designed to avoid its more disagreeable realities as much as possible.

Obviously “lifestyle” would acquire other meanings. But arguably that original sense is always there, below the surface. What looks like an identity or a niche has its shadow -- its underside of insecurity.

I don’t know how much Alfred Adler the creators of "Mad Men" have read. But they have certainly tuned into this dimension of the show’s central characters.

Don and Peggy have crafted lives for themselves that express, not who they are, but who they want to be. (Or in Don’s case, who he wants to be taken to be. We’re talking double-encrypted personal inauthenticity.) They have turned feelings of inferiority and powerlessness into ambition -- rising to positions in advertising that enable them to elicit and channel those feelings in the consumer.

Pete (easily the most unlikable figure on the show) is the walking embodiment of status anxiety and a borderline sociopath. His only saving grace is that he is too ineptly Machiavellian to succeed at any scheme he might hatch. Unable to advance within the hierarchy at Sterling Cooper, he walked away to help start the new agency.

We’ve seen that he has one forward-looking idea: Pete realizes that there is an African-American market out there that advertisers could target. Nobody at Sterling Cooper had any interest in crafting campaigns to run in Jet magazine. But any sense that his role might be “progressive” runs up against the most salient thing about him: he is a hollow man, incapable of empathy but ready to turn the way the wind blows.

Vance Packard portrayed Madison Avenue as a place staffed by people who were competent and lucid, if not particularly scrupulous. Packard intended The Hidden Persuaders as social criticism, but the book participated in the technocratic imagination. It assumed that advertising’s best and brightest both possessed knowledge and could apply it, steering the marketplace by remote control.

Against this, "Mad Men" has been slowly building up a counternarrative. Its first season was set in 1960 -- the final year of the Eisenhower administration. The third season closed just after an assassin’s shots ended what would, in short order, be recalled as Camelot. A few scattered references have been made to a war underway in Southeast Asia.

Trust in the foresight of technocrats is about to take a hard fall. And the center of gravity in the advertising world is about to shift from masterful “hidden manipulators” to figures who can ride the wave of cultural upheaval because they are skilled at manufacturing niches for themselves.

The characters running the new agency are not confident engineers of consumer desire but – albeit in a special sense -- confidence artists. Not that they are swindlers. But they know how to fabricate a self and sell it to other people.

With its fourth season, "Mad Men" is on the verge of finally becoming a series about the Sixties. It is also a work of historical fiction about where consumerism came from, and what it was like. I suppose the past tense is unavoidable. Over the next decade, to judge by recent trends, people will need a leap of the imagination to remember the Golden Age of Lifestyles.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Storytelling

Once upon a time -- long, long ago -- I spent rather a lot of time reading about the theory of narrative. This was not the most self-indulgent way to spend the 1980s, whatever else you can say about it. Arguably the whole enterprise had begun with Aristotle, but it seemed to be reaching some kind of endgame around the time I was paying attention. You got the sense that narratologists would soon be able to map the genome of all storytelling. It was hard to tell whether this would be a good thing or a bad thing, but they sure seemed to be close.

The turning point had been the work of the Russian folklorist Vladimir Propp. In the late 1920s, he had broken down 100 fairy tales into a set of elementary “functions” performed by the characters, which could then be analyzed as occurring in various combinations according to a handful of fixed sequences. The unrelated-seeming stories were just variations on a very few algebraic formulas.

Of course, fairy tales tend to be pretty formulaic to begin with -- but with some tweaking, Propp's approach could be applied to literary texts. By the 1960s, French structuralist critics such as Roland Barthes and Gerard Genette were analyzing the writings of Poe and Proust (not to mention James Bond novels) to extract their narrative DNA. And then came Hayden White’s Metahistory: The Historical Imagination in Nineteenth-Century Europe (1973), which showed how narratology might be able to handle nonfiction. White found four basic modes of “emplotment” -- romantic, comic, tragic, and satirical -- in the storytelling done by historians.

It was obviously just a matter of time before some genius came along to synthesize and supersede all of this work in a book called Of Narratology, at least half of which would be written in mathematical symbols. The prospect seemed mildly depressing. In the end, I was more interested in consuming narratives (and perhaps even emitting them, from time to time) than in finding the key to all mythologies. Apart from revisiting Peter Brooks's Reading for the Plot: Design and Intention in Narrative (1984) -- the only book on the topic I recall with any pleasure -- narratology is a preoccupation I have long since abandoned.

And so Christian Salmon’s Storytelling: Bewitching the Modern Mind reads like a dispatch from the road not taken. Published in France in 2007 and recently issued in English translation by Verso, it is not a contribution to the theory of narrative but a report on its practical applications. Which, it turns out, involve tremendous amounts of power and money -- a plot development nobody would have anticipated two or three decades ago.

“From the mid-1990s onward,” writes Salmon, concentration on narrative structure “affected domains as diverse as management, marketing, politics, and the defense of the nation.” To a degree, perhaps, this is obvious. The expression “getting control of the narrative” has long since become part of the lexicon of mass-media knowingness, at least in the United States. And Salmon -- who is a member of the Centre for Research in the Arts and Language in Paris and a columnist for Le Monde -- has one eye trained on the American cultural landscape, seeing it as the epicenter of globalization.

Roughly half of Salmon’s book is devoted to explaining to French readers the history and nuances of such ubiquitous American notions as “spin” and "branding." He uses the expression “narratocracy” to characterize the form of presidential leadership that has emerged since the days of Ronald Reagan. The ability to tell a compelling story is part of governance. (And not only here. Salmon includes French president Sarkozy as a practitioner of “power through narrative.”)

Less familiar, perhaps, is the evidence of a major shift toward narrative as a category within marketing and management. Corporations treat storytelling as an integral part of branding; the public is offered not just a commodity but a narrative to consume. He quotes Barbara Stone, a professor of marketing at Rutgers University: “When you have a product that’s just like another product, there are any number of ways to compete. The stupid way is to lower prices. The smart way is to change the value of the product by telling a story about it.” And so you are not just buying a pair of pants, for example, but continuing the legacy of the Beat Generation.

“It is not as though legends and brands have disappeared,” writes Salmon. But now they “talk to us and captivate us by telling us stories that fit in with our expectations and worldviews. When they are used on the Web, they transform us into storytellers. We spread their stories. Good stories are so fascinating that we are encouraged to tell them again.”

Other stories are crafted for internal consumption. Citing management gurus, Salmon shows the emergence of a movement to use storytelling to regulate the internal life of business organizations. This sometimes draws upon the insights of narrative artists of canonical renown, as in books like Shakespeare on Management. (Or Motivational Secrets of the Marquis de Sade, if I can ever sell that idea.) But it also involves monitoring and analyzing the stories that circulate within a business -- the lore, the gossip, the tales that a new employee hears to explain how things got the way they are.

An organization’s internal culture is, from this perspective, the totality of the narratives circulating within it. “It is polyphonic,” notes Salmon, “but it is also discontinuous and made up of interwoven fragments, of histories that are talked about and swapped. They can sometimes be contradictory, but the company becomes a storytelling organization whose stories can be listened to, regulated, and, of course, controlled ... by introducing systematized forms of in-house communications and management based upon the telling of anecdotes.”

At the same time, the old toolkit of structuralist narratology (with its dream of reducing the world’s stock of stories to a few basic patterns) is reinvented as an applied science. One management guru draws on Vladimir Propp’s Morphology of the Folktale in his own work. And there are software packages that “make it possible to break a narrative text down into segments, to label its main elements and arrange its propositions into temporal-causal sequences, to identify scenes, and to draw up trees of causes and decisions.”

One day corporations will be able to harvest all the stories told about them by consumers and employees, then run them through a computer to produce brand-friendly counter-narratives in real time. That sort of thing used to happen in Philip K. Dick's paranoid science-fiction novels, but now it's hard to read him as anything but a social realist.

All of this diligent and relentless narrativizing (whether in business or politics) comes as a response to ever more fluid social relations under high-speed, quick-turnover capitalism.

The old system, in which big factories and well-established institutions were central, has given way to a much more fluid arrangement. Storytelling, then, becomes the glue that holds things together -- to the degree that they do.

The “new organizational paradigm,” writes Salmon, is “a decentralized and nomadic company…that is light, nimble, and furtive, and which acknowledges no law but the story it tells about itself, and no reality other than the fictions it sends out into the world.”

Not long after Storytelling originally appeared in 2007, the world’s economy grew less forgiving of purely fictive endeavors. The postscript to the English-language edition offers Salmon’s reflections on the presidential campaign of 2008, with Barack Obama here figured as a narratocrat-in-chief “hold[ing] out to a disoriented America a mirror in which shattered narrative elements can be put together again.”

This, it seems to me, resembles an image from a fairy tale. The “mirror” is a magical implement restoring to order everything that has been tending towards chaos throughout the rest of the narrative. Storytelling is a smart and interesting book, for the most part, but it suffers from an almost American defect: the desire for a happy ending.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

As Others See Us

A genome biologist, Gregory Petsko, has gone to bat for the humanities, in an open letter to the State University of New York at Albany president who recently (and underhandedly) announced significant cuts. (For those who haven’t been paying attention: the departments of theater, Italian, Russian, classics, and French at SUNY-Albany are all going to be eliminated).

If you are in academia, and Petsko’s missive (which appeared on this site Monday) hasn’t appeared on your Facebook wall, it will soon. And here’s the passage that everyone seizes on, evidence that Petsko understands us and has our back (that is, we in the humanities): "The real world is pretty fickle about what it wants. The best way for people to be prepared for the inevitable shock of change is to be as broadly educated as possible, because today's backwater is often tomorrow's hot field. And interdisciplinary research, which is all the rage these days, is only possible if people aren't too narrowly trained."

He's right. And if scientists want to speak up for the humanities, I’m all for it. But Petsko understands us differently than we understand ourselves. Why fund the humanities, even if they don’t bring in grant money or produce patents? Petsko points out that "universities aren't just about discovering and capitalizing on new knowledge; they are also about preserving knowledge from being lost over time, and that requires a financial investment."

How many of us willingly embrace that interpretation of what we do? "My interest is not merely antiquarian...." is how we frame the justification for our cutting-edge research. Even as we express our dismay when crucial texts go out of print, any sacred flame that we were tending was blown out when the canon wars were fought to a draw. Why should we resurrect it? Because, says Petsko, "what seems to be archaic today can become vital in the future." His examples are virology and Middle Eastern studies. Mine is 18th-century literature — and with all the imaginative vigor at my disposal, I have trouble discerning the variation on the AIDS scare or 9/11 that would revive interest in my field. That’s OK, though: Petsko has other reasons why the humanities matter:

"Our ability to manipulate the human genome is going to pose some very difficult questions for humanity in the next few decades, including the question of just what it means to be human. That isn't a question for science alone; it's a question that must be answered with input from every sphere of human thought, including -- especially including -- the humanities and arts... If I'm right that what it means to be human is going to be one of the central issues of our time, then universities that are best equipped to deal with it, in all its many facets, will be the most important institutions of higher learning in the future."

Well, that would be great. I have no confidence, though, that we in the humanities are positioned to take advantage of this dawning world, even if our departments escape SUNY-style cost-cutting. How many of us can meaningfully apply what we do to "the question of just what it means to be human" without cringing, or adopting an ironic pose, or immediately distancing ourselves from that very question? How many of us see our real purpose as teaching students to draw the kinds of connections between literature and life that Petsko uses to such clever effect in his diatribe?

Petsko is not necessarily right in his perception of what the humanities are good for, nor are professionals in the humanities necessarily wrong to pursue another vision of what our fields are about. But there is a profound disconnect between how we see ourselves (and how our work is valued and remunerated in the university and how we organize our professional lives to respond to those expectations) and how others see us. If we're going to take comfort in the affirmations of Petsko and those outside of the humanities whom he speaks for, perhaps we need to take seriously how he understands what we do. Perhaps the future is asking something of us that we are not providing — or perhaps we need to do a better job of explaining why anyone other than us should care about what we do.

Author/s: 
Kirstin Wilcox
Author's email: 
info@insidehighered.com

Kirstin Wilcox is senior lecturer in English at the University of Illinois at Urbana-Champaign.

The Year in Reading

For this week’s column (the last one until the new year) I asked a number of interesting people what book they’d read in 2010 that left a big impression on them, or filled them with intellectual energy, or made them wish it were better known. If all three, then so much the better. I didn’t specify that it had to be a new book, nor was availability in English a requirement.

My correspondents were enthusiastic about expressing their enthusiasm. One of them was prepared to name 10 books -- but that’s making a list, rather than a selection. I drew the line at two titles per person. Here are the results.

Lila Guterman is a senior editor at Chemical and Engineering News, the weekly magazine published by the American Chemical Society. She said it was easier to pick an outstanding title from 2010 than it might have been in previous years: “Not sleeping, thanks to a difficult pregnancy followed by a crazy newborn, makes it almost impossible for me to read!”

She named Rebecca Skloot’s The Immortal Life of Henrietta Lacks, published by Crown in February. She called it an “elegantly balanced account of a heartbreaking situation for one family that simultaneously became one of the most important tools of biology and medicine. It was a fast-paced read driven by an incredible amount of reporting: A really exemplary book about bioethics.”

Neil Jumonville, a professor of history at Florida State University, is editor of The New York Intellectual Reader (Routledge, 2007). A couple of collections of essays he recently read while conducting a graduate seminar on the history of liberal and conservative thought in the United States struck him as timely.

“The first is Gregory Schneider, ed., Conservatives in America Since 1930 (NYU Press, 2003). Here we find a very useful progression of essays from the Old Right, Classical Liberals, Traditional Conservatives, anticommunists, and the various guises of the New Right. The second book is Michael Sandel, Liberalism and Its Critics (NYU Press, 1984). Here, among others, are essays from Isaiah Berlin, John Rawls, Robert Nozick, Alasdair MacIntyre, Michael Walzer, a few communitarians represented by Sandel and others, and important pieces by Peter Berger and Hannah Arendt.”

Reading the books alongside one another, he said, tends both to sharpen one's sense of the variety of political positions covered by broad labels like “liberal” and “conservative” and to point out how the traditions may converge or blend. “Some people understand this beneficial complexity of political positions,” he told me, “but many do not.”

Michael Yates retired as a professor of economics and labor relations at the University of Pittsburgh at Johnstown in 2001. His most recent book is In and Out of the Working Class, published by Arbeiter Ring in 2009.

He named Wallace Stegner’s The Gathering of Zion: The Story of the Mormon Trail, originally published in 1964. “I am not a Mormon or religious in the slightest degree,” he said, “and I am well aware of the many dastardly deeds done in the name of the angel Moroni, but I cannot read the history of the Mormons without a feeling of wonder, and I cannot look at the sculpture of the hand cart pioneers in Temple Square [in Salt Lake City] without crying. If only I could live my life with the same sense of purpose and devotion…. It is not possible to understand the West without a thorough knowledge of the Mormons. Their footprints are everywhere."

Adam Kotsko is a visiting assistant professor of religion at Kalamazoo College. This year he published Politics of Redemption: The Social Logic of Salvation (Continuum) and Awkwardness (Zero Books).

“My vote," he said, "would be for Sergey Dolgopolski's What Is Talmud? The Art of Disagreement, on all three counts. It puts forth the practices of Talmudic debate as a fundamental challenge to one of the deepest preconceptions of Western thought: that agreement is fundamental and disagreement is only the result of a mistake or other contingent obstacle. The notion that disagreements are to be maintained and sharpened rather than dissolved is a major reversal that I'll be processing for a long time to come. Unfortunately, the book is currently only available as an expensive hardcover.”

Helena Fitzgerald is a contributing editor for The New Inquiry, a website occupying some ambiguous position between a New York salon and an online magazine.

She named Patti Smith’s memoir of her relationship with Robert Mapplethorpe, Just Kids, published by Ecco earlier this year and recently issued in paperback. “I've found Smith to be one of the most invigorating artists in existence ever since I heard ‘Land’ for the first time and subsequently spent about 24 straight hours with it on repeat. She's one of those artists who I've long suspected has all her big secrets hoarded somewhere in her private New York City. This book shares a satisfying number of those secrets and that privately legendary city. Just Kids is like the conversation that Patti Smith albums always made you want to have with Patti Smith.”

Cathy Davidson, a professor of English and interdisciplinary studies at Duke University, was recently nominated by President Obama to serve on the National Council on the Humanities. She, too, named Patti Smith’s memoir as one of the books “that rocked my world this year.” (And here the columnist will interrupt to give a third upturned thumb. Just Kids is a moving and very memorable book.)

Davidson also mentioned rereading Tim Berners-Lee's memoir Weaving the Web, first published by HarperSanFrancisco in 1999. She was “inspired by his honesty in letting us know how, at every turn, the World Wide Web's creation was a surprise, including the astonishing willingness of an international community of coders to contribute their unpaid labor for free in order to create the free and open World Wide Web. Many traditional, conventional scientists had no idea what Berners-Lee was up to or what it could possibly mean and, at times, neither did he. His genius is in admitting that he forged ahead, not fully knowing where he was going….”

Bill Fletcher Jr., a senior scholar at the Institute for Policy Studies, is co-author, with Fernando Gapasin, of Solidarity Divided: The Crisis in Organized Labor and a New Path Toward Social Justice, published by the University of California Press in 2009.

He named Marcus Rediker and Peter Linebaugh’s The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (Beacon, 2001), calling it “a fascinating look at the development of capitalism in the North Atlantic. It is about class struggle, the anti-racist struggle, gender, forms of organization, and the methods used by the ruling elites to divide the oppressed. It was a GREAT book.”

Astra Taylor has directed two documentaries, Zizek! and Examined Life. She got hold of the bound galleys for James Miller’s Examined Lives: From Socrates to Nietzsche, out next month from Farrar, Straus and Giroux. She called it “a book by the last guy I took a university course with and one I've been eagerly awaiting for years. Like a modern-day Diogenes Laertius, Miller presents 12 biographical sketches of philosophers, an exploration of self-knowledge and its limits. As anyone who read his biography of Foucault knows, Miller's a master of this sort of thing. The profiles are full of insight and sometimes hilarious.”

Arthur Goldhammer is a senior affiliate of the Center for European Studies at Harvard University and a prolific translator, and he runs an engaging blog called French Politics.

“I would say that Florence Aubenas' Le Quai de Ouistreham (2010) deserves to be better known,” he told me. “Aubenas is a journalist who was held prisoner in Iraq for many months, but upon returning to France she did not choose to sit behind a desk. Rather, she elected to explore the plight of France's ‘precarious’ workers -- those who accept temporary work contracts to perform unskilled labor for low pay and no job security. The indignities she endures in her months of janitorial work make vivid the abstract concept of a ‘dual labor market.’ Astonishingly, despite her fame, only one person recognized her, in itself evidence of the invisibility of social misery in our ‘advanced’ societies.”

Anne Sarah Rubin is an associate professor of history at the University of Maryland, Baltimore County and project director for Sherman’s March and America: Mapping Memory, an interactive historical website.

The book that made the biggest impression on her this year was Judith Giesberg's Army at Home: Women and the Civil War on the Northern Home Front, published by the University of North Carolina Press in 2009. “Too often,” Rubin told me, “historians ignore the lives of working-class women, arguing that we don't have the sources to get inside their lives, but Giesberg proves us wrong. She tells us about women working in Union armories, about soldiers' wives forced to move into almshouses, and African Americans protesting segregated streetcars. This book expands our understanding of the Civil War North, and I am telling everyone about it.”

Siva Vaidhyanathan is a professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything: (And Why We Should Worry), will be published by the University of California Press in March.

He thinks there should have been more attention for Carolyn de la Pena's Empty Pleasures: The Story of Artificial Sweeteners from Saccharin to Splenda, published this year by the University of North Carolina Press: “De la Pena (who is a friend and graduate-school colleague) shows artificial sweeteners have had a powerful cultural influence -- one that far exceeds their power to help people lose weight. In fact, as she demonstrates, there is no empirical reason to believe that using artificial sweeteners helps one lose weight. One clear effect, de la Pena shows, is that artificial sweeteners extend the pernicious notion that we Americans can have something for nothing. And we know how that turns out.”

Vaidhyanathan noted a parallel with his own recent research: “de la Pena's critique of our indulgent dependence on Splenda echoes the argument I make about how the speed and simplicity of Google degrades our own abilities to judge and deliberate about knowledge. Google does not help people lose weight either, it turns out.”

Michael Tomasky covers U.S. politics for The Guardian and is editor-in-chief of Democracy: A Journal of Ideas.

“On my beat,” he said, “the best book I read in 2010 was The Spirit Level (Bloomsbury, 2009), by the British social scientists Richard Wilkinson and Kate Pickett, whose message is summed up in the book's subtitle, which is far better than its execrable title: ‘Why Greater Equality Makes Societies Stronger.’ In non-work life, I'm working my way through Vasily Grossman's Life and Fate from 1959; it's centered around the battle of Stalingrad and is often called the War and Peace of the 20th century. I'm just realizing as I type this how sad it is that Stalingrad is my escape from American politics.”

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Sorry

I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.

In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?

A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization or, most of the time, in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.

The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.

And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?

After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences between Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.

If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.

One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.

The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.

As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.

And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.

Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.

'Greatest Generation' Gen Ed

In the context of the news that day in February, the announcement was almost jarring in its banality. On a day when legislators at all levels and all over the country were in full panic mode about budget deficits, and at a time when public investments in education, particularly higher education and most particularly the liberal arts, were being offered as examples of excessive government spending, a new commission had been formed.

At the request of a bipartisan group of members of Congress, the American Academy of Arts and Sciences had gathered a group of distinguished citizens and asked them to recommend 10 actions "that Congress, state governments, universities, foundations, educators, individual benefactors, and others should take now to maintain national excellence in humanities and social scientific scholarship and education, and to achieve long-term national goals for our intellectual and economic well-being." A bipartisan request to form a group to engage in long-range planning about the nation’s intellectual well-being by focusing on the liberal arts — such an announcement not only seemed out of place in the newspapers that day, it seemed almost to come from another generation.

Had these people not heard that, as House Speaker John Boehner put it, "We’re broke"? Didn’t they — these misguidedly bipartisan legislators and anachronistic advocates of the liberal arts — realize that we were in a crisis that precluded long-term planning and collective action? How could they fail to see that education today must focus on job training and economic competitiveness? And what were they thinking in focusing on liberal arts?

It has indeed been hard in recent months to hear anything other than the voices of doom. But the language spoken by these voices represents its own form of crisis, for it is almost entirely economic, as if all relevant factors in our current situation could be captured on a spreadsheet or a ledger. The reduction of complex social and political issues to economics signifies a failure of imagination; and "fiscal responsibility," while an excellent principle at all times, has come to serve as a proxy for our fears that we have lost our way in the world, that the future will not be as bright for our children as it was for us when we were young, that America is being outcompeted by countries that used to be "third world," that the future has somehow gotten away from us.

Fear, whose radical form is terror, has temporarily crippled our national imagination. Many young people today can barely recall a time when we were not subject to the shadowy horrors of terror and terrorists. Today, 10 years after 9-11, terror is a fact of life, and fear makes all the sense in the world. How else to explain the emergence of what are in effect survivalist and vigilante attitudes among so many of our political leaders?

At this time, it is useful for those with longer memories to recall that "other generation" that the current effort to support the liberal arts so strongly evokes. This would be the generation that, having fought their way out of the Great Depression, went out and won World War II. That generation, like ours, had things to fear, but they conquered their fears by taking action, including creating a commission charged with long-term planning for the nation’s educational system, focusing on liberal education.

This commission, created by Harvard President James Bryant Conant in 1943, in the middle of the war, completed virtually all of its work while the outcome was still uncertain. Still, the vision its members announced was confident, spacious, and radical. Their report, General Education in a Free Society — or the “Redbook,” as it was called — outlined a program of liberal education for both high school and college students, with required courses in the sciences, the social sciences, and the humanities. The intention was to extend to masses of people — including the hundreds of thousands of returning soldiers who would be going to college on the new GI Bill — the kind of non-vocational education previously available only to a select few.

Such a program, the commission thought, would be profoundly American in that it would prepare people for citizenship in a democracy, giving them what they needed not just to find a job but to live rich and abundant lives, the kinds of lives that people in less fortunate societies could only dream about. Announcing the great mission of American education and the new shape of American society after the war, the Redbook was hailed as a powerful symbol of national renewal, and served as an announcement of America’s cultural maturity. Its main arguments were translated into national policy by the six-volume 1947 "Truman Report," called Higher Education for American Democracy.

The program bespoke confidence in democracy, and in the ability of people to decide the course of their lives for themselves. It suggested, too, a conviction that a democracy based on individual freedom required some principle of cohesion, which would, in the program they outlined, be provided by an understanding of history and culture, which they entrusted to the humanities.

Of course, not every institution of higher education has followed this extraordinarily ambitious and idealistic vision. Indeed, by one recent account, only 8 percent of all American institutions of higher education give their students a liberal education. But that 8 percent includes virtually every institution known to the general populace, including Cal Tech and MIT. With their unique dedication to liberal education, American universities are acknowledged to be the best in the world at two of the central tasks of higher education: educating citizens and conducting research.

Mass liberal education was advocated in the face of challenges every bit as great as those we face today. As a consequence of the war, the national debt had exploded, reaching unprecedented levels (121 percent of GDP in 1946, compared with 93 percent in 2010). And as the grim realities of the Cold War set in, including the prospect of nuclear annihilation and the widespread fear of enemies within, many people felt that the nation was vulnerable in ways it never had been. It would have been understandable if the nation had tried to hedge against an unpredictable future by cutting spending, turning inward, and retooling the educational system so that it would produce not well-rounded citizens but technocrats, managers, nuclear engineers, and scientists.

Instead, we created the Marshall Plan, built the interstate highway system, and increased access to higher education so dramatically that, by 1960, there were twice as many people in higher education as in 1945. And incidentally, the middle class was strong and growing, and the fight for civil rights acquired an irresistible momentum. Things were very far from perfect, but we unhesitatingly call the generation that accomplished all this "the greatest."

What really distinguished the American philosophy of higher education in the generation after WWII was its faith in the future. People educated under a system of liberal education were expected not to fill slots but to create their lives in a world that could not be predicted but did not need to be feared. The lesson for today is perfectly clear. Terrors will always be with us, but we can choose to confront them through collective action and a recommitment to the core principles of democracy, including access, for those who wish to have it and are able to profit from it, to a liberal education. "We’re broke" is a sorry substitute for the kind of imagination and boldness needed now, or at any time. We must take the long view, the global view, and the view that does the most credit to ourselves.

I would not presume to tell the new commission which steps to support the liberal arts they should endorse. But I would urge on them a general principle: that liberal education should not be considered a luxury that can be eliminated without cost, much less an expensive distraction from the urgent task of economic growth, but a service to the state and its citizens. It is an essential service because it reflects and strengthens our core commitments as a nation, without which we truly would be broke.

Geoffrey Harpham is president and director of the National Humanities Center. His new book is The Humanities and the Dream of America (University of Chicago Press).

The Struggle for Recognition

In a memorable passage from The Philosophy of History, Hegel quotes a common saying of his day that runs, “No man is a hero to his valet-de-chambre.” This corresponds, in contemporary terms, to the familiar sentiment that even the most distinguished individual “puts his pants on one leg at a time like everybody else.” It is somewhere between wisdom and truism. But Hegel seems to take it badly. After quoting the proverb, he adds his own twist: “not because the former is no hero, but because the latter is a valet.”

In other words, the portrait of a world-transforming figure -- say, Napoleon -- left by somebody who shined his shoes and helped him to bed after a night of drinking is no basis for judging the meaning of said figure’s life. For that, presumably, you need a philosopher. Hegel mentions in passing that his quip was repeated “ten years later” by Goethe. I imagine him being very casual while dropping that reference, as his students in the lecture hall go “Dude!” (or whatever the German equivalent of emphatic amazement was in 1830).

The dig at butlers seems awfully snobbish -- and also rather unwise, at least to admirers of P.G. Wodehouse. But its thrust is really aimed elsewhere. He is thinking of something that is still fairly new in the early 19th century: a mass public, eager to consume intimate revelations and psychological speculations regarding powerful and influential people. This means wallowing in envy and egotism. Hegel says it is driven by the “undying worm” of realizing that one’s “excellent views and vituperations remain absolutely without result in the world.” Anyone distinguished is thereby reduced “to a level with -- or rather a few degrees lower than -- the morality of such exquisite discerners of spirits.”

This sounds irritable enough. And remember, the telegraph hadn’t even been invented yet. The golden age of cutting everybody down to size was still to come. Nor, indeed, has it ended.

But Joel Best’s new book Everyone’s a Winner: Life in Our Congratulatory Culture, published by the University of California Press, describes a situation that appears, at first blush, to be the exact opposite of the one that bothered Hegel. The word “heroic,” writes Best, a professor of sociology at the University of Delaware, “once applied narrowly to characterize great deeds by either mythic or historical figures,” but is now often “broadened to encompass virtually anyone who behaves well under difficult -- even potentially difficult -- circumstances.” And sometimes not even that. (When Stephen Colbert tells his audience that they’re the real heroes, it satirizes the way certain cable TV demagogues flatter the American couch potato.)

“Activists are heroes,” he writes. “Coal miners are heroes. People with terminal cancer are heroes. A word once reserved for the extraordinary is now applied to the merely admirable.”

This is one aspect of a pattern that Best finds emerging in numerous domains of American life. There is an abundance of claims to eminence and excellence. Awards proliferate as we hold public celebrations of achievement in every activity imaginable. Restaurants display their rankings from local newsweeklies. Universities are almost always certifiably distinguished, in some regard or other. A horror movie called The Human Centipede (First Sequence) won the 2010 Scream Award in the category “most memorable mutilation.” I have seen the film and believe it deserved this honor. (Seriously, you don’t want to know.)

Anyone possessing even a slight curmudgeonly streak will already have had suspicions about this trend, of course. Best corroborates it with much evidence. A case in point is his graph of the number of British and American awards for mystery novels. In 1946, the figure stood at five. By 1979, it had grown to five times that many, and in 2006 (the last year he charts), there were roughly 110. “Nor is the trend confined to book awards,” he notes. “The number of film prizes awarded worldwide has grown to the point that there are now nearly twice as many awards as there are full-length movies produced. For both books and films, the number of prizes has grown at a far faster clip than the numbers of new books or movies.”

The Congressional Gold Medal honoring an outstanding contribution to the nation (its first recipient, in 1776, was George Washington) was presented five times in the course of the 1950s. The frequency of the award has grown since. Between 2000 and 2009, it was given out 22 times.

The examples could be multiplied, perhaps exponentially. The range of people, products, and activities being honored has expanded. At the same time, the number of awards in each category tends to grow. In short, the total energy invested in assessing, marking, and celebrating claims about status (that is, worthiness of respect or deference) seems to have increased steadily over the past few decades in the United States -- and Best says that discussions with colleagues in Canada, Japan, and Western Europe suggest that the same trend has emerged in other countries.

Older ways of looking at status regarded it as a rare commodity. Gaining it, or losing it, was fraught with anxiety. And it still is, but something important has changed. Hegel’s comments imply that powerlessness and lack of status were bound to inspire resentment over established claims to excellence and significance. In a condition of “status scarcity,” there is bound to be a struggle that unleashes destructive tendencies. But Best maintains that another dynamic has emerged -- the manufacture of status on an almost industrial scale, rather than a mass society in which status is smashed.

This tendency overlaps with the profusion of what he terms “social worlds” (what might otherwise be called subcultures or lifestyle cohorts) that emerge as people with shared interests or commitments gather and form their own organizations. Giving and getting awards often becomes part of consolidating the niche.

“The perceived shortage of status,” he writes, reflecting a sense of insufficient status being given to people like us, “is one of the reasons disenchanted people form new social worlds.” Doing so “means that folks aren’t forced to spend their whole lives in circles where they inevitably lose the competition for status. Rather, by creating their own worlds, they acquire the ability to mint status of their own. They can decide who deserves respect and why.” The result is what Best calls “status affluence.” There are, he acknowledges, grounds to criticize this situation -- an obvious one being that status, like currency, becomes devalued when too much of it is put into circulation. On the whole, though, he judges it salutary, making for greater social cohesion and stability.

And in any case, there is no obvious way to change it. A few years ago, a bill calling for no more than two Congressional Gold Medals to be issued per year won some support -- only to end in limbo. If there is a tap to control the flow of awards, nobody knows how to work it.

"Status affluence" isn't the same as equality -- and I'm struck by the sense that it coexists with profound and growing socioeconomic disparities. As Joseph E. Stiglitz recently pointed out, the income of the top 1 percent in the United States has grown by 18 percent over the past decade, while people in the middle have seen their incomes shrink: "While many of the old centers of inequality in Latin America, such as Brazil, have been striving in recent years, rather successfully, to improve the plight of the poor and reduce gaps in income, America has allowed inequality to grow." As interesting as Best's book is, it leaves me wondering if status affluence isn't a symptom, rather than a sign that the distribution of recognition has grown more equitable. A parachute is better than nothing, but this one seems like it might be made of papier-mâché.

Scott McLemee

Like a Rolling Stone

This coming weekend’s conference on the late Ellen Willis -- essayist, radical feminist, and founder of the cultural reporting and criticism program at New York University -- begins to look as if it is going to be rather a big deal. It coincides with publication by the University of Minnesota Press of Out of the Vinyl Deeps: Ellen Willis on Rock Music, which, besides doing wonders for the reputations of Moby Grape and Creedence Clearwater Revival, is going to consolidate Willis’s role as a figure young writers read, and reread, and dream of somehow becoming. Originally the conference was planned for a small meeting space somewhere in downtown New York, but it’s been relocated to the Tishman Auditorium at NYU, which holds 450 people. Five years after her death, this is Ellen Willis’s moment.

As someone who began reading her work almost 30 years ago (to an 18-year-old Velvet Underground fanatic, any collection of essays called Beginning to See the Light needed no further recommendation), I am happy to think so. And as someone scheduled to speak during the first session -- but nowhere near finishing his paper -- I am terrified to think so. Meanwhile, the organizers keep reminding the panelists that the event is being moved to a bigger venue, due to popular demand. And would we please be sure to get there on time? Maybe they are afraid of an unruly crowd; the warm-up act needs to get on stage without undue delay.

So, yes: a big, anxious deal. Though mostly a celebration. A small sampling of her work is available on a website run by her daughter, although this is no substitute for the three collections of essays on social and cultural matters that appeared during her lifetime.

Speaking of which, somebody at the conference needs to address the issue of how it happens that Out of the Vinyl Deeps is only appearing just now. Why is it only in 2011 that we have a book demonstrating that she was one of the best rock critics of the 1960s and ‘70s? Those decades have been mythologized as the era when rock writers of gigantic stature -- Lester Bangs, Robert Christgau, Nick Kent, Greil Marcus, Dave Marsh, Richard Meltzer, and Nick Tosches -- thundered across the countercultural landscape, sometimes doing battle, like dinosaurs. (Big, stoned dinosaurs.) You can find collections of work by all of these guys, and ardent fanboys ready to debate their respective degrees of eminence. In fact, I listed them alphabetically to avoid that sort of thing.

There were only a handful of pieces on rock in Willis's first collection of essays (and none in the subsequent volumes, which focused on feminist theory and cultural politics), but they were stunning. Anecdotal evidence and personal experience suggest that rereading them repeatedly was not an uncommon response. And when you did, you heard (and felt) songs by Bob Dylan, or the Who, or the Velvet Underground, in ways you never had before. She was as insightful as any of the dino-critics -- and a much better writer than some of them -- yet Willis never really figured in the legend.

With dozens of her writings on popular music now gathered between covers, this will change. But again, what took so long? This must be explained. (An all-male species of dinosaur was unlikely in any event.)

Most of the pieces in the new book appeared in The New Yorker, to which Willis began contributing in 1968. A few months later, in response to the prevailing and otherwise intractable sexism of the New Left, she started the influential group Redstockings along with Shulamith Firestone, who soon wrote The Dialectic of Sex: The Case for Feminist Revolution (1970).

It was another Redstockings member, Carol Hanisch, who coined the phrase “the personal is political.” And on a personal-political note, I will mention that reading Firestone’s manifesto as a teenager scared the hell out of me, in a salutary way. The trauma had passed by the time Willis collected her own feminist writings in No More Nice Girls: Countercultural Essays (Wesleyan, 1992) -- a volume it is particularly interesting to read alongside Daring to Be Bad: Radical Feminism in America (University of Minnesota Press, 1989), for which Willis wrote the introduction. Clearly this sort of material is still upsetting to some people. A blogger named Doug Phillips, for example, blames Ellen Willis and the Willis-ites for “promot[ing] ultra-radical lesbian-feminist politics, trans-sexuality, and mother goddess worship.” Like that’s a bad thing.

While her libertarian worldview would certainly accommodate transsexual lesbian pagans in its conception of the good society, anyone who actually reads Ellen Willis will learn that she was, in fact, an enthusiastically heterosexual atheist who, at some point, accepted monogamy in practice, if not in theory. None of which will give Doug Phillips much comfort. But apart from specifying her exact position within the culture wars, the stray bits of personal information in her work are interesting for what they reveal about Willis as a writer.

Some of her most memorable pieces were in the vein of what used to be called the New Journalism, in which the reporter’s subjectivity is part of the narrative. But this amounts to only a small part of her output. The proliferation of memoir may be an indirect effect of feminism (“the personal is the literary”), but the role of the “I” in Willis is rarely confessional. Her essays, while usually familiar in tone, tend to be analytic in spirit. The first-person is a lens, not a mirror.

As mentioned, Out of the Vinyl Deeps is Willis’s fourth volume of essays. Following the last one she saw through the press, Don’t Think, Smile! Notes on a Decade of Denial (Beacon, 2000), she published a fair amount of uncollected material and was working on an interpretation of American culture from the perspective of Wilhelm Reich’s psychoanalytic theory. So perhaps there will be another posthumous volume at some point.

If so, it would be her fifth collection -- and her sixth book. Like most readers, I have always assumed that Beginning to See the Light, from 1981, was her first title. (It was reprinted by Wesleyan in 1992.) But almost 20 years earlier, Willis published another book. She did not list it in the summary of her career appearing in volume 106 of the reference-book series Contemporary Authors (Gale Publishers) and seems never to have referred to it in print. Indeed, I wondered if the Library of Congress cataloger didn’t make a mistake by listing Questions Freshmen Ask: A Guide for College Girls (E.P. Dutton, 1962) as written by the same author as No More Nice Girls. After all, there could be two Ellen Willises.

And in a way, there were. I’m still trying to figure out the relationship between them -- how the one became the other.

On page 4, the author of Questions Freshmen Ask explains her qualifications for writing the book: “As a graduate of Barnard College, I feel I have had the kind of experience that enables me to provide the answers to many of your questions. Since Barnard is on the one hand a small women’s college and on the other, part of a large coeducational institution (Columbia University), I am aware of the problems of both types of schools.”

The entry for Ellen Willis in Contemporary Authors notes that she graduated from Barnard in 1962. The 20-year-old author occasionally turns a phrase or writes in a rhythm that will sound familiar to aficionados of her older self -- and the introduction by Barbara S. Musgrave, class dean of Smith College, commends the book as “written so engagingly it gives something of the flavor of college ahead of time.”

It is certainly a time capsule. Exhibit A: “Most colleges estimate that books will cost you in the neighborhood of seventy-five dollars a year.” Exhibit B: “Freshmen often resent all the new regulations under which they are asked to live…. The fact is that your college is less interested in your individual welfare than in the smooth running of the community as a whole.” (Fifty years later, the in loco parentis rules Willis has in mind are long dead. And the administration's communitarian motives count less than its interest in not getting sued.)

Some of the advice remains valid -- especially the parts about the need to budget time and money. And the occasional bit of historical context can be glimpsed between the lines. The author’s freshman year would have been not long after the Sputnik launch. The push was on to expand access to higher education so that the nation would not be overwhelmed by superior brainpower. Willis is explicit about offering guidance to girls who will enter college with no idea what to expect, because their parents didn’t go.

“In the old days,” she writes, “when money or an influential relative seemed almost a ticket of admission to the campus, a student didn’t have to be too purposeful about college. A girl could shrug and say she wanted to go to college, well, because all her friends were going and it had never occurred to her not to go. But times have changed, and you can’t afford to be aimless -- not if you want to justify the admissions director’s faith in you.”

As with a recommendation to “be a good sport” about nitpicky campus rules, this stress on living up to the expectation of an authority figure is hard to square with the later Ellen Willis. But there are passages in which (with abundant hindsight, admittedly) you can see the fault lines.

“No matter what you eventually do after you graduate,” she writes, “you will want to have a mind that’s alert and full of ideas. There will be books you want to understand, important decisions to make, leisure time to fill. With the mental resources your education provides, you will be able to enjoy life more fully….”

Here, the Willis fan thinks: Yes, I know this author. But then you hit a passage like this: “If you spend four years at college single-mindedly preparing yourself for a television production job in New York, and then end up marrying an anthropologist who has to live in the Middle East, what have you accomplished?”

The drive for autonomy vs. the destiny of matrimony: the center cannot hold. Five years after Questions Freshmen Ask: A Guide for College Girls appeared, Janis Joplin recorded her first album and Ellen Willis wrote the first piece in Out of the Vinyl Deeps: an essay on Bob Dylan that is more rewarding than certain books on him that come to mind. Whatever it was that transformed Ellen Willis in the meantime, it almost certainly involved a record player.

Scott McLemee
