Cultural studies


I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.

In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?

A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not forced to take courses like Western civilization or, most of the time, in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.

The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.

And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?

After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences between Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.

If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.

One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.

The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.

Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.

We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.

As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.

And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.

Stephen Brockmann

Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.

'Greatest Generation' Gen Ed

In the context of the news that day in February, the announcement was almost jarring in its banality. On a day when legislators at all levels and all over the country were in full panic mode about budget deficits, and at a time when public investments in education, particularly higher education and most particularly the liberal arts, were being offered as examples of excessive government spending, a new commission had been formed.

At the request of a bipartisan group of members of Congress, the American Academy of Arts and Sciences had gathered a group of distinguished citizens and asked them to recommend 10 actions "that Congress, state governments, universities, foundations, educators, individual benefactors, and others should take now to maintain national excellence in humanities and social scientific scholarship and education, and to achieve long-term national goals for our intellectual and economic well-being." A bipartisan request to form a group to engage in long-range planning about the nation’s intellectual well-being by focusing on the liberal arts — such an announcement not only seemed out of place in the newspapers that day, it seemed almost to come from another generation.

Had these people not heard that, as House Speaker John Boehner put it, "We’re broke"? Didn’t they — these misguidedly bipartisan legislators and anachronistic advocates of the liberal arts — realize that we were in a crisis that precluded long-term planning and collective action? How could they fail to see that education today must focus on job training and economic competitiveness? And what were they thinking in focusing on liberal arts?

It has indeed been hard in recent months to hear anything other than the voices of doom. But the language spoken by these voices represents its own form of crisis, for it is almost entirely economic, as if all relevant factors in our current situation could be captured on a spreadsheet or a ledger. The reduction of complex social and political issues to economics signifies a failure of imagination; and "fiscal responsibility," while an excellent principle at all times, has come to serve as a proxy for our fears that we have lost our way in the world, that the future will not be as bright for our children as it was for us when we were young, that America is being outcompeted by countries that used to be "third world," that the future has somehow gotten away from us.

Fear, whose radical form is terror, has temporarily crippled our national imagination. Many young people today can barely recall a time when we were not subject to the shadowy horrors of terror and terrorists. Today, 10 years after 9-11, terror is a fact of life, and fear makes all the sense in the world. How else to explain the emergence of what are in effect survivalist and vigilante attitudes among so many of our political leaders?

At this time, it is useful for those with longer memories to recall that "other generation" that the current effort to support the liberal arts so strongly evokes. This would be the generation that, having fought their way out of the Great Depression, went out and won World War II. That generation, like ours, had things to fear, but they conquered their fears by taking action, including creating a commission charged with long-term planning for the nation’s educational system, focusing on liberal education.

This commission, created by President James Bryant Conant of Harvard, was formed in 1943, in the middle of the war, and completed virtually all of its work while the outcome of the war was still uncertain. Still, the vision its members announced was confident, spacious and radical. Their report, General Education in a Free Society — or the “Redbook,” as it was called — outlined a program of liberal education for both high school and college students, with required courses in the sciences, the social sciences, and the humanities. The intention was to extend to masses of people — including the hundreds of thousands of returning soldiers who would be going to college on the new GI Bill — the kind of non-vocational education previously available only to a select few.

Such a program, the commission thought, would be profoundly American in that it would prepare people for citizenship in a democracy, giving them what they needed not just to find a job but to live rich and abundant lives, the kinds of lives that people in less fortunate societies could only dream about. Announcing the great mission of American education and the new shape of American society after the war, the Redbook was hailed as a powerful symbol of national renewal, and served as an announcement of America’s cultural maturity. Its main arguments were translated into national policy by the six-volume 1947 "Truman Report," called Higher Education for American Democracy.

The program bespoke confidence in democracy, and in the ability of people to decide the course of their lives for themselves. It suggested, too, a conviction that a democracy based on individual freedom required some principle of cohesion, which would, in the program they outlined, be provided by an understanding of history and culture, which they entrusted to the humanities.

Of course, not every institution of higher education has followed this extraordinarily ambitious and idealistic vision. Indeed, by one recent account, only 8 percent of all American institutions of higher education give their students a liberal education. But that 8 percent includes virtually every institution known to the general populace, including Caltech and MIT. With their unique dedication to liberal education, American universities are acknowledged to be the best in the world at two of the central tasks of higher education: educating citizens and conducting research.

Mass liberal education was advocated in the face of challenges every bit as great as those we face today. As a consequence of the war, the national debt had exploded, reaching unprecedented levels (121 percent of GDP in 1946, compared with 93 percent in 2010). And as the grim realities of the Cold War set in, including the prospect of nuclear annihilation and the widespread fear of enemies within, many people felt that the nation was vulnerable in ways it never had been. It would have been understandable if the nation had tried to hedge against an unpredictable future by cutting spending, turning inward, and retooling the educational system so that it would produce not well-rounded citizens but technocrats, managers, nuclear engineers, and scientists.

Instead, we created the Marshall Plan, built the interstate highway system, and increased access to higher education so dramatically that, by 1960, there were twice as many people in higher education as in 1945. And incidentally, the middle class was strong and growing, and the fight for civil rights acquired an irresistible momentum. Things were very far from perfect, but we unhesitatingly call the generation that accomplished all this "the greatest."

What really distinguished the American philosophy of higher education in the generation after WWII was its faith in the future. People educated under a system of liberal education were expected not to fill slots but to create their lives in a world that could not be predicted but did not need to be feared. The lesson for today is perfectly clear. Terrors will always be with us, but we can choose to confront them through collective action and a recommitment to the core principles of democracy, including access, for those who wish to have it and are able to profit from it, to a liberal education. "We’re broke" is a sorry substitute for the kind of imagination and boldness needed now, or at any time. We must take the long view, the global view, and the view that does the most credit to ourselves.

I would not presume to tell the new commission which steps to support the liberal arts they should endorse. But I would urge on them a general principle: that liberal education should not be considered a luxury that can be eliminated without cost, much less an expensive distraction from the urgent task of economic growth, but a service to the state and its citizens. It is an essential service because it reflects and strengthens our core commitments as a nation, without which we truly would be broke.

Geoffrey Harpham

Geoffrey Harpham is president and director of the National Humanities Center. His new book is The Humanities and the Dream of America (University of Chicago Press).

The Struggle for Recognition

In a memorable passage from The Philosophy of History, Hegel quotes a common saying of his day that runs, “No man is a hero to his valet-de-chambre.” This corresponds, in contemporary terms, to the familiar sentiment that even the most distinguished individual “puts his pants on one leg at a time like everybody else.” It is somewhere between wisdom and truism. But Hegel seems to take it badly. After quoting the proverb, he adds his own twist: “not because the former is no hero, but because the latter is a valet.”

In other words, the portrait of a world-transforming figure -- say, Napoleon -- left by somebody who shined his shoes and helped him to bed after a night of drinking is no basis for judging the meaning of said figure’s life. For that, presumably, you need a philosopher. Hegel mentions in passing that his quip was repeated “ten years later” by Goethe. I imagine being very casual while dropping that reference, as his students in the lecture hall go “Dude!” (or whatever the German equivalent of emphatic amazement was in 1830).

The dig at butlers seems awfully snobbish – and also rather unwise, at least to admirers of P.G. Wodehouse. But its thrust is really aimed elsewhere. He is thinking of something that is still fairly new in the early 19th century: a mass public, eager to consume intimate revelations and psychological speculations regarding powerful and influential people. This means wallowing in envy and egotism. Hegel says it is driven by the “undying worm” of realizing that one’s “excellent views and vituperations remain absolutely without result in the world.” Anyone distinguished is thereby reduced “to a level with – or rather a few degrees lower than – the morality of such exquisite discerners of spirits.”

This sounds irritable enough. And remember, the telegraph hadn’t even been invented yet. The golden age of cutting everybody down to size was still to come. Nor, indeed, has it ended.

But Joel Best’s new book Everyone’s a Winner: Life in Our Congratulatory Culture, published by the University of California Press, describes a situation that appears, at first blush, the exact opposite of the one that bothered Hegel. The word “heroic,” writes Best, a professor of sociology at the University of Delaware, “once applied narrowly to characterize great deeds by either mythic or historical figures,” but is now often “broadened to encompass virtually anyone who behaves well under difficult – even potentially difficult – circumstances.” And sometimes not even that. (When Stephen Colbert tells his audience that they’re the real heroes, it satirizes the way certain cable TV demagogues flatter the American couch potato.)

“Activists are heroes,” he writes. “Coal miners are heroes. People with terminal cancer are heroes. A word once reserved for the extraordinary is now applied to the merely admirable.”

This is one aspect of a pattern that Best finds emerging in numerous domains of American life. There is an abundance of claims to eminence and excellence. The awards proliferate as we hold public celebrations of achievement in every activity imaginable. Restaurants display their rankings from local newsweeklies. Universities are almost always certifiably distinguished, in some regard or other. A horror movie called The Human Centipede (First Sequence) won the 2010 Scream Award in the category “most memorable mutilation.” I have seen the film and believe it deserved this honor. (Seriously, you don’t want to know.)

Anyone possessing even a slight curmudgeonly streak will already have had suspicions about this trend, of course. Best corroborates it with much evidence. A case in point is his graph of the number of British and American awards for mystery novels. In 1946, the figure stood at five. By 1979, it had grown to five times that many, and in 2006 (the last year he charts), there were roughly 110. “Nor is the trend confined to book awards,” he notes. “The number of film prizes awarded worldwide has grown to the point that there are now nearly twice as many awards as there are full-length movies produced. For both books and films, the number of prizes has grown at a far faster clip than the numbers of new books or movies.”

The Congressional Gold Medal honoring an outstanding contribution to the nation (its first recipient, in 1776, was George Washington) was presented five times in the course of the 1950s. The frequency of the award has grown since. Between 2000 and 2009, it was given out 22 times.

The examples could be multiplied, perhaps exponentially. The range of people, products, and activities being honored has expanded. At the same time, the number of awards in each category tends to grow. In short, the total energy invested in assessing, marking, and celebrating claims about status (that is, worthiness of respect or deference) seems to have increased steadily over the past few decades in the United States -- and Best says that discussions with colleagues in Canada, Japan, and Western Europe suggest that the same trend has emerged in other countries.

Older ways of looking at status regarded it as a rare commodity. Gaining it, or losing it, was fraught with anxiety. And it still is, but something important has changed. Hegel’s comments imply that powerlessness and lack of status were bound to inspire resentment over established claims to excellence and significance. In a condition of “status scarcity,” there is bound to be a struggle that unleashes destructive tendencies. But Best maintains that another dynamic has emerged -- not a mass society in which status is smashed, but the manufacture of status on an almost industrial scale.

This tendency overlaps with the profusion of what he terms “social worlds” (what might otherwise be called subcultures or lifestyle cohorts) that emerge as people with shared interests or commitments gather and form their own organizations. Giving and getting awards often becomes part of consolidating the niche.

“The perceived shortage of status,” he writes, reflecting a sense of “insufficient status being given to people like us, is one of the reasons disenchanted people form new social worlds.” Doing so “means that folks aren’t forced to spend their whole lives in circles where they inevitably lose the competition for status. Rather, by creating their own worlds, they acquire the ability to mint status of their own. They can decide who deserves respect and why.” The result is what Best calls "status affluence." There are, he acknowledges, grounds to criticize this situation – an obvious one being that status, like currency, becomes devalued when too much of it is being put into circulation. On the whole, though, he judges it as salutary, and as making for greater social cohesion and stability.

And in any case, there is no obvious way to change it. A few years ago, a bill calling for no more than two Congressional Gold Medals to be issued per year won some support -- only to end in limbo. If there is a tap to control the flow of awards, nobody knows how to work it.

"Status affluence" isn't the same as equality -- and I'm struck by the sense that it coexists with profound and growing socioeconomic disparities. As Joseph E. Stiglitz recently pointed out, the income of the top 1 percent in the United States has grown by 18 percent over the past decade, while people in the middle have seen their incomes shrink: "While many of the old centers of inequality in Latin America, such as Brazil, have been striving in recent years, rather successfully, to improve the plight of the poor and reduce gaps in income, America has allowed inequality to grow." As interesting as Best's book is, it leaves me wondering if status affluence isn't a symptom, rather than a sign that the distribution of recognition has grown more equitable. A parachute is better than nothing, but this one seems like it might be made of papier-mâché.

Scott McLemee

Like a Rolling Stone

This coming weekend's conference on the late Ellen Willis -- essayist, radical feminist, and founder of the cultural reporting and criticism program at New York University -- begins to look as if it is going to be rather a big deal. It coincides with publication by the University of Minnesota Press of Out of the Vinyl Deeps: Ellen Willis on Rock Music, which, besides doing wonders for the reputations of Moby Grape and Creedence Clearwater Revival, is going to consolidate Willis’s role as a figure young writers read, and reread, and dream of somehow becoming. Originally the conference was planned for a small meeting space somewhere in downtown New York, but it’s been relocated to the Tishman Auditorium at NYU, which holds 450 people. Five years after her death, this is Ellen Willis’s moment.

As someone who began reading her work almost 30 years ago (to an 18-year-old Velvet Underground fanatic, any collection of essays called Beginning to See the Light needed no further recommendation), I am happy to think so. And as someone scheduled to speak during the first session -- but nowhere near finishing his paper -- I am terrified to think so. Meanwhile, the organizers keep reminding the panelists that the event is being moved to a bigger venue, due to popular demand. And would we please be sure to get there on time? Maybe they are afraid of an unruly crowd; the warm-up act needs to get on stage without undue delay.

So, yes: a big, anxious deal. Though mostly a celebration. A small sampling of her work is available on a website run by her daughter, although this is no substitute for the three collections of essays on social and cultural matters that appeared during her lifetime.

Speaking of which, somebody at the conference needs to address the issue of how it happens that Out of the Vinyl Deeps is only appearing just now. Why is it only in 2011 that we have a book demonstrating that she was one of the best rock critics of the 1960s and ‘70s? Those decades have been mythologized as the era when rock writers of gigantic stature -- Lester Bangs, Robert Christgau, Nick Kent, Greil Marcus, Dave Marsh, Richard Meltzer, and Nick Tosches -- thundered across the countercultural landscape, sometimes doing battle, like dinosaurs. (Big, stoned dinosaurs.) You can find collections of work by all of these guys, and ardent fanboys ready to debate their respective degrees of eminence. In fact, I listed them alphabetically to avoid that sort of thing.

There were only a handful of pieces on rock in Willis's first collection of essays (and none in the subsequent volumes, which focused on feminist theory and cultural politics), but they were stunning. Anecdotal evidence and personal experience suggest that rereading them repeatedly was not an uncommon response. And when you did, you heard (and felt) songs by Bob Dylan, or the Who, or the Velvet Underground, in ways you never had before. She was as insightful as any of the dino-critics -- and a much better writer than some of them -- yet Willis never really figured in the legend.

With dozens of her writings on popular music now gathered between covers, this will change. But again, what took so long? This must be explained. (The possibility of an all-male species of dinosaur was unlikely in any event.)

Most of the pieces in the new book appeared in The New Yorker, to which Willis began contributing in 1968. A few months later, in response to the prevailing and otherwise intractable sexism of the New Left, she started the influential group Redstockings along with Shulamith Firestone, who soon wrote The Dialectic of Sex: The Case for Feminist Revolution (1970).

It was another Redstockings member, Carol Hanisch, who coined the phrase “the personal is political.” And on a personal-political note, I will mention that reading Firestone’s manifesto as a teenager scared the hell out of me, in a salutary way. The trauma had passed by the time Willis collected her own feminist writings in No More Nice Girls: Countercultural Essays (Wesleyan, 1992) -- a volume it is particularly interesting to read alongside Daring to Be Bad: Radical Feminism in America (University of Minnesota Press, 1989), for which Willis wrote the introduction. Clearly this sort of material is still upsetting to some people. A blogger named Doug Phillips, for example, blames Ellen Willis and the Willis-ites for “promot[ing] ultra-radical lesbian-feminist politics, trans-sexuality, and mother goddess worship.” Like that’s a bad thing.

While her libertarian worldview would certainly accommodate transsexual lesbian pagans in its conception of the good society, anyone who actually reads Ellen Willis will learn that she was, in fact, an enthusiastically heterosexual atheist who, at some point, accepted monogamy in practice, if not in theory. None of which will give Doug Phillips much comfort. But apart from specifying her exact position within the culture wars, the stray bits of personal information in her work are interesting for what they reveal about Willis as a writer.

Some of her most memorable pieces were in the vein of what used to be called the New Journalism, in which the reporter’s subjectivity is part of the narrative. But this amounts to only a small part of her output. The proliferation of memoir may be an indirect effect of feminism (“the personal is the literary”), but the role of the “I” in Willis is rarely confessional. Her essays, while usually familiar in tone, tend to be analytic in spirit. The first-person is a lens, not a mirror.

As mentioned, Out of the Vinyl Deeps is Willis’s fourth volume of essays. Following the last one she saw through the press, Don’t Think, Smile! Notes on a Decade of Denial (Beacon, 2000), she published a fair amount of uncollected material and was working on an interpretation of American culture from the perspective of Wilhelm Reich’s psychoanalytic theory. So perhaps there will be another posthumous volume at some point.

If so, it would be her fifth collection -- and her sixth book. Like most readers, I have always assumed that Beginning to See the Light, from 1981, was her first title. (It was reprinted by Wesleyan in 1992.) But almost 20 years earlier, Willis published another book. She did not list it in the summary of her career appearing in volume 106 of the reference-book series Contemporary Authors (Gale Publishers) and seems never to have referred to it in print. Indeed, I wondered whether the Library of Congress cataloger had made a mistake in listing Questions Freshmen Ask: A Guide for College Girls (E.P. Dutton, 1962) as written by the same author as No More Nice Girls. After all, there could be two Ellen Willises.

And in a way, there were. I’m still trying to figure out the relationship between them -- how the one became the other.

On page 4, the author of Questions Freshmen Ask explains her qualifications for writing the book: “As a graduate of Barnard College, I feel I have had the kind of experience that enables me to provide the answers to many of your questions. Since Barnard is on the one hand a small women’s college and on the other, part of a large coeducational institution (Columbia University), I am aware of the problems of both types of schools.”

The entry for Ellen Willis in Contemporary Authors notes that she graduated from Barnard in 1962. The 20-year-old author occasionally turns a phrase or writes in a rhythm that will sound familiar to aficionados of her older self -- and the introduction by Barbara S. Musgrave, class dean of Smith College, commends the book as “written so engagingly it gives something of the flavor of college ahead of time.”

It is certainly a time capsule. Exhibit A: “Most colleges estimate that books will cost you in the neighborhood of seventy-five dollars a year.” Exhibit B: “Freshmen often resent all the new regulations under which they are asked to live…. The fact is that your college is less interested in your individual welfare than in the smooth running of the community as a whole.” (Fifty years later, the in loco parentis rules Willis has in mind are long dead. And the administration's communitarian motives count less than its interest in not getting sued.)

Some of the advice remains valid -- especially the parts about the need to budget time and money. And the occasional bit of historical context can be glimpsed between the lines. The author’s freshman year would have come not long after the Sputnik launch, when the push was on to expand access to higher education so that the nation would not be outmatched in brainpower by its Cold War rivals. Willis is explicit about offering guidance to girls who will enter college with no idea what to expect, because their parents didn’t go.

“In the old days,” she writes, “when money or an influential relative seemed almost a ticket of admission to the campus, a student didn’t have to be too purposeful about college. A girl could shrug and say she wanted to go to college, well, because all her friends were going and it had never occurred to her not to go. But times have changed, and you can’t afford to be aimless -- not if you want to justify the admissions director’s faith in you.”

As with a recommendation to “be a good sport” about nitpicky campus rules, this stress on living up to the expectation of an authority figure is hard to square with the later Ellen Willis. But there are passages in which (with abundant hindsight, admittedly) you can see the fault lines.

“No matter what you eventually do after you graduate,” she writes, “you will want to have a mind that’s alert and full of ideas. There will be books you want to understand, important decisions to make, leisure time to fill. With the mental resources your education provides, you will be able to enjoy life more fully….”

Here, the Willis fan thinks: Yes, I know this author. But then you hit a passage like this: “If you spend four years at college single-mindedly preparing yourself for a television production job in New York, and then end up marrying an anthropologist who has to live in the Middle East, what have you accomplished?”

The drive for autonomy vs. the destiny of matrimony: the center cannot hold. Five years after Questions Freshmen Ask: A Guide for College Girls appeared, Janis Joplin recorded her first album and Ellen Willis wrote the first piece in Out of the Vinyl Deeps: an essay on Bob Dylan that is more rewarding than certain books on him that come to mind. Whatever it was that transformed Ellen Willis in the meantime, it almost certainly involved a record player.

Scott McLemee

Attitudes Toward Kenneth Burke

“I am not a donkey,” Max Weber once said, “and I do not have a field.” And yet it is always possible to label Weber as a sociologist without unduly provoking anybody. Things are decidedly more complicated in the case of the American thinker Kenneth Burke (1897-1993). Situating such Burkean treatises as Permanence and Change (1935), A Grammar of Motives (1945), and Language as Symbolic Action (1966) in cultural and intellectual history is a task to test the limits of interdisciplinary research. His theories concerning aesthetics, communications, social order and ecology took shape through dialogue with the work of Aristotle, Marx, Freud, Nietzsche, Bergson, and the American pragmatist philosophers, to make the list as short as possible. (And Weber too, of course.) It’s still hard to improve upon the assessment made by Stanley Edgar Hyman, the literary critic and Bennington College professor, more than 60 years ago: “He has no field, unless it be Burkology.”

The triennial meeting of the Kenneth Burke Society, held at Clemson University over the Memorial Day weekend, drew a diverse crowd, numbering just over one hundred people -- with at least a third, by my estimate, being graduate students or junior faculty. The Burkological elders told tales of the days when incorporating more than a couple of citations from “KB” in a dissertation would get you scolded by an adviser. Clearly things have changed in the meantime. Tables near registration were crowded with secondary literature from the past decade or so, as well as a couple of posthumous collections of KB's work. The program featured papers on the implications of his ideas for composition textbooks, disability studies, jazz, environmental activism, and the headscarf controversy.

There were also Burkean discussions of “Mad Men,” Mein Kampf, and the Westboro Baptist Church. Unfortunately I missed it, but Camille Kaminski Lewis gave a paper based on her continuing analysis of the history and ideology of Bob Jones University, where she once taught. (Her book on the subject did not meet with the institution's approval, a matter she discussed in an essay for the Burke Society's Journal.)

The range of topics would sound bewildering to anyone uninitiated into KB’s work; likewise with the vocabulary he created along the way (“dramatism,” “logology,” “terministic screen,” “socio-anagogic interpretation”). But people attending the conference received commemorative tee-shirts bearing excerpts from KB’s “Definition of Man” -- an essay attempting to reduce his thinking to a succinct formula, devoid of any jargon:

"Man is the symbol-making animal, inventor of the negative, separated from his natural condition by instruments of his own making, moved by the sense of order, and rotten with perfection."

Quite a bit is going on within that nutshell. (The phrase “rotten with perfection,” for example, is Burke’s idiosyncratic take on Aristotle’s idea of entelechy.) But an academic organization devoted to an esoteric thinker who fits comfortably in no particular departmental pigeonhole would seem unlikely to have much potential for growth. On the final day of the conference, David Cratis Williams told me that when the Kenneth Burke Society formed in 1984, he suspected that it would for the most part appeal to people who had known KB personally. And that small circle was bound to shrink over time, as people retired.

Something else has happened instead. There was more to it than a few then-young Burkologists becoming institutionally well-situated -- though that no doubt made a huge difference. Williams, for example, is the director of the graduate program in communication and media studies at Florida Atlantic University. (He is also working on a biography of the maverick thinker.) And David Blakesley, who organized the conference at Clemson just a few months after arriving there to assume an endowed chair in English, is also the founder of Parlor Press, a peer-reviewed scholarly publishing house. The name of the press is taken from a passage in which Burke describes the world as a parlor where an unending conversation unfolds.

Having a few well-placed and entrepreneurial Burkeans has certainly helped to consolidate the Society. But I suspect that other factors are involved in the continuing vitality of the KB scholarship. Three things stood out about the conference: the crowd was multigenerational; many of the younger Burkeans have a strong interest in archival research; and the scholarship is now orienting toward digital media, not just to study them but to use them.

These tendencies seem to be mutually reinforcing. Since the early 1990s, Jack Selzer, a professor of English at Pennsylvania State University's main campus, has not only been doing archival research on Burke’s involvement with a number of literary and intellectual circles, but encouraging his students to use the Burke papers at Penn State as well. One of his graduate students was Ann George, now an associate professor of English at Texas Christian University. In 2007, the University of South Carolina Press published Kenneth Burke in the 1930s, which situates its subject in the political and cultural context of the Depression. (While specialized and extremely suggestive to the longtime Burkean, it’s also the book I’d be most likely to recommend to someone new to KB.)

Now students of both Selzer and George are digging around in the 55 linear feet of Burke papers at PSU -- and sometimes taking trips out to the farmhouse in New Jersey where Burke lived and worked, full of still more manuscripts as well as KB’s heavily annotated library. Besides his correspondence with other literary and academic figures, they’re finding unpublished manuscripts and notes showing his concern with economics, music, and other areas relatively neglected by earlier Burke scholars. One senior figure told me that the influx of graduate students was both encouraging and anxiety-inducing: “I really have to finish the project I’ve been working on because now it’s just a matter of time before one of them beats me to it.”

The value of having digital editions of his writings seems clear -- especially in the case of works that Burke revised from edition to edition. In the meantime, two graduate students are digitizing "Conversations with Kenneth Burke," which consists of eight hours of interviews that Clarke Rountree conducted with Burke at the University of Iowa in 1986. (He is now a professor of communication arts at the University of Alabama in Huntsville.)

Joel Overall, who is one of Ann George's students at TCU, told me about it. "Our project involves upgrading 8 hours of interview footage from VHS to DVD format,” he said. “In addition to upgrading the graphic design of packaging materials, DVD titles, and credits, we're also working on transcriptions of the interview that will be included through subtitles and a searchable pdf file. This is a particularly valuable contribution since KB was somewhat difficult to understand at the age of 89.” (The other member of the project, Ethan Sproat, is at Purdue, where he worked with David Blakesley before DB's move to Clemson.)

The DVD will be released by the Society within the next year. “Since [Burke’s] written works are often difficult when first encountered, these interviews allow us to hear his voice and see him in cinematic motion, providing us with extra-textual elements that are crucial to understanding his work.”

Following the conference, David Blakesley pointed out another development in the Burkological world. While Burke was a polyglot as well as a polymath -- reading and translating work from French and German, and an ardent student of Latin literature as well -- his reputation has long been almost exclusively confined to the United States. But Belgian and French scholars were at the conference.

“They, too, felt welcome,” he said, “and are excited about their prospects for future work on Burke. In fact, Ronald Soetart (University of Ghent) wants to organize a European Burke conference now. The French contingent was eager to see that as well since there appears to be a groundswell of interest in Burke throughout Europe. I noticed that when I presented on Burke and visual rhetoric at the International Association of Visual Semiotics in Venice last April, too.”

I attended the conference as a keynote speaker, and also delivered a paper -- and so was sitting there feeling mildly fried when I was invited to participate in another multimedia project. A group of Clemson graduate students in the master of arts in professional communication (MAPC) program were conducting a series of interviews for a video on the field of rhetoric. (That is rhetoric understood as the well-established study of effective communication, rather than in the modern sense of a technique for evading reality.)

Drew Stowe, a second-year student in the program, explained that the project would “show the importance of rhetoric for modern students, in the modern university, and to lay audiences such as parents of prospective students, the board of trustees and other corporate partners who recruit graduates from the MAPC program.” Burke is considered one of the most innovative thinkers in rhetoric since antiquity, so scouting the conference for talking heads made sense.

In front of the camera, I aspired to coherence rather than eloquence. My main point was that KB’s work is a toolbox of ideas useful for analyzing the messages with which everyone is bombarded. As someone who’s read a few of Burke’s books until they’ve worn out -- my hardback copy of the first edition of Philosophy of Literary Form (1941), for example, started falling apart during the conference -- I take his continuing relevance as a given. But where did it come from?

“I've always sensed that KB lived at a particularly interesting cultural moment,” wrote Jack Selzer to me by email, following the conference. “Major wars were changing international affairs fundamentally, new communications technologies were so important, and of course postmodern and post-Nietzschean philosophies (and the linguistic turn) were troubling modernist and rationalist assumptions. Somehow he was brilliant enough to perceive the vitality of these changes even as he was living amidst them, and he was able to theorize and meditate on things so productively -- even though (or because?) he was so close to them. As a consequence, what he has to say remains very contemporary. It was wonderful to see the younger scholars drawn to his work in every way imaginable, and I think it has to do with how shrewd KB was about such important intellectual currents.”

Ann George described teaching Burke in a couple of courses over the past years and finding that students “were struck with, and even a little dispirited by,” the parallels between Burke’s motivating concerns and the present scene. “His political, economic, and environmental insights are remarkable: American exceptionalism and the war in Iraq; 'socialization of losses' via government bailouts, 'rereadings' of the Constitution, Ponzi schemes -- it's all there. Of all the theorists we read in the modern rhetoric course … though, students felt Burke offered more answers -- or more hope -- because he didn't idealize human motives or overestimate how much we might be able to change things for the better.”

That’s a very good point -- and there is a profoundly humanist vision that emerges as the pieces of Burke’s theoretical jigsaw puzzle come together.

He put it best in Attitudes Toward History (1937): "The progress of human enlightenment can go no further than in picturing people not as vicious, but as mistaken.

“When you add that people are necessarily mistaken, that all people are exposed to situations in which they must act as fools, that every insight contains its own special kind of blindness, you complete the comic circle, returning again to the lesson of humility that undergirds great tragedy.” Studying Burke is sometimes difficult, but there are moments when it makes the world seem a little less mad.

Scott McLemee
