
New Academy of Arts and Sciences report stresses importance of humanities and social sciences


Amid talk of outcomes-based education, a new report from the Commission on the Humanities and Social Sciences stresses the disciplines' role in long-term career success and international competitiveness.

Review of Ted Anton, 'The Longevity Seekers: Science, Business, and the Fountain of Youth'

Standing in line at the drugstore a couple of weeks ago, I spied on the magazine rack nearby this month’s issue of National Geographic – conspicuous as one of the few titles without a celebrity on the cover. Instead it showed a photograph of an infant beneath a headline saying "This Baby Will Live to Be 120."

The editors must have expected disbelief, because there was a footnote to the headline insisting that the claim was not hype: "New science could lead to very long lives." When was the last time you saw a footnote in a popular periodical, on the cover, no less? It seemed worth a look, particularly after the septuagenarian in front of me had opened complex, in-depth negotiations with the pharmacist.

The headline, one learns from a comment on the table of contents, alludes to a traditional Jewish birthday wish or blessing: "May you live to be 120." This was the age that Moses was said to have reached when he died. The same figure appears -- not so coincidentally perhaps -- at an important moment in the book of Genesis. Before sending the Flood, Jehovah announces that man’s lifespan will henceforth peak at 120 years. (I take it there was a grandfather clause for Noah. When the waters recede, he lives another 350 years.)

The cap on longevity, like the deluge itself, is ultimately mankind’s own fault, given our tendency to impose too much on the Almighty’s patience and good humor. He declares, in so many words, that there is a limit to how much He must endure from any single one of us. Various translations make the point more or less forcefully, but that’s the gist of it. Even 120 years proved too generous an offer – one quietly retracted later, it seems. Hence the Psalmist’s lament:

“The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labor and sorrow; for it is soon cut off, and we fly away.”

Nursing homes are full of people who passed the fourscore marker a while ago. If you visit such places very often, as I have lately, “May you live to be 120” probably sounds more like a curse than a blessing. Not even a funeral obliges more awareness of mortal frailty. There is more to life than staving off death. The prospect of being stranded somewhere in between for 30 or 40 years is enough to make an atheist believe in hell.

Meanwhile, in science…. The medical and biological research surveyed in that NatGeo article promises to do more than drag out the flesh’s “labor and sorrow” a lot longer. The baby on the magazine cover will live his or her allotted span of six score years with an alert mind, in a reasonably healthy body. Our genetic inheritance plays a huge but not absolutely determinative role in how long we live. In the wake of the mapping of the genome, it could be possible to tinker with the mechanisms that accelerate or delay the aging process. It may not be the elixir of youth, but close enough.

Besides treating the same research in greater depth, Ted Anton’s The Longevity Seekers: Science, Business, and the Fountain of Youth (University of Chicago Press) emphasizes how profound a change longevity research has already wrought. It means no longer taking for granted the status of aging as an inescapable, biologically hardwired, and fundamentally irreversible process of general decline. Challenging the stereotypes and prejudices about the elderly has been a difficult process, but longevity engineering would transform the whole terrain of what aging itself entails.

Anton, a professor of English at DePaul University, tells the story in two grand phases. The first bears some resemblance to James Watson’s memoir The Double Helix, which recounts the twists and turns of laboratory research in the struggle to determine the structure of DNA – work for which he and Francis Crick received a Nobel Prize in medicine in 1962. Watson’s book is particularly memorable for revealing science as an enterprise in which personalities and ambitions clash as much as theories ever do. (And with far more rancor, as Watson himself demonstrated in the book’s vicious and petty treatment of Rosalind Franklin, a crystallographer whose contribution he downplayed as much as possible.)

A practitioner of long-form journalism rather than a longevity researcher, Anton writes about conflicts in the field with some detachment, even while remaining aware that the discoveries may change life in ways we can’t yet picture. The initial phase of the research he describes consisted largely of experiments with yeast cells and microscopic worms conducted in the 1990s. Both are short-lived, meaning that the impact of biochemical adjustments to their genetic “thermostats” for longevity would register quickly.

During the second phase of Anton’s narrative, lab research involved more complex organisms. But that was not the most important development. The public began hearing news flashes that scientists had discovered that the key to a longer life was, say, restricted caloric intake, or a chemical called resveratrol found in red wine. Findings presented in scientific journals were reported on morning news programs, or endorsed on Oprah, within days or even hours of publication. Hypotheses became hype overnight.

This generated enthusiasm (more for drinking red wine than restricting calories, if memory serves) as well as additional confidence that biotechnological breakthroughs were on the way. Everybody in longevity research, or almost everybody, started a company and ran around looking for venture capital. Models, evidence, and ideas became proprietary information -- with the hurry to get one’s findings into professional journals looking more and more like the rush to issue a press release.

So far, no pharmaceutical has arrived on the market to boost our lifespans as dramatically as those of the worms and yeast cells in the laboratory. “The dustbin of medical breakthroughs,” Anton reminds us, “bears the label ‘It Worked in Mice.’ ” On the other hand, the research has been a boon to the cosmetics industry.

As it is, we’re nowhere near ready to deal with the cumulative effect of all the life-extending medical developments from the past few decades. The number of centenarians in the world “is expected to increase tenfold between 2010 and 2050,” the author notes, “and the number of older poor, the majority of them women,” is predicted “to go from 342 million today to 1.2 billion by that same year.”

But progress is ruthless about doing things on its own terms. Biotech is still in its infancy, and its future course -- much less its side effects -- is beyond imagining. The baby on the magazine cover might well live to see the first centenarian win an Olympic medal. I wish that prospect were more cheering than it is.


Review of Matthew L. Jockers, 'Macroanalysis: Digital Methods & Literary History'

“A poem,” wrote William Carlos Williams toward the end of World War II, “is a small (or large) machine of words.” I’ve long wondered if the good doctor -- Williams was a general practitioner in New Jersey who did much of his writing between appointments – might have come up with this definition out of weariness with the flesh and all its frailties. Traditional metaphors about “organic” literary form usually imply a healthy and developing organism, not one infirm and prone to messes. The poetic mechanism is, in Williams’s vision, “pruned to a perfect economy,” and there is “nothing sentimental about a machine.”

Built for efficiency, built to last. The image this evoked 70 years ago was probably that of an engine, clock, or typewriter. Today it’s more likely to be something with printed circuits. And a lot of poems in literary magazines now seem true to form in that respect: The reader has little idea how they work or what they do, but the circuitry looks intricate, and one assumes it is to some purpose.

I had much the same response to the literary scholarship Matthew L. Jockers describes and practices in Macroanalysis: Digital Methods & Literary History (University of Illinois Press). Jockers is an assistant professor of English at the University of Nebraska at Lincoln. The literary material he handles is prose fiction -- mostly British, Irish, and American novels of the 18th and 19th centuries -- rather than poetry, although some critics apply the word “poem” to any literary artifact. In the approach Jockers calls “macroanalysis,” the anti-sentimental and technophile attitude toward literature defines how scholars understand the literary field, rather than how authors imagine it. The effect, in either case, is both tough-minded and enigmatic.

Following Franco Moretti’s program for extending literary history beyond the terrain defined by the relatively small number of works that remain in print over the decades and centuries, Macroanalysis describes “how a new method of studying large collections of digital material can help us to understand and contextualize the individual works within those collections.”

Instead of using computer-based tools to annotate or otherwise explore a single work or author, Jockers looks for verbal patterns across very large reservoirs of text, including novels that have long since been forgotten. The author notes that only “2.3 percent of the books published in the U.S. between 1927 and 1946 are still in print” (even that figure sounds high, and may be inflated by the recent efforts of shady print-on-demand “publishers” playing fast and loose with copyright) while the most expansive list of canonical 19th-century British novels would represent well under 1 percent of those published.

Collections such as the Internet Archive and HathiTrust Digital Library are now available for analysis. Add to this the capacity to analyze the metadata about when and where the books were published, as well as available information on the authors, and you have a new, turbocharged sort of philology – one covering wider swaths of literature than even the most diligent and asocial researcher could ever read.

Or would ever want to, for that matter. Whole careers have been built on rescuing “unjustly neglected” authors, of course, but oblivion is sometimes the rightful outcome of history and a mercy for everyone involved. At the same time, the accumulation of long-unread books is something like a literary equivalent of the kitchen middens that archeologists occasionally dig up – the communal dumps, full of leftovers and garbage and broken or outdated household items. The composition of what’s been discarded and the various strata of it reveal aspects of everyday life of long ago.

Jockers uses his digital tools to analyze novels by, essentially, crunching them -- determining what words appear in each book, tabulating the frequency with which they are used, likewise quantifying the punctuation marks, and working out patterns among the results according to the novel’s subgenre or publication date, or biographical data about the author such as gender, nationality, and regional origin.

The findings that the author reports tend to be of a very precise and delimited sort. The words like, young, and little “are overrepresented in Bildungsroman novels compared to the other genres in the test data.” There is a “high incidence of locative prepositions” (over, under, within, etc.) in Gothic fiction, which may be “a direct result of the genre’s being ‘place oriented.’” That sounds credible, since Gothic characters tend to find themselves moving around in dark rooms within ruined castles with secret passageways and whatnot.
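To make the method a little more concrete: the kind of tabulation Jockers describes can be sketched in a few lines of code. The example below is not his actual pipeline -- the titles, texts, and genre labels are invented for illustration -- but it shows the basic move of computing each word's relative frequency in a text and then averaging those frequencies by genre, which is the raw material for claims about words being "overrepresented" in, say, the Bildungsroman.

```python
# Hypothetical illustration only: the texts and genre labels below are made up,
# and this is not Jockers's actual code or data.
from collections import Counter
import re

corpus = [
    ("Novel A", "bildungsroman", "The young hero felt like a little child again."),
    ("Novel B", "gothic", "Under the ruined arch, within the dark vault, she waited."),
]

def relative_frequencies(text):
    """Return each word's share of the total word count in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

# Sum relative frequencies per genre, then divide by the number of texts
# in that genre to get an average frequency for every word.
genre_sums, genre_counts = {}, Counter()
for title, genre, text in corpus:
    genre_counts[genre] += 1
    bucket = genre_sums.setdefault(genre, Counter())
    for word, share in relative_frequencies(text).items():
        bucket[word] += share

for genre, bucket in genre_sums.items():
    averages = {w: s / genre_counts[genre] for w, s in bucket.items()}
    top = sorted(averages.items(), key=lambda kv: -kv[1])[:5]
    print(genre, top)
```

Comparing one genre's averages against the rest of the corpus, or folding in punctuation counts and author metadata as Jockers does, is a straightforward extension of the same bookkeeping.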

After about 1900, Irish-American authors west of the Mississippi began writing more fiction than their relations on the other side of the river, despite being fewer in number and thinner on the ground. Irish-American literature is Jockers’s specialty, and so this statistically demonstrable trend proves of interest given that “the history of Irish-American literature has had a decidedly eastern bias…. Such neglect is surprising given the critical attention that the Irish in the West have received from American and Irish historians.”

As the familiar refrain goes: More research is needed.

Macroanalysis is really a showcase for the range and the potential of what the author calls “big data” literary study, more than it is a report on its discoveries. And his larger claim for this broad-sweep combination of lexometric and demographic correlation-hunting – what Moretti calls “distant reading” -- is that it can help frame new questions about style, thematics, and influence that can be pursued through more traditional varieties of close reading.

And he’s probably right about that, particularly if the toolkit includes methods for identifying and comparing semantic and narrative elements across huge quantities of text. (Or rather, when it includes them, since that’s undoubtedly a matter of time.)

Text-crunching methodologies offer the possibility of establishing verifiable, quantifiable, exact results in a field where, otherwise, everything is interpretive, hence interminably disputable. This sounds either promising or menacing. What will be more interesting, if we ever get it, is technology that can recognize and understand a metaphor and follow its implications beyond the simplest level of analogy. A device capable of, say, reading Williams’s line about the poem as machine and then suggesting something interesting about it – or formulating a question about what it means.


Essay on how to keep humanities vibrant by rejecting elite universities' models

In "Howl," a blistering poetical rant and perhaps the most important poem of the 60’s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.

But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.

The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.

The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.

The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flushed with this success, this triumph of knowledge production -- and the 19th century scientific methodology that was its seed -- soon spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology, and the like -- grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.

As the natural and social sciences divided the world between themselves, the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.

The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.

In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for the humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.

At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.

Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As they were originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.

The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.

Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the threat of the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward looking — humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.

When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.

The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits’ end, their place has always been out in the world, making connections between the business of living and the higher reaches of one’s own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation -- in Ginsberg’s howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.

De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the breadth and depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly-educated generalist zeros in on the vital nub, the living heart of a subject that drives students to study.

While a scholarly specialist is lecturing on the ins and outs of Frost’s irony, the student sweats out his future, torn between embracing his parents’ dream of having a doctor in the family and taking the road less traveled to become a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.

De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline —  with scientists, engineers, and policy makers — to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights into a variety of rhetorical formats, for different social, political, economic purposes, audiences and time constraints.

At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.

In it they learn the historical and intellectual roots of Catholic social justice teaching within the context of performing ten hours of community service learning. They work in a variety of social service fields (with children, the elderly, and the homeless), which exposes them to the real-life, street-level experience of social challenges. Before, during, and after their service, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.

High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.

Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began -- in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens -- the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.

The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate disciplinary insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.

We are not all Harvard, nor should we want to be.

Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
 


Essay on how the public humanities reshaped a dean's thinking about academic humanities

With so much focus on higher education's obligations to job preparation, the humanities are perpetually playing defense, especially in public higher education. We academic defenders of the humanities generally take one of two lines: we argue that 1) our majors ARE work force preparation -- we develop strong analytical skills, good writing, problem-solving, etc., or 2) we have no need to justify what we teach because the value of the humanities, the study of what makes us human, is self-evident.

These arguments over the value of degrees in the humanities run parallel to a set of arguments I find myself making as part of a role I occupy, as a board member for my state council for the humanities. The National Endowment for the Humanities allocates about a third of its funding through the state councils, and the councils in turn fund humanities initiatives at the state level.

State humanities councils such as mine (Rhode Island's) re-grant our NEH allocation as well as the money we raise locally to community humanities projects. We've funded research on communities of Cape Verdean longshoremen in Providence, oral histories of Second World War vets in hospice care, talk-back events at local theaters, seashore sound archives, a documentary film about a female 19th-century life-saving lighthouse-keeper, and lots of fascinating digital work, from archiving to app development. All the projects must involve humanities scholars — some of those scholars are affiliated with universities, and others aren’t. All of it aims at helping Rhode Islanders to understand ourselves, our histories, and our many cultures.

When economic times are tough, an agency such as the NEH is vulnerable unless legislators understand and value the role of the humanities in a strong democracy -- just as university humanities programs are vulnerable in state funding contexts when legislators, boards of trustees, or voters don't have a clear understanding of the value of the humanities in the culture and in the workplace.

In a career spent in higher education in the humanities, most of it at a liberal arts college, I rarely had to justify teaching what I taught. The value of an English major was self-evident to my colleagues and my students. Sure, the occasional parent would squeak, "But how will she make a living?" But I never hesitated to reassure the anxious check-writers of the value of our product. Having worked in the worlds of both journalism and Washington nonprofits, I knew how many good jobs demanded only a bachelor's degree, writing skills, research and analytic abilities, and common sense.

But then came the Great Recession and what many are calling the end of the higher education bubble. Questions about tuition increases, student debt, and colleges' lack of accountability (that is, the paucity of data on employment for recent graduates) get attached, in public perception, to the unemployment rate and to a re-emergence of the old post-Sputnik fears that the nation is not training enough folks in STEM fields.

Organizations such as the Association of American Colleges and Universities have been proactive in making the case for liberal learning as preparation for good citizenship, pointing to their surveys of employers. Those surveys have found that employers believe the skills colleges should focus on improving are written and oral communication; critical thinking and analytic reasoning; the application of knowledge and skills in real-world settings; complex problem solving; ethical decision making; and teamwork skills. These skills are not exclusive to the humanities, but they certainly line up with the student learning outcomes in humanities instruction at my institution.

It's not as if defenders of the values of a liberal arts education are ignoring economic realities: many liberal arts colleges are adding business majors, humanities fields are requiring internships and experiential learning, and colleges and universities are scrambling to make contact with successful alumni and to gather post-graduation employment data.

There's nothing wrong with linking liberal arts education in general, and the humanities in particular, to work. The humanities are inextricably linked to work and to U.S. civic life. When Lyndon Johnson signed legislation to bring the NEH into existence in 1965, it was in a context in which the federal government was pushed to invest in culture, as it had in science. NEH's account of its own history explains that the head of the Atomic Energy Commission told a Senate committee: "We cannot afford to drift physically, morally, or esthetically in a world in which the current moves so rapidly perhaps toward an abyss. Science and technology are providing us with the means to travel swiftly. But what course do we take? This is the question that no computer can answer."

Through my role in public humanities, I have come to understand that the humanities are what allow us to see ourselves as members of a civic community. Public history, public art, and shared cultural experiences make us members of communities. This link has not been stressed enough in defense of the academic humanities. It's past time to make this important connection -- to help our boards of trustees, our communities, and our legislators to know what the humanities bring to civil society and give to students as they enter the workforce.

In the first class I ever taught as a teaching assistant, I did my first lecture on Death of a Salesman. My topic was work -- how Willy's job is his identity. I pointed to a student I knew in the 150-student lecture hall and told him that his surname, Scribner, probably indicated the employment of some ancestor of his, a "scrivener," like Bartleby. Then I asked who else had last names that might have indicated a job. We had Millers and Coopers and Smiths, and many more.

When those students' ancestors worked as barrel-makers or at their forges, they worked those jobs for life, and their sons afterward did the same. But how many of us do the job our parents did? How many of our students will do the same job in their 30s that they will do in their 20s? Narrow ideas about work force preparation will not prepare our students for the work of the rest of their lives. Each job they take will train them in the skills they need to succeed in that particular industry. But a broad, liberal education will have been what made them people worth hiring, people who have learned the value of curiosity, initiative, problem-solving. Students in STEM fields and students in arts, social sciences, and humanities all will become members of communities, and a good background in the humanities will enrich their membership.

I loved the humanities as an English professor. But it was only when I became involved in public humanities that I began to understand their value not just for individuals but for communities. That's the public good. And that's why we cannot afford to let a narrow rhetoric of work force preparation push the humanities from our curriculums or defund the work of the National Endowment for the Humanities.

Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts, and a member of the board of directors of the Rhode Island Council of the Humanities.


Essay on digital humanities

Last year Temple University Press published Toby Miller's Blow Up the Humanities, a book that starts straining for provocation with its title and never lets up. The author is a professor of media and cultural studies at the University of California at Riverside. His preferred rhetorical stance is that of the saucy lad -- pulling the nose of Matthew Arnold and not fooled for a minute by all that “culture as the best which has been thought and said” jazz, man.

What we must recognize, his argument goes, is that there are two forms of the humanities now. What the author calls "Humanities One" (with literature, history, and philosophy at their core) is just the privileged and exclusionary knowledge of old and dying elites, with little value, if any, to today’s heterogeneous, globalized, wired, and thrill-a-minute world. By contrast, we have studies of mass media and communications making up “Humanities Two,” which emerged and thrived in the 20th century outside “fancy schools with a privileged research status.”

In the future we must somehow establish a third mode: “a blend of political economy, textual analysis, ethnography, and environmental studies such that students learn the materiality of how meaning is made, conveyed, and discarded.” Enough with the monuments of unaging intellect! Let the dead bury the dead; henceforth, culture must be biodegradable.

What I chiefly remember about Blow Up the Humanities, a few months after reading it, is exclaiming “What a cheeky monkey you are!” every few pages -- or at least feeling like this was expected of me. Otherwise it mostly seemed like vintage cultural-studies boilerplate. But one passage in the book did strike me as genuinely provocative. It takes the form of a footnote responding to Google’s claim of a "commitment to the digital humanities." Here it is, in full:

“In the United States, ‘the digital humanities’ can mean anything from cliometric analysis to ludic observation. It refers to a method of obtaining funds for conventional forms of Humanities One, dressed up in rather straightforward electronic empiricism. So counting convicts in law reports or references to Australia in Dickens becomes worthy of grant support because it is archival and computable.”

A scrawl in the margin records my immediate response upon reading this: “Cute but misleading.” But now, on second thought… Well, actually “cute but misleading” pretty well covers it. The caricature of the digital humanities might have been recognizable a dozen years ago, though just barely even then. What makes Miller’s polemical blast interesting is the angle of the assault. For once, a complaint about the digital humanities isn’t coming from traditionalist, semi-luddite quarters -- “traditionalist” with regard to the objects of study (i.e., books, manuscripts, paintings) if not necessarily the theories and methods for analyzing them.

On the contrary, Miller regards video games as a rich cultural medium, both profitable and profound. To shore up his claims for Humanities Two (or, fingers crossed, Three) he finds it useful to pretend that the digital humanities will, in effect, take us back to the era of professors tabulating Chaucer’s use of the letter “e.” The scholarship will be more efficient, if no less dull.

Now, I have no interest in impeding the forward march of Angry Birds studies, but there is no way that Miller doesn’t know better. The days when humanities computing was used to count dead convicts are long gone. Much more likely now would be a project in which all of the surviving files of Victorian prisons are not simply rendered searchable but integrated with census data, regional maps, and available documentation of riots, strikes, and economic trends during any given year.

My griping about Miller’s griping has as its catalyst the recent appearance of Literary Studies in the Digital Age: An Evolving Anthology, which is part of the Modern Language Association’s “commons” site.  (Anyone can read material published there; only members can contribute.)

MLA is a major component of the Humanities One infrastructure, of course, but has enough Humanities Two people in it to suggest that the distinction is anything but airtight. And while Miller pillories the digital humanities as nothing but “a method of obtaining funds for conventional forms of Humanities One,” even old-school philological practice takes on new valences in a digital environment.

“In the humanities,” write Charles Cooney, Glenn Roe, and Mark Olsen in their contribution, “scholars are primarily concerned with the specifics of language and meaning in context, or what is in the works. [Textbases] tend to represent specific linguistic or national traditions, genres, or other characteristics reflecting disciplinary concerns and scholarly expertise.… [T]extbases in the digital humanities are generally retrospective collections built with an emphasis on canonical works in particular print traditions.”

So far, so Humanities One-ish -- with only the neologism “textbase” to show that much has changed since Isaac Casaubon’s heroic proof that the Corpus Hermeticum wasn’t as ancient as everybody thought. Textbase just means “collection,” of course. For that matter, the options available in textbase design (the ways of annotating a text, of making it searchable, of cross-referencing it with other items in the textbase or even in other textbases) are basically high-tech versions of what scholars did four hundred years ago.

Alas, what Casaubon could do alone in his study now requires an interdisciplinary team, plus technicians. But he did not have the distractions we do.

If digital humanists were limited to converting cultural artifacts of the print era into textbases, that would still be useful enough, in its way. The classics aren’t going to annotate themselves. But the warehouse is much larger than that. Beyond the inherited mass of documents from the past 5,000 years, more and more texts are now “born digital.” Besides warehousing and glossing such material, the digital humanities incorporate the changes in how people receive and engage with cultural material, as Alan Liu discusses in “From Reading to Social Computing,” his essay for the MLA anthology.

What Liu calls “the core circuit of literary activity” – the set of institutions, routines, and people involved in transmitting a poem (or whatever) from the author’s notebook to the reader's eyeballs – has been reconfigured dramatically over the past two decades. Besides making it possible to publish or annotate a text in new ways, the developing communication system transforms the culture itself. The digital humanist has to map, and remap, the very ground beneath our feet.

Nor is that a new development. Other papers in the anthology will give you a sense of how the digital humanities have developed over the long term -- beginning when Roberto Busa started using a computer to prepare an exhaustive concordance of Thomas Aquinas in the 1940s. At some point, the digital humanities will need to make one more important change: dropping the word "digital."

(Note: This essay has been updated from an earlier version to correct Toby Miller's name.)


Educators consider struggles of the humanities worldwide


Budget cuts? Politicians questioning disciplines' "relevance"? It's not just the U.S. But at international education gathering, a business school offers hope to those who value literature and history.

Essay argues that English professors can be entrepreneurs

In 1892, the president of Leland Stanford University, David Starr Jordan, managed to convince Ewald Flügel, a scholar at the University of Leipzig, to join the young institution’s rudimentary English department. Flügel had received his doctoral degree in 1885 with a study of Thomas Carlyle under the aegis of Richard Wülcker, one of the founders of English studies in Europe. Three years later, he finished his postdoctoral degree with a study of Sir Philip Sidney, and was appointed to the position of a Privatdozent at Leipzig.

The position of the Privatdozent was one of the most fascinating features of the modern German university in the late 19th century. Although endowed with the right to direct dissertations and teach graduate seminars, the position most often offered only the smallest of base salaries, leaving the scholar to earn the rest of his keep from students who paid him directly for enrolling in his seminars and lectures. In a 1903 Stanford commencement speech, Flügel warmly recommended that his new colleagues in American higher education embrace the Privatdozent concept:

What would the faculty of Stanford University say to a young scholar of decided ability, who, one or two years after his doctorate (taken with distinction), having given proof of high scholarly work and spirit, should ask the privilege of using a certain lecture room at a certain hour for a certain course of lectures? What would Stanford University say, if – after another year or two this young man, unprotected but regarded with a certain degree of kindly benevolence […], this lecturer should attract more and more students (not credit hunters), if he should become an influence at the university? What if the university should become in the course of years a perfect hive of such bees? […] It would modify our departmental boss-system, our worship of "credits," and other traits of the secondary schools; it would stimulate scholarly life at the university; it would foster a healthy competition in scholarly work, promote survival of the fittest, and keep older men from rusting.

Unabashedly Darwinian, Flügel was convinced that his own contingent appointment back in Germany had pushed him, and pushed all Privatdozenten, to become competitive, cutting-edge researchers and captivating classroom teachers until one of the coveted state-funded chair positions might become available. He held that the introduction of this specific academic concept was instrumental in furthering the innovative character and international reputation of higher education in Germany. Flügel himself had thrived under the competitive conditions, of course, and his entrepreneurial spirit led him to make a number of auspicious foundational moves: He took on co-editorship of Anglia, today the oldest continuously published journal worldwide focusing exclusively on the study of “English.” And he founded Anglia Beiblatt, a review journal that quickly established an international reputation.

Despite his formidable achievements, however, he could not secure a chair position as quickly as he hoped. Since he was among the very few late 19th-century German professors of English who possessed near-native proficiency, he began to consider opportunities overseas. Even the dire warnings from a number of east coast colleagues ("the place seems farther away from Ithaca, than Ithaca does from Leipzig"; "they have at Stanford a library almost without books") could not scare him away. Once he had begun his academic adventure in the Californian wilderness, he took on a gargantuan research project, the editorship of the Chaucer Dictionary, offered to him by Frederick James Furnivall, the most entrepreneurial among British Chaucerians and founder of the Chaucer Society. As soon as he took over from colleagues who had given up on the project, he found, in this pre-computer age of lexicography, "slips of all sizes, shapes, colors, weights, and textures, from paper that was almost tissue paper to paper that was almost tin. Every slip contained matter that had to be reconsidered, revised, and often added to or deleted.”

Undeterred by this disastrous state of affairs, he decided to resolve the problem with typically enterprising determination: Although grant writing was uncharted territory for him, he applied for and secured three annual grants for $7,500 and one for $11,000 (altogether the equivalent of at least $300,000 in today’s money!) from the Carnegie Foundation for the Advancement of Teaching between 1904 and 1907 "for the preparation of a lexicon for the works of Geoffrey Chaucer," bought himself some time away from Stanford, and signed up a dozen colleagues and students in Europe and North America to assist him in his grand plan.

His and their work would become the foundation of the compendious Middle English Dictionary, which now graces every decent college library in the English-speaking world and beyond. Beyond the work on the Chaucer Dictionary, the completion of which he never saw because of his sudden death in 1914, he maintained an impressive publication record and served in leadership positions such as the presidency of the Pacific Branch of the American Philological Association. When Flügel passed away, his American colleagues celebrated his "enthusiastic idealism" and remembered him as "more essentially American" than the other foreign-born colleagues they knew, an appreciation due to his entrepreneurial spirit.

I am relating this story to counteract the often defeatist chorus sung by colleagues in English and other humanities departments when confronted with a request, usually from impatient administrators in more grant-active areas, for at least giving grant writing and other entrepreneurial activities a try. There is no doubt that, compared to the situation in most other Western democracies, government support through the National Endowments for the Humanities and Arts is small in the U.S. Conversely, the number of private foundations, from the American Council of Learned Societies through the Spencer Foundation, makes up for some of the difference.

In my experience, what keeps the majority of English professors from even considering an involvement with entrepreneurial activities is that they deem them an unwelcome distraction from the cultural work they feel they have been educated, hired, and tenured to do. Most grant applications require that scholars explain not only the disciplinary, but also the broader social and cultural relevance of their work. In addition, they entail that scholars put a monetary value on their planned academic pursuits and create a bothersome budget sheet, learn how to use a spreadsheet, develop a timeline, and compose an all-too-short project summary, all grant-enabling formal obstacles many colleagues consider beneath the dignity of their profession.

In fact, many of us believe that the entire discipline of English and the humanities in general may have been created so as to counterbalance the entrepreneurial principles and profit motives which, from within the English habitat, seem to have a stranglehold over work in colleges of business, computing, engineering, and science. However, by making English a bastion of (self-)righteous resistance against the evil trinity of utilitarianism, pragmatism, and capitalism, English professors have relinquished the ability to be public intellectuals and to shape public discourse. After all, too many of our books and articles speak only to ourselves or those in the process of signing up to our fields at colleges and universities.

Ewald Flügel labored hard to remain socially and politically relevant even as he was involved in professionalizing and institutionalizing the very discipline we now inhabit. Recognizing that the skills and kinds of knowledge provided by his emerging field were insufficient for solving complex real-world issues, he became a proponent of a more co-disciplinary approach to academic study, a kind of cultural studies scholar long before that term was invented. Most of us would agree that he applied his formidable linguistic and literary expertise to a number of problematic goals, speaking to academic and public audiences about how the steadily increasing German immigration and the powers of German(ic) philology should and would inevitably turn the United States into an intellectual colony of his beloved home country. However, even if his missionary zeal reeks of the prevailing nationalist zeitgeist, I can appreciate his desire to experiment, innovate, and compete to make the study of historical literature and language as essential to the academy and to humanity as did his approximate contemporaries Roentgen, Eastman, Edison, Diesel, Marconi, and Pasteur with their scientific endeavors.

Perhaps his example might entice some of us to revisit and even befriend the idea of entrepreneurship, especially when it involves NGOs or the kind of for-profit funding sources the Just Enough Profit Foundation might define as (only) "mildly predatory" or (preferably) "somewhat," "very" and "completely humanistic." At the very least, Flügel’s biography provides evidence that today’s prevailing anti-entrepreneurial mindset has not always been among the constitutive elements defining the "English" professoriate.

There are encouraging signs that some colleagues in English studies have begun to abandon that mindset: George Mason University’s Center for Social Entrepreneurship (directed by Paul Rogers, a professor of English) and the University of Texas consortium on Intellectual Entrepreneurship (directed by Richard Cherwitz, a professor of rhetoric and communication) generate promising cross-disciplinary collaboration between the academy and society; English professors at Duke, Georgia Tech, and Ohio State, funded by the Bill & Melinda Gates Foundation, are among the national leaders testing the pedagogical viability of the controversial massive open online courses (MOOCs); and Elliott Visconsi of the University of Notre Dame and Katherine Rowe of Bryn Mawr College created Luminary Digital Media LLC, a startup that distributes their "Tempest for iPad," an application designed for social reading, authoring, and collaboration for Shakespeare fans with various levels of education. I believe Ewald Flügel would find these projects exciting.

Richard Utz is professor and chair in the School of Literature, Media, and Communication at the Georgia Institute of Technology.


Essay critiques Garrison Keillor for his jokes about English majors

Dear Garrison,

After yet another joke on "A Prairie Home Companion" about an English major who studies Dickens and ends up at a fast-food restaurant frying chickens, I couldn’t take it anymore. I had to write.

You and I go way back. I started listening to you during my undergraduate years as an English major in the mid-'80s and continued while in graduate school in English literature, when making a nice dinner and listening to "Prairie Home" was my Saturday night ritual. I get that you’re joking. I get the whole Midwesterner takedown of — and fascination with — cultural sophistication that animates your show. I get that you yourself were an English major. And I get affectionate irony.

I’m afraid, however, that jokes about bitter and unemployed English majors that are already unfortunate in an economy humming along at 4.5 percent unemployment are downright damaging when the unemployment rate is near 8 percent — and some governors, in the name of jobs, are calling for liberal arts heads. Likewise, the most recent annual nationwide survey of the attitudes of college freshmen reported an all-time high in the number of students who said that "to be able to get a better job" (87.9 percent) and "to be able to make more money" (74.6 percent) were "very important" reasons to go to college. Not surprisingly, the same survey reported that the most popular majors were the most directly vocational: business, the health professions, and engineering (biology was also among the most popular). 

The truth, however, is that reports of the deadliness of English to a successful career are greatly exaggerated. According to one major study produced by the Georgetown University Center on Education and the Workforce, the median income for English majors with a bachelor’s but no additional degree is $48,000. This figure is just slightly lower than that for bachelor’s degree holders in biology ($50,000), and slightly higher than for those in molecular biology or physiology (both $45,000). It’s the same for students who received their bachelor’s in public policy or criminology (both $48,000), slightly lower than for those who received their bachelor’s in criminal justice and fire protection ($50,000) and slightly higher than for those who received it in psychology ($45,000). 

Another study by the same center paints a similar picture with respect to unemployment. In this study, the average unemployment rate for recent B.A. holders (ages 22-26) over the years 2009-10 was 8.9 percent; for English it was 9.2 percent. Both rates are higher than we would wish, but their marginal difference is dwarfed by that between the average for holders of the B.A. and that of high school graduates, whose unemployment rate during the same period was 22.9 percent (also too high). 

Of course, majors in engineering and technology, health, and business often have higher salary averages, between $60,000 (for general business) and $120,000 (for petroleum engineering), and marginally lower unemployment rates, especially for newly minted B.A.s. But there’s nothing reckless about majoring in English compared to many other popular majors. Students who love business or engineering, or who are good at them and simply want to earn the highest possible income, make reasonable choices to pursue study in these fields. But students who want to major in English and are good at it should not believe that they are sacrificing a livelihood to pursue their loves. And students who don’t love what they are learning are less likely to be successful.

Because this kind of information is readily available, it makes me wonder why you, Garrison — and you’re not alone — continue to dump on English as a major. I think it must be because in the world of Lake Wobegon the English major has cultural pretensions that need to be punished with loneliness and unemployment. Likewise, the Midwesterner in you can’t believe that anyone who gets to do these things that you yourself love so much — revel in the pleasures of language and stories — could also be rewarded with a decent job.

Garrison, when it comes to English majors, let your inner Midwesterner go. You can study English and not be a snob. And you can study English and not fail in the world. I know you know these things; you’ve lived them. So my plea to you, Garrison, is this. Your "Writer’s Almanac" does a terrific job promoting the love of language and the study of English. But in my media market it plays at 6:35 am. Even where it gets better play, it has nowhere near the prominence of "A Prairie Home Companion." Can you find a way on the latter to tell stories about English majors that don’t involve failure? These stories would make a fresh alternative on your show to a joke way past its sell-by date. And they might make a few parents less likely to discourage their kids from studying English.

And here’s my final plea to all former English majors. "A Prairie Home Companion" can help, but English also needs its "CSI" or "Numb3rs." I know some of you are out there now writing for television and film. I admit it will take some creative chops to develop stories about English study that are as glamorous and engaging as crime drama. But you were an English major. I know you can do it. And it’s time to pay it forward.

Sincerely,
Robert Matz
Chair, English Department
George Mason University

P.S. to all former English majors: Since writing this letter I’ve learned about a new Fox TV show called "The Following" that features an English professor. He’s a serial killer who inspires others to kill. Maybe next time the English professor could be the hero? Thanks.

Review of Jason Dittmer, "Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics"

Intellectual Affairs

In an essay first published in 1948, the American folklorist and cultural critic Gershon Legman wrote about the comic book -- then a fairly recent development -- as both a symptom and a carrier of psychosexual pathology. An ardent Freudian, Legman interpreted the tales and images filling the comics’ pages as fantasies fueled by the social repression of normal erotic and aggressive drives. Not that the comics were unusual in that regard: Legman’s wider argument was that most American popular culture was just as riddled with misogyny, sadomasochism, and malevolent narcissism. And to trace the theory back to its founder, Freud had implied in his paper “Creative Writers and Daydreaming” that any work of narrative fiction grows out of a core of fantasy that, if expressed more directly, would prove embarrassing or offensive. While the comic books of Legman’s day might be as bad as Titus Andronicus – Shakespeare’s play involving incest, rape, murder, mutilation, and cannibalism – they certainly couldn’t be much worse.

But what troubled Legman apart from the content (manifest and latent, as the psychoanalysts say) of the comics was the fact that the public consumed them so early in life, in such tremendous quantity. “With rare exceptions,” he wrote, “every child who was six years old in 1938 has by now absorbed an absolute minimum of eighteen thousand pictorial beatings, shootings, stranglings, blood-puddles, and torturings-to-death from comic (ha-ha) books alone, identifying himself – unless he is a complete masochist – with the heroic beater, strangler, blood-letter, and/or torturer in every case.”

Today, of course, a kid probably sees all that before the age of six. (In the words of Bart Simpson, instructing his younger sister: “If you don't watch the violence, you'll never get desensitized to it.”) And it is probably for the best that Legman, who died in 1999, is not around to see the endless parade of superhero films from Hollywood over the past few years. For in the likes of Superman, he diagnosed what he called the “virus” of a fascist worldview.

The cosmos of the superheroes was one of “continuous guilty terror,” Legman wrote, “projecting outward in every direction his readers’ paranoid hostility.” After a decade of supplying Superman with sinister characters to defeat and destroy, “comic books have succeeded in giving every American child a complete course in paranoid megalomania such as no German child ever had, a total conviction of the morality of force such as no Nazi could even aspire to.”

A bit of a ranter, then, was Legman. The fury wears on the reader’s nerves. But he was relentless in piling up examples of how Americans entertained themselves with depictions of antisocial behavior and fantasies of the empowered self. The rationale for this (when anyone bothered to offer one) was that the vicarious mayhem was a release valve, a catharsis draining away frustration. Legman saw it as a brutalized mentality feeding on itself -- preparing real horrors through imaginary participation.

Nothing so strident will be found in Jason Dittmer’s Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics (Temple University Press), which is monographic rather than polemical. It is much more narrowly focused than Legman’s cultural criticism, while at the same time employing a larger theoretical toolkit than his collection of vintage psychoanalytic concepts. Dittmer, a reader in human geography at University College London, draws on Homi Bhabha’s thinking on nationalism as well as various critical perspectives (feminist and postcolonial, mainly) from the field of international relations.

For all that, the book shares Legman’s cultural complaints to a certain degree, although none of his work is cited. But first, it’s important to stress the contrasts, which are, in part, differences of scale. Legman analyzed the superhero as one genre among others appealing to the comic-book audience -- and that audience, in turn, as one sector of the mass-culture public. 

Dittmer instead isolates – or possibly invents, as he suggests in passing – a subgenre of comic books devoted to what he calls “the nationalist superhero.” This character-type first appears, not in 1938, with the first issue of Superman, but in the early months of 1941, when Captain America hits the stands. Similar figures emerged in other countries, such as Captain Britain and (somewhat more imaginatively) Nelvana of the Northern Lights, the Canadian superheroine. What set them apart from the wider superhero population was their especially strong connection with their country. Nelvana, for instance, is the half-human daughter of the Inuit demigod who rules the aurora borealis. (Any relationship with actual First Nations mythology here is tenuous at best, but never mind.)

Since Captain America was the prototype -- and since many of you undoubtedly know as much about him as I did before reading the book, i.e., nothing -- a word about his origins seems in order. Before becoming a superhero, he was a scrawny artist named Steve Rogers who followed the news from Germany and was horrified by the Nazi menace. He tried to join the army well before the U.S. entered World War II but was rejected as physically unfit. Instead, he volunteered to serve as a human guinea pig for a serum that transformed him into an invincible warrior. And so, as Captain America -- outfitted with shield and spandex in the colors of Old Glory -- he went off to fight Red Skull, who was not only a supervillain but a close personal friend of Adolf Hitler.

Now, no one questions Superman’s dedication to “truth, justice, and the American way,” but the fact remains that he was an alien who just happened to land in the United States. His national identity is, in effect, luck of the draw. (I learn from Wikipedia that one alternate-universe narrative of Superman has him growing up on a Ukrainian collective farm as a Soviet patriot, with inevitable consequences for the Cold War balance of power.) By contrast, Dittmer’s nationalist superhero “identifies himself or herself as a representative and defender of a specific nation-state, often through his or her name, uniform, and mission.”

But Dittmer’s point is not that the nationalist superhero is a symbol for the country or a projection of some imagined or desired sense of national character. That much is obvious enough. Rather, narratives involving the nationalist superhero are one part of a larger, ongoing process of working out the relationship between the two entities yoked together in the term “nation-state.”

That hyphen is not an equals sign. Citing feminist international-relations theorists, Dittmer suggests that one prevalent mode of thinking counterposes “the ‘soft,’ feminine nation that is to be protected by the ‘hard,’ masculine state” -- which is also defined, per Max Weber, as claiming a monopoly on the legitimate use of violence. From that perspective, the nationalist superhero occupies the anomalous position of someone who performs a state-like role (protective and sometimes violent) while also trying to express or embody some version of how the nation prefers to understand its own core values.

And because the superhero genre in general tends to be both durable and repetitive (the supervillain is necessarily a master of variations on a theme), the nationalist superhero can change, within limits, over time. During his stint in World War II, Captain America killed plenty of people in combat with plenty of gusto and no qualms. It seems that he was frozen in a block of ice for a good part of the 1950s, but was thawed out somehow during the Johnson administration without lending his services to the Vietnam War effort. (He went to Indochina just a couple of times, to help out friends.) At one point, a writer was on the verge of turning the Captain into an overt pacifist, though the publisher soon put an end to that.

Even my very incomplete rendering of Dittmer’s ideas here will suggest that his analysis is a lot more flexible than Legman’s denunciation of the superhero genre. The book also makes more use of cross-cultural comparisons. Without reading it, I might never have known that there was a Canadian superhero called Captain Canuck, much less the improbable fact that the name is not satirical.

But in the end, Legman and Dittmer share a sense of the genre as using barely conscious feelings and attitudes in more or less propagandistic ways. They echo one of the 20th century's defining concerns: the role of the irrational in politics. And that doesn't seem likely to become any less of a problem any time soon.
