Time has not been especially kind to Richard Posner’s Public Intellectuals: A Study of Decline (Harvard, 2001). The frequent complaints about scholars wandering beyond their areas of expertise to pontificate on the Clinton impeachment feel like yesteryear’s editorials. The book’s statistical tables tried to quantify the influence of various thinkers and writers – as registered, for one thing, by Google’s turn-of-the-millennium algorithm. And even Posner’s overarching generalizations seem now to have been overcome by events. It retains some interest as a landmark, though, even at this growing distance.
Posner took the old notion of a “marketplace of ideas” in a new direction by treating the activity of public intellectuals as governed by supply and demand. “With hundreds of television channels to fill,” he wrote, “with the Internet a growing medium for the communication of news and opinion, and with newspapers becoming ever more like magazines in an effort to maintain readership in the face of the lure of continuously updated news on television and over the Internet, the opportunity cost to media of providing a platform for public intellectuals has shrunk.”
The ultimate consumer of the public intellectuals’ “symbolic goods” was the lay reader or viewer, who lacked (and presumably desired) the specialized information, the command and context, and the analytical tools used by the commentators. Unfortunately this also meant that the public was in no position to judge the quality of the goods being proffered. “The media through which the public intellectual reaches his audience perform virtually no gatekeeping function,” wrote Posner. “The academic whose errors of fact, insight, and prediction in the public-intellectual market are eventually detected can, as I have emphasized, abandon the market, returning to full-time academic work, at slight cost.”
Rereading the book now, I get the feeling that Posner had a satirical novel inside him that might have held up better than its nonfiction substitute. His model is specific to roughly the last half of the 20th century. It takes as a given the one-way flow of communication from academic specialists, through mass media, to a mass audience incapable of judging what it receives and unable to generate any “symbolic goods” of its own.
All that has changed, for good and for ill. The credentialed specialist and the uninformed layperson turn out to be endpoints of a continuum, rather than absolute opposites. Any given idea or analysis can now inspire a Socratic colloquy. Of course, it’s just as likely to inspire a howling mob of abject ignoramuses, but then, Socrates’s interventions in public discourse did not always turn out well, either.
The term “public intellectual” itself, according to Posner, “was coined by Russell Jacoby in a book published in 1987.” In fact it was first used by C. Wright Mills in 1958, but the phrase entered wide usage only in the wake of Jacoby’s The Last Intellectuals: American Culture in the Age of Academe. Mills and Jacoby were referring to something quite different from Posner’s cohort of moonlighting celebrity academics. Rather, they had in mind generations of writers and thinkers for whom the demands of either the university or mass media were a minor concern, if even that. My essay for Bookforum on the 20th anniversary of The Last Intellectuals discusses the cultural ecology that made such figures possible, and the changes rendering them all but extinct.
Since the book appeared, Jacoby has published a few more volumes, as well as a great many essays and reviews, though seldom through an academic press or journal. And his position in the history department at the University of California at Los Angeles is sufficiently irregular – he is listed as “professor in residence” and does not have tenure – to suggest someone half in the door and half out. He lists among his awards the Moishe Gonzales Folding Chair in Critical Theory – an homage to the late social theorist Paul Piccone, founding editor of the journal Telos. The improbable name Moishe Gonzales was the pseudonym Piccone used for some of his particularly scathing critiques of academic trends.
The news that someone had made a documentary about Jacoby came as a surprise. It also made me realize that, after reading him for more than a quarter of a century, I had no idea what he looked like. If the Posnerian public intellectual is a talking head, clearly the Jacobean variety is not – or was not, anyway, until the appearance of "Velvet Prisons: Russell Jacoby on American Academia," released on DVD and currently available for viewing as part of the Humanity Explored film festival hosted by Culture Unplugged, which describes itself as a “new media studio.” (Not sure how that would work unplugged, but never mind.)
Ten or 15 minutes into watching "Velvet Prisons," curiosity about its origins got the better of me, so I hit pause and made contact with Kurt Jacobsen, one of the producers and directors, whose name was familiar from various publications including Logos: A Journal of Modern Society and Culture, where he is book-review editor. With Warren Leming – an actor and musician who has directed a number of documentaries – he founded Cold Chicago Productions, which brought out two films before "Velvet Prisons," its latest release. (Another, "American Road," will be out this summer.)
Jacobsen refers to the enterprise as Debtors Prison Productions, since “the budget came out of our thin pockets, like everything else we do.” In 2008, Leming invited Jacoby to come to Chicago for extensive interviews, running to six hours of footage. Jacobsen, a research associate in political science at the University of Chicago, held Jacoby’s work in high regard and was glad to participate in the interviews, although the project itself seemed unlikely to get much funding. “We went ahead because it seemed a needed thing,” he writes in an email note, “a necessary intervention.” The producers spent five or six thousand dollars out of pocket: “That leaves out the incalculables of hundreds of hours of free labor by myself and Warren and some others.”
"Velvet Prisons" sketches Jacoby’s intellectual development from high school through his years on the academic job market, while also working in brief characterizations of most of Jacoby’s books – some of them, such as Social Amnesia: A Critique of Contemporary Psychology (1975) and Dialectic of Defeat: Contours of Western Marxism (1981), in very broad strokes, to be sure.
“Our key challenge,” Jacobsen says, “was how to keep a solo talking head, no matter how provocative or profound, visually interesting. Initially we thought we might only hold the most dedicated viewers for half an hour but eventually worked out and settled on a 55-minute version.” The finished product incorporates historically pertinent film footage and book covers, as well as portraits of philosophers and sociologists, sometimes accompanied by passages from their work read in voice-over.
One particularly memorable and effective sequence appears in the course of Jacoby’s very sharp comments on the academic mores that marginalize writers with an interest in addressing a general and educated audience – an ethos that “rewards careerism and networking and backslapping” and people “making quiet non-contributions to micro-fields” rather than “taking it big,” as his hero C. Wright Mills encouraged young sociologists to do. As he begins to discuss the forces pushing scholars to focus on talking about their work only with one another, the screen fills with photographs taken in the meeting rooms and auditoriums of hotel conference centers. The chairs and the ambiance are always the same. (My immediate reaction, on first viewing, was to scan the pictures, expecting to find a familiar face.) How is this in any way preferable to what Posner complained about – the colleagues willing to provide grist for media blather mills?
"There is nothing said in ‘Velvet Prisons,’ by the way,” Jacobsen tells me in the course of our e-mail discussion, “that does not resonate with my own experiences and observations in the darling groves of academe.” He calls the documentary “the proverbial labor of love, and lament…. [The] worst thing I've heard [about Velvet Prisons] is a British scholar friend calling it an ‘elegy’ -- and he probably has a point.”
He says that Jacoby “was very genial, quite modest and, I think, awfully shocked when we actually came up with the doc.” As a matter of fact, by that point Russell Jacoby himself had answered a request for his thoughts on the film, and they corroborated the director’s impressions.
“I did not think they were serious,” Jacoby responded by e-mail. “Why me? I did indeed sit for some interviews, but I really thought that I would never hear from them again. I could not imagine the project going forward. To my great surprise it did go forward. It turned out they were serious. I still don't get it. I'm in no position to judge it. I find it embarrassing to watch.” His response to seeing himself hold forth on screen was “Who is that idiot?”
Hardly a fair assessment. "Velvet Prisons" will irritate some people very much, while many more will watch it with interest and sympathy and even decide to go read Jacoby’s books. All to the good, either way. But my own impression is that the documentary feels unfinished, perhaps because Jacoby’s interpretation of “American culture in the age of academe” is unfinished.
It is, at the very least, in need of an update. Arguing that the pursuit of tenure distorts the development and ethos of young intellectuals has begun to sound like someone complaining that the visual quality of a film is ruined when put on VHS. It may be true, but it’s a problem for fewer and fewer people all the time. At the same time, Jacoby has little to say about the situation of the public intellectual now, with the means of communication between thinker and public in flux. "Velvet Prisons" is itself an instance of such change.
It would be worth having another documentary in which Russell Jacoby follows up the arguments left undeveloped in his cinematic debut. But that, alas, remains unlikely. “My cinematic debut,” he told me, “will converge with my cinematic exit.”
Toward the end of one summer — 1994, to be precise — I arrived at St. Lawrence University as an 18-year-old freshman, excited yet nervous to begin my college career. I had a vague notion that I wanted to be a writer someday, though I didn’t really have an idea of what that would entail or how difficult it would be. I wasn’t particularly anxious about the classes I would be taking — though in hindsight, judging by my grades that first semester, I probably should have been.
No, my concerns were more social in nature. Would I like my roommate? Who would become my friends? Would the people who promised in my high school yearbook that we would be "friends forever" still matter to me, and I to them, by the time we saw each other again at Thanksgiving? Would I finally have sex? The answers to these questions were: Not particularly, a bunch of people, some, and no.
The last answer was the most devastating, to the freshman me, but all in all, that first year of college was a good experience. I read King Lear. I learned from my new female friends that feminists were not, as I had been led to believe, castrating man-haters. I saw my first Kurosawa film. I attended several meetings of the Black Student Union — for the first time, I experienced what it’s like to be the only white person in a room. I was in a play. I perfected my impressions of both R.E.M.’s Michael Stipe and the B-52’s Fred Schneider, in order to entertain my friends on Friday nights fueled by cheap beer and Boone’s Farm "wine products." I read memoirs and essays by the likes of Tobias Wolff, Piri Thomas, and Maxine Hong Kingston that created and nourished my interest in creative nonfiction forms.
As that first year came to a close, I was a little stressed by final exams and papers, and somewhat concerned that I’d never get a girlfriend. Mostly, though, I thought college was an exciting, intellectually challenging, and fun place to be, and I knew I didn’t ever want to leave. So, with the exception of a short break due to some health issues, I really didn’t — I went to grad school, eventually earned a Ph.D., and have been employed on college campuses ever since.
I’ve recently returned to my beloved alma mater — which I’ve written about for Inside Higher Ed before — in order to teach creative writing and literature. This one-year visiting position came along at a time when, to be honest, I had been thinking about getting out of the academy altogether. Although I still loved teaching and writing and developing as a scholar and thinker, I had begun to feel, at the very least, like I did not belong — and could not stay — at the college where I had been working since 2008. There were many reasons for this feeling, but the important point is that I realized that I was unhappy where I was — that this was not the job I thought it would be. Worse still, I began to fear that the problem wasn’t that specific location, but rather that I’m not cut out for this line of work. So I returned to the scene of the crime, the place where I first learned to love literature, writing, and the academic life.
In "Once More to the Lake," E.B. White talks of returning to the lake where his father used to take the family on vacations, this time as a grown man with a son of his own. The essay is noteworthy for a variety of reasons, but kind of funny for his insistence that this place is just as he remembered it, even though he gives a list of things that have changed. "I could tell," he notes after observing the fact that the road leading to the camp was now paved, "that it was going to be pretty much the same as it had been before....” Or when talking about the nearby store: "Inside, all was as it had been, except...." Or the waitresses who serve them their pie, who were "the same country girls, there having been no passage of time, only the illusion of it as in a dropped curtain — the waitresses were still fifteen; their hair had been washed, that was the only difference — they had been to the movies and seen the pretty girls with the clean hair."
Different, but the same. Timeless, yet pushed forward in time. I didn’t really understand White’s disorientation until I returned to St. Lawrence. As White returns to the lake as a father, I’ve returned to St. Lawrence as a professor. He feels, at times, his own father next to him — or perhaps within him, as if he has become his father by bringing his son to this place. I teach in "The Shakespeare Room" in Richardson Hall, dedicated to Emeritus Professor of English Thomas L. Berger, my own Shakespeare professor from 15 years ago, whose blown-up photograph hangs on the wall to my left as I do my best to lead a discussion on Emily Dickinson.
Professor Berger isn’t really beside me, just as White’s father is not with him, yet his presence on that wall reminds me of what type of professor I want to be — erudite, funny, and maybe a little bit intimidating to students who haven’t done the reading.
On days when it’s not too cold — and here in New York’s North Country, those days can be few and far between this time of year — I like to walk around campus. I made a point of showing my wife the dorm I lived in freshman year, where I met the friend who would later ask me to be the godfather to her son. I walked through the building that now houses the theater and fine arts department, but that used to be the student union, where we would occasionally get pizza or burgers at the Northstar Pub, which stopped selling beer after my freshman year but was still called "The Pub" when I graduated. The new student union — located in a more centralized area of campus — houses the Northstar Café, but the students still call it "The Pub" for reasons that are probably a complete mystery to them.
As I was walking home from a poetry reading on campus one night last semester, a student smoking in front of his dorm called out "Dr. Bradley!" and walked toward me in order to talk about class. I haven’t had a cigarette in years, but I almost asked him for one. It seemed like the thing to do. Smoke a cigarette, talk about what you’d been reading. How many times did I do just that with my friends? Those actors and singers and painters and writers who were all so into this world they were just discovering. How many cigarettes did I smoke, talking about Uta Hagen, or Annie Dillard, or Quentin Tarantino? Of course, we smoked inside, back then. It was the '90s. A different era.
White notes that the souvenir counters at the store offer "postcards that showed things looking a little better than they looked," which is sometimes how the past seems when we reflect. If I talk of loving college, I should also tell you that I frequently drove myself crazy, putting the finishing touches on a paper at 4:30 when it was due at 5:00, then running around campus with a disk in hand, trying to find an available printer (again, it was the '90s). There were those times, towards the end of the semester, when — out of money on my meal card — I had to eat sandwiches made of generic white bread and processed cheese slices for every meal. And there were the romantic relationships. They all started out fun, but frequently ended with someone crying.
Still, if the experience was sometimes painful, it was also always educational. I wouldn’t want to trade those experiences or forget those lessons — they’ve shaped the writer, teacher, friend, and husband I am today. And something about this experience of being back on this campus has reminded me — and I’m shocked that I needed to be reminded — that my students are having those very same experiences right now. They’re reading something that’s going to change their lives. They’re falling in love. They’re learning not to send e-mails drunk. They’re listening to the Velvet Underground for the very first time. They’re figuring out who they’re going to be as they begin their adult lives.
So much is different. Everything’s the same.
In my previous Inside Higher Ed column, I talked about remembering my own youthful mistakes when I find myself frustrated with my students. I’m glad to have such perspective — it sometimes saves my sanity — but I’m also glad to remember how awesome it was to be young, to be humbled by the realization that there was so much out there to learn. I had lost some of that enthusiasm in the years since my own undergrad days, but being here, seeing and identifying with these students, has caused me to remember. As a 21st-century academic, it’s awfully easy to get nervous and jaded — it seems like every day, someone from outside of the academy is throwing around words and phrases like "strategic dynamism," "innovative disruption" or "paradigm shift" that don’t really mean anything to me except that the speaker or author doesn’t think very highly of the work we do in the academy, or at least the way we do it. I frequently feel embattled or unappreciated, but this year at my old school has reminded me that I didn’t go to grad school to make politicians or business leaders like me. I went because I wanted to help young people have the same life-changing experience I had.
It’s cold here in Canton right now — one day this week, it didn’t even get above zero — but you wouldn’t know it from all the activity happening on campus. There are informational meetings for students interested in studying abroad in the Czech Republic and Thailand. There’s a screening of the film "Argo." The student organization dedicated to environmental activism is having a vegetarian dinner, open to all interested students. There are athletic events. And, of course, there are classes. I’m not saying that these are activities special to St. Lawrence — I’m sure if you work on a college campus, similar stuff is happening around you. But sometimes, I think, the stress of our jobs causes us to forget what an awesome place a vibrant campus can be.
At the end of White’s essay, he talks of feeling "the chill of death" as he watches his son prepare to swim in the rain, but my recent experience with students at my alma mater has reminded me of how powerful it can be, to be surrounded by the warmth of lives that are really just beginning. I don’t know where I’ll be in a few months, but I’m glad for having learned this lesson this year.
William Bradley is visiting assistant professor of English at St. Lawrence University.
After yet another joke on "A Prairie Home Companion" about an English major who studies Dickens and ends up at a fast-food restaurant frying chickens, I couldn’t take it anymore. I had to write.
You and I go way back. I started listening to you during my undergraduate years as an English major in the mid-'80s and continued while in graduate school in English literature, when making a nice dinner and listening to "Prairie Home" was my Saturday night ritual. I get that you’re joking. I get the whole Midwestern takedown of — and fascination with — cultural sophistication that animates your show. I get that you yourself were an English major. And I get affectionate irony.
I’m afraid, however, that jokes about bitter and unemployed English majors that are already unfortunate in an economy humming along at 4.5 percent unemployment are downright damaging when the unemployment rate is near 8 percent — and some governors, in the name of jobs, are calling for liberal arts heads. Likewise, the most recent annual nationwide survey of the attitudes of college freshmen reported an all-time high in the number of students who said that "to be able to get a better job" (87.9 percent) and "to be able to make more money" (74.6 percent) were "very important" reasons to go to college. Not surprisingly, the same survey reported that the most popular majors were the most directly vocational: business, the health professions, and engineering (biology was also among the most popular).
The truth, however, is that reports of the deadliness of English to a successful career are greatly exaggerated. According to one major study produced by the Georgetown University Center on Education and the Workforce, the median income for English majors with a bachelor’s but no additional degree is $48,000. This figure is just slightly lower than that for bachelor’s degree holders in biology ($50,000), and slightly higher than for those in molecular biology or physiology (both $45,000). It’s the same for students who received their bachelor’s in public policy or criminology (both $48,000), slightly lower than for those who received their bachelor’s in criminal justice and fire protection ($50,000) and slightly higher than for those who received it in psychology ($45,000).
Another study by the same center paints a similar picture with respect to unemployment. In this study, the average unemployment rate for recent B.A. holders (ages 22-26) over the years 2009-10 was 8.9 percent; for English it was 9.2 percent. Both rates are higher than we would wish, but their marginal difference is dwarfed by that between the average for holders of the B.A. and that of high school graduates, whose unemployment rate during the same period was 22.9 percent (also too high).
Of course, majors in engineering and technology, health, and business often have higher average salaries, between $60,000 (for general business) and $120,000 (for petroleum engineering), and marginally lower unemployment rates, especially for newly minted B.A.s. But there’s nothing reckless about majoring in English compared to many other popular majors. Students who love business or engineering, or who are good at them and simply want to earn the highest possible income, make reasonable choices to pursue study in these fields. But students who want to major in English and are good at it should not believe that they are sacrificing a livelihood to pursue their loves. And students who don’t love what they are learning are less likely to be successful.
Because this kind of information is readily available, it makes me wonder why you, Garrison — and you’re not alone — continue to dump on English as a major. I think it must be because in the world of Lake Wobegon the English major has cultural pretensions that need to be punished with loneliness and unemployment. Likewise, the Midwesterner in you can’t believe that anyone who gets to do these things that you yourself love so much — revel in the pleasures of language and stories — could also be rewarded with a decent job.
Garrison, when it comes to English majors, let your inner Midwesterner go. You can study English and not be a snob. And you can study English and not fail in the world. I know you know these things; you’ve lived them. So my plea to you, Garrison, is this. Your "Writer’s Almanac" does a terrific job promoting the love of language and the study of English. But in my media market it plays at 6:35 am. Even where it gets better play, it has nowhere near the prominence of "A Prairie Home Companion." Can you find a way on the latter to tell stories about English majors that don’t involve failure? These stories would make a fresh alternative on your show to a joke way past its sell-by date. And they might make a few parents less likely to discourage their kids from studying English.
And here’s my final plea to all former English majors. "A Prairie Home Companion" can help, but English also needs its "CSI" or "Numb3rs." I know some of you are out there now writing for television and film. I admit it will take some creative chops to develop stories about English study that are as glamorous and engaging as crime drama. But you were an English major. I know you can do it. And it’s time to pay it forward.
Chair, English Department
George Mason University
P.S. to all former English majors: Since writing this letter I’ve learned about a new Fox TV show called "The Following" that features an English professor. He’s a serial killer who inspires others to kill. Maybe next time the English professor could be the hero? Thanks.
In an essay first published in 1948, the American folklorist and cultural critic Gershon Legman wrote about the comic book – then a fairly recent development – as both a symptom and a carrier of psychosexual pathology. An ardent Freudian, Legman interpreted the tales and images filling the comics’ pages as fantasies fueled by the social repression of normal erotic and aggressive drives. Not that the comics were unusual in that regard: Legman’s wider argument was that most American popular culture was just as riddled with misogyny, sadomasochism, and malevolent narcissism. And to trace the theory back to its founder, Freud had implied in his paper “Creative Writers and Daydreaming” that any work of narrative fiction grows out of a core of fantasy that, if expressed more directly, would prove embarrassing or offensive. While the comic books of Legman’s day might be as bad as Titus Andronicus – Shakespeare’s play involving incest, rape, murder, mutilation, and cannibalism – they certainly couldn’t be much worse.
But what troubled Legman apart from the content (manifest and latent, as the psychoanalysts say) of the comics was the fact that the public consumed them so early in life, in such tremendous quantity. “With rare exceptions,” he wrote, “every child who was six years old in 1938 has by now absorbed an absolute minimum of eighteen thousand pictorial beatings, shootings, stranglings, blood-puddles, and torturings-to-death from comic (ha-ha) books alone, identifying himself – unless he is a complete masochist – with the heroic beater, strangler, blood-letter, and/or torturer in every case.”
Today, of course, a kid probably sees all that before the age of six. (In the words of Bart Simpson, instructing his younger sister: “If you don't watch the violence, you'll never get desensitized to it.”) And it is probably for the best that Legman, who died in 1999, is not around to see the endless parade of superhero films from Hollywood over the past few years. For in the likes of Superman, he diagnosed what he called the “virus” of a fascist worldview.
The cosmos of the superheroes was one of “continuous guilty terror,” Legman wrote, “projecting outward in every direction his readers’ paranoid hostility.” After a decade of supplying Superman with sinister characters to defeat and destroy, “comic books have succeeded in giving every American child a complete course in paranoid megalomania such as no German child ever had, a total conviction of the morality of force such as no Nazi could even aspire to.”
A bit of a ranter, then, was Legman. The fury wears on the reader’s nerves. But he was relentless in piling up examples of how Americans entertained themselves with depictions of antisocial behavior and fantasies of the empowered self. The rationale for this (when anyone bothered to offer one) was that the vicarious mayhem was a release valve, a catharsis draining away frustration. Legman saw it as a brutalized mentality feeding on itself – preparing real horrors through imaginary participation.
Nothing so strident will be found in Jason Dittmer’s Captain America and the Nationalist Superhero: Metaphors, Narratives, and Geopolitics (Temple University Press), which is monographic rather than polemical. It is much more narrowly focused than Legman’s cultural criticism, while at the same time employing a larger theoretical toolkit than his collection of vintage psychoanalytic concepts. Dittmer, a reader in human geography at University College London, draws on Homi Bhabha’s thinking on nationalism as well as various critical perspectives (feminist and postcolonial, mainly) from the field of international relations.
For all that, the book shares Legman’s cultural complaints to a certain degree, although none of his work is cited. But first, it’s important to stress the contrasts, which are, in part, differences of scale. Legman analyzed the superhero as one genre among others appealing to the comic-book audience – and that audience, in turn, as one sector of the mass-culture public.
Dittmer instead isolates – or possibly invents, as he suggests in passing – a subgenre of comic books devoted to what he calls “the nationalist superhero.” This character type first appears not in 1938, with the debut of Superman, but in the early months of 1941, when Captain America hits the stands. Similar figures emerged in other countries, such as Captain Britain and (somewhat more imaginatively) Nelvana of the Northern Lights, the Canadian superheroine. What set them apart from the wider superhero population was their especially strong connection with their country. Nelvana, for instance, is the half-human daughter of the Inuit demigod who rules the aurora borealis. (Any relationship with actual First Nations mythology here is tenuous at best, but never mind.)
Since Captain America was the prototype – and since many of you undoubtedly know as much about him as I did before reading the book, i.e., nothing – a word about his origins seems in order. Before becoming a superhero, he was a scrawny artist named Steve Rogers who followed the news from Germany and was horrified by the Nazi menace. He tried to join the army well before the U.S. entered World War II but was rejected as physically unfit. Instead, he volunteered to serve as a human guinea pig for a serum that transformed him into an invincible warrior. And so, as Captain America – outfitted with shield and spandex in the colors of Old Glory – he went off to fight Red Skull, who was not only a supervillain but a close personal friend of Adolf Hitler.
Now, no one questions Superman’s dedication to “truth, justice, and the American way,” but the fact remains that he was an alien who just happened to land in the United States. His national identity is, in effect, luck of the draw. (I learn from Wikipedia that one alternate-universe narrative of Superman has him growing up on a Ukrainian collective farm as a Soviet patriot, with inevitable consequences for the Cold War balance of power.) By contrast, Dittmer’s nationalist superhero “identifies himself or herself as a representative and defender of a specific nation-state, often through his or her name, uniform, and mission.”
But Dittmer’s point is not that the nationalist superhero is a symbol for the country or a projection of some imagined or desired sense of national character. That much is obvious enough. Rather, narratives involving the nationalist superhero are one part of a larger, ongoing process of working out the relationship between the two entities yoked together in the term “nation-state.”
That hyphen is not an equals sign. Citing feminist international-relations theorists, Dittmer suggests that one prevalent mode of thinking counterposes “the ‘soft,’ feminine nation that is to be protected by the ‘hard,’ masculine state” -- which is also defined, per Max Weber, as claiming a monopoly on the legitimate use of violence. From that perspective, the nationalist superhero occupies the anomalous position of someone who performs a state-like role (protective and sometimes violent) while also trying to express or embody some version of how the nation prefers to understand its own core values.
And because the superhero genre in general tends to be both durable and repetitive (the supervillain is necessarily a master of variations on a theme), the nationalist superhero can change, within limits, over time. During his stint in World War II, Captain America killed plenty of people in combat, with gusto and no qualms. It seems that he was frozen in a block of ice for a good part of the 1950s, but was thawed out somehow during the Johnson administration without lending his services to the Vietnam War effort. (He went to Indochina just a couple of times, to help out friends.) At one point, a writer was on the verge of turning the Captain into an overt pacifist, though the publisher soon put an end to that.
Even my very incomplete rendering of Dittmer’s ideas here will suggest that his analysis is a lot more flexible than Legman’s denunciation of the superhero genre. The book also makes more use of cross-cultural comparisons. Without reading it, I might never have known that there was a Canadian superhero called Captain Canuck, much less the improbable fact that the name is not satirical.
But in the end, Legman and Dittmer share a sense of the genre as using barely conscious feelings and attitudes in more or less propagandistic ways. They echo the concerns of one of the 20th century's definitive issues: the role of the irrational in politics. And that doesn't seem likely to become any less of a problem any time soon.
In his inaugural address, President Obama referred repeatedly to education – but exclusively to education in STEM disciplines, as if only those fields had a defensible public purpose. Sadly, this is no aberration: in December the White House issued a report entitled "Transformation and Opportunity: The Future of the U.S. Research Enterprise," which completely overlooked research in the humanities and social sciences, even in its brief history of the growth of research at American universities.
Such a narrow focus is surprising, as the president himself apparently consults historians (and probably other scholars); and it is counterproductive, whether in strict dollars and cents terms or broader ones. Some politicians have gone further, aggressively asserting that various humanities and social science disciplines are useless, and attempting to impose higher tuitions on students who major in them, making it all the more important that those who know better actively affirm the value of teaching and research beyond the STEM fields.
I will focus here on the case for history: it is what I know best, and since history straddles the line between humanities and social sciences, many arguments for its importance apply to various allied fields. One might loosely group these into three categories, ranging from the most social scientific to the most humanistic. The first applies to lessons drawn from circumstances relatively close to our own; the second to learning about times and places we know are quite different. The third applies to research showing that some currently accepted ideas are actually fairly novel, and that people not so different from us did without them; engaging the concepts they used instead may help us see additional possibilities in the world, whether for good or ill.
Examples of the first category underlie almost any sound public policy debate, as well as many private deliberations. Take, for example, the 2009 stimulus bill. By itself, no mathematical calculation could assess the relative accuracy of the more-or-less Keynesian models suggesting that the stimulus would help the economy and the "real business cycle" models, which predicted that it would be an expensive waste. The difference lay in historical research about how various modern economies had responded to historically specific policy initiatives. Other examples abound, though most are less well-known: closest to home in this regard would be evaluating options for STEM investment in light of the vast literature on what has given rise to specific clusters of innovation in the past, and which innovations proved most beneficial. One would also expect development efforts to gain from examining research on past relationships among, say, education, urbanization, birthrates, and investment.
The benefits of research into the importance of understanding differences in the context of policy decisions abound, with special clarity emerging in what we might call "area studies" knowledge – an enormous part of the growth of U.S. research universities after WWII. Surely we could have saved lives and money had policy-makers known more about religious differences within Iraqi society, the political and social history of Afghanistan, or class relations and popular nationalism in Vietnam before military interventions in those places. The same, I would argue, goes for using research into the evolution of Chinese notions of ethnicity, nationality, race, and geopolitics to understand likely governmental and popular reactions to possible American policies on Tibet, trade, the Diaoyu/Senkaku Islands, and so on.
Perhaps less obvious, but equally important, is the usefulness of research that shows that many ideas we may take to be "natural," or at least of very long standing, are actually relatively new. Some of these insights may be "just" a contribution to increased self-understanding, but others bear directly on public issues. Urgent debates over how fixed the concept of "marriage" has been come first to mind, but there are many more actual and potential examples. Recognizing that the term "ethnic group" is barely 75 years old reminds us how mutable are our understandings of the basis and implications of human groupings; that "gross national product" is of roughly the same vintage suggests that maximizing that particular measurement is not inevitably the paramount goal of economic policy.
It hardly seems a stretch to think that a world facing our current challenges might benefit from awareness of other ways that people have thought about the relationship of work, citizenship, adult status, "independence" and dignity, or about consumption, economic growth, leisure and the nature of progress. Or to take some narrower examples, consider the implications of learning how relatively recently life insurance went from seeming like morally dubious gambling on death to a taken-for-granted tool for managing risk. Or that, while (as Thomas Ricks noted in a recent Atlantic article) almost no U.S. generals were removed from their commands for poor performance during Vietnam, Afghanistan, or Iraq, many were so removed during World War II – suggesting that the recent situation does not represent an inevitable feature of government, much less of hierarchy generally. Historical knowledge of this kind does not provide lessons as straightforward as “deficit spending can work,” but it can add significantly to our understandings of what is possible, for better or worse, and how things may become, or cease to be, unthinkable.
Research that produces these results, both testing earlier certainties and responding to new questions, thus seems a useful, even necessary complement to research in the STEM fields. Fortunately, most historical research is also relatively cheap, but it does not thrive on complete neglect.
Kenneth Pomeranz is University Professor of History at the University of Chicago and president of the American Historical Association. The views expressed here are his alone.