“Pessimism of the intellect,” runs a familiar saying from the Italian revolutionary theorist Antonio Gramsci, “optimism of the will.” In other words, plan as if the worst-case scenario were inevitable, but act with all the vigor and confidence necessary to win in the (very) long term. It is an inspiring quotation -- or seemed to be, the first several thousand times I heard it.
Gramsci himself suffered years of imprisonment under Mussolini; his laconic advice had a certain moral authority. When it caught on among American leftists during the Reagan years, things were not nearly that bad. But we kept reciting it, and eventually the repetition wore Gramsci's incisive formulation down into a trite formula. It came to embody not courage so much as a mood of profound ineffectiveness.
While interviewing Todd Gitlin recently for an Inside Higher Ed podcast, I was tempted to ask if he had deliberately avoided using Gramsci’s line in his new book, The Bulldozer and the Big Tent: Blind Republicans, Lame Democrats, and the Recovery of American Ideals, just published by John Wiley and Sons.
Gitlin was once president of Students for a Democratic Society, the largest of the New Left organizations in the 1960s. Now he’s a professor of journalism and sociology at Columbia University. In recent years, he has greatly annoyed some people by suggesting that the American left has not only painted itself into a corner, but even revels in its own marginality and distance from power.
“Doesn’t defeat taste sweet in a good cause?” he asks in the introduction to The Intellectuals and the Flag, a collection of essays that Columbia University Press published in 2006. “The honest truth is that negativity has its rewards and they are far from negligible.... It grants nobility. It stokes the psychic fires. Defeated outrage cannot really be defeated. It burns with a sublime and cleansing flame. It confirms one’s righteousness. It collapses the indeterminate future into a burning present.”
Against this, Gitlin has counseled a less strident and more pragmatic-minded approach to progressive politics -- one that places economic concerns ahead of questions of culture and identity. Richard Rorty made the same call in his book Achieving Our Country (Harvard, 1998), which advised the left to “put a moratorium on theory” and “try to mobilize what remains of our pride in being Americans” by learning to ask itself “how the country of Lincoln and Whitman might be achieved.”
Naturally this did not go over very well in some quarters. It was dubbed “left conservatism,” and denounced in solemn convocations. The debate unfolded while Bill Clinton was still in office (if just barely, for a while there) and soon exhausted itself without, it seems, any participant changing anyone else’s mind.
Perhaps Gitlin, Rorty et al. were right that a combination of Nietzsche and Nader was a recipe for political irrelevance. But they often seemed to be treating the “cultural left” as scapegoats for failures that mainstream American liberalism had achieved by its own devices. (Identity politics did not put Dukakis in a tank. No queer theorist bombed a pharmaceutical factory in the Sudan.)
At the same time, the academic radicals who complained about “left conservatism” were clearly quite content talking only to themselves. When Judith Butler announced that "the critique of cultural iconicity is the means by which cultural iconicity is achieved," it was not the sort of slogan anyone would want to put on a banner. Maybe the old fogeys who preferred “an injury to one is an injury to all” did have a point.
Today we are several disastrous years downstream from that heated exchange. A lot has changed, but not everything. The radical instinct to form a circular firing squad remains unchanged; it is in the genome, probably. Still, the reflex has been interrupted from time to time by distractions from the White House and Iraq. The president’s impending appointment with the dustbin of history -- taking Rove’s “permanent Republican majority” with him, it seems -- permits and even requires some effort to imagine a change of course in the near future.
None of the leading Democratic candidates really counts as a person of the left (no matter what crazy uncle Larry says on his “Down with Marxist Hillary and Obama Commie” blog). One of Gitlin’s points in his new book is that even the forces calling for a relatively modest sort of liberal reformism are only one part of the “big tent” of Democratic activism.
And even were the Democrats in control of the executive and legislative branches, the Republican Party would still be able to rely on its organizational “bulldozer” -- capable of staying relentlessly on message, even (and especially) when reality gets in the way. It is “a focused coalition with two, and only two, major components,” writes Gitlin, “the low-tax, love-business, hate-government enthusiasts and the God-save-us moral crusaders.”
In contrast, the Democrats subsume “roughly eight” constituencies, by Gitlin’s reckoning: “labor, African Americans, Hispanics, feminists, gays, environmentalists, members of the helping professions (teachers, social workers, nurses), and the militantly liberal, especially antiwar denizens of avant-garde cultural zones such as university towns, the Upper West Side of Manhattan, and so on.”
This is not the place to rehearse Gitlin’s whole analysis. He gave an overview of the book at TPM Cafe recently, and our podcast discussion covers some of the major points.
But it seems worth noting that Gitlin's earlier complaints about “identity” and the jargonizing folkways of the academic left, while not entirely absent from The Bulldozer and the Big Tent, are much less prominent here than in some of his other writings. He appears to recognize that said cohorts do indeed have a place under the big tent -- over in the section for “the militantly liberal” and “antiwar denizens of avant-garde cultural zones.”
The more I think about this, the less sure I am what to make of it. And it’s not just being called a denizen. (You get used to that.)
Treating labor as one constituency and defining it as distinct from blacks, Latinos, feminists, and gays might make sense insofar as each has its own lobbying apparatus inside the Beltway. But in real life (and in the polling booth, for that matter) the terms of identity are by no means clearcut. Gitlin refers to Jesse Jackson’s role as “the voice of post-sixties interest-group liberalism.” Which is maybe fair enough, as far as it goes -- but it doesn’t account for Jackson’s surprisingly strong primary showings among white labor unionists in 1988.
Gitlin writes that now, as the Bush period comes to an end, we may be able at last to “get on with an adult discussion of how Americans may afford health care and decent housing, win decent employment and fair wages, dampen inequalities, stifle murderous enemies, and sustain a livable earth for generations present and future.”
Well said. Speed the day. And when it comes, a large helping of realpolitik will be essential. (Inspirational passages from Antonio Gramsci, maybe not so much.) To repair the damage done to this country over the past six years might take decades -- and that’s putting things with all the optimism anyone can reasonably muster.
But any progressive force that is up to the task will need to do more than tolerate its own multiplicity. It will have to be able to build on its actual strengths -- not all of which are credited by the “left conservative” tendency to treat economic egalitarianism as the primary criterion for social progress.
Ten years ago, state recognition of civil unions (let alone marriage) between same-sex couples was not really part of the public debate. Today, however, it is. The Republican Party has to spend a considerable part of its energy defending the principle that the right to get drunk in Vegas and have a wedding must be restricted to a specific configuration of participants. This makes them look kind of silly to a lot of people, including some Republicans.
When a significant portion of the public accepts the idea that gays and lesbians have at the very least a right to civil unions, this raises the possibility that the "bulldozer" might just rev its engine into overdrive and go flying off a cliff.
Now, to be frank, I am enough of a “left conservative” to wish that we were discussing nationalizing the oil companies instead. But realpolitik means working with what you’ve got.
Any tendency to regard “mere” cultural politics as a distraction from the real work of assembling progressive forces is not a matter of being tough-minded and practical. It means ignoring the battles you are already winning -- and taking for granted the forces that have led the fight. There are various things to call such a strategy, but “pragmatic” would not be one of them.
Members of the U.S. Army Reserves are often called “citizen-soldiers” – an expression that definitely carries more honorific overtones than another label that has sometimes been applied to them, “weekend warriors.” The latter phrase is not just insulting but now hopelessly out of date. The men and women portrayed by Michael Musheno and Susan M. Ross in their new book Deployed: How Reservists Bear the Burden of Iraq, published by the University of Michigan Press, can hardly be called amateur soldiers. They were not only sent to Baghdad but assigned to guard an overcrowded and under-equipped prison camp. (Not the one at Abu Ghraib, though it sounds like they get that question a lot.) And like other reservists, they found themselves serving repeated tours of duty – drafted in everything but name.
The authors interviewed 46 members of a unit they call the 893rd Army Reserve Military Police Company (a fictitious name) between May and September 2004. Although Deployed includes some tables of demographic data on interview subjects, the book is far more focused on qualitative than quantitative information – an analysis of how, as Musheno and Ross put it, “they adapted to long deployments while coping with family relations, military duties, and civilian careers.”
The result is an account of the various ways the citizen-soldiers of the Army Reserves dealt with the potential fault-line embodied in that hyphen. In a way, it is a matter of emphasis. For those whose accent falls on the citizen side of the hyphen, the authors argue, “identity is primarily anchored in the relationships and structures within civilian life, including but not limited to family, community, civilian work worlds, and education.” Others understand themselves primarily as soldiers, taking their bearings from the “heavy demands on reservists’ time, energy, activities, and emotions” made by the military. “Civilian relationships, jobs, and goals are placed on the back burner,” write Musheno and Ross. Their sense of family life is “bifurcated into categories of ‘blood family’ (traditional family members left behind) and the newly developed ‘army-green family’ (brothers and sisters in arms).”
I interviewed the authors by e-mail. They wrote their responses together, like the book itself. The transcript follows. Michael Musheno is chair of the department of criminal justice studies at San Francisco State University; and Susan M. Ross is associate professor of sociology and chair of the department of criminal justice at Lycoming College.
Q: Perhaps the most surprising thing about your project is that you had the cooperation of military officials in doing your research. It's also striking how eager the reservists you interviewed seemed to be to talk about their lives and experiences. Did you worry that they might be speaking to you "under orders"? Were you ever aware of any effort by authorities to find out what was being said during the interviews?
A: Perhaps the people most surprised by the willingness of many of the reservists to participate in the project ended up being the two of us. We had been warned multiple times by veterans long since retired from the military that we would have a hard time getting servicemen and women to open up to us, though as you note, this was not the case. In total, we spoke with two-thirds of the men and women of the 893rd who were still attending drill weekends and had served in Iraq.
For those who did participate, we took the added precaution, beyond a standard letter of informed consent, of offering that if they had any feeling of having been “ordered” to participate, we would simply sit there with them in silence for a half hour or so to give the illusion that an actual interview had taken place. None of the reservists took us up on this offer, and we remain confident that they were well aware that working with us, unlike the mandatory military drug testing that was taking place over the same period, was in fact voluntary.
As for the authorities, the company and platoon commanders were respectful and enthusiastic supporters of the project, out of a genuine and frequently expressed concern for the well-being of the men and women serving in their company. We never had reason to doubt their integrity. Although a battalion-level sergeant major stopped by to check on our progress one afternoon, a brief overview of some very general and early findings left him satisfied and allowed us to carry on uninterrupted thereafter.
Q: Social-scientific research into the U.S. military goes back at least to the work of Samuel Stouffer and his colleagues on "The American Soldier" almost 60 years ago. Did you have a sense of all that work being in the background? Or did it seem to you that the reserves constituted a largely unexplored topic?
A: There are several rich pockets of military research in history and the social sciences, but much of the research on soldiering over the last thirty or so years has been dominated by psychology and is skewed toward the topic of soldiers as damaged individuals returning from war. With this framework as a backdrop, we were completely taken aback when we encountered reservists who talked of being on deployment as “like being on vacation” from the struggles of their civilian home lives. Nothing in the military literature had really prepared us for meeting a young enlisted man, Dennis Harris, who wistfully remarked, “And sometimes I was wishin’ I was still back in Iraq, ‘cause none of my problems were really back there. I didn’t have my problems with my ex back there, and I didn’t have to worry about it ‘cause I was doin’ other stuff. But when I got home, I had to worry about it then, and it’s just like it was as if it was hard to let go.”
Rather than hearing some canon within military scholarship being echoed by Dennis, and colleagues who expressed similar sentiments, what we heard was a variation on the theme of work becoming family and family becoming work articulated so wonderfully by sociologist Arlie Hochschild in her book The Time Bind. So while we were well-grounded in a variety of military literatures, we also drew on our backgrounds in family sociology for Susan and public policy for Michael.
Q: You found from your interviews that reservists tended to fall into three broad categories. The "adaptive" reservists experience a comfortable fit between their family backgrounds and their service; they often grew up in military families. The "struggling" reservists find that military service gives them some degree of escape from difficult civilian lives, including sometimes complicated situations at home. A third group, the "resistant" reservists, are the least accepting of the demands that the military places on them; they just want to get out and go on with their lives.
Do you find any evidence that these distinctions correspond to personal alliances or social groupings within the reserves themselves? Do the reservists themselves have some version of this categorization as part of the lore or "folk sociology" of life in the reserves?
A: Although the reservists would definitely make comparisons between their own experiences and those of other reservists, they never employed language as explicit as the phrases we use to identify these clusters of reservists. Their daily interactions with one another, particularly while they were on the ground in Iraq, revolved around whom they worked with in the prison as well as their squads and platoons, as we describe.
Gender was certainly important in defining relationships when soldiers were in their makeshift barracks in the prison, with women grouped together and watching each other’s backs. The phrases “adaptive,” “struggling,” and “resistant” are our way of articulating groupings of reservists that became apparent to us as we pored over the transcripts of these conversations. There are definitely times in which similar experiences draw a group of soldiers together. There was one group of struggling reservists who would routinely gather around a campfire at night and talk through the difficulties their wives or fiancées were having back home and try to develop strategies for dealing with these problems.
Also, reservists often distinguished their experiences and perspectives from others and would encourage us to talk to a buddy of theirs who had had a completely different experience. Sometimes an adaptive reservist, concerned about a friend's situation at home, would point us toward someone we in turn came to describe as a struggling reservist. Not all comparisons, however, were stated with such tenderness.
For example, when asked how his own engagement had weathered the deployments when so many others had not, one adaptive soldier, Seth Walker, talked about how he felt that there were a lot of “silly” engagements that people had entered into in the face of the deployments, engagements he felt were not well thought through. While Seth is not using the formal language we develop in the book, he is comparing his own social support networks and the strength of his own relationship with his fiancée to those of struggling reservists.
Also, Brad Whitman, a resistant reservist, points out his political divergence from many of the other soldiers, mostly those we came to see as adaptive soldiers, when he says, “it was a more personal struggle for me while I was there, not for so many other people because most of the majority of your military personnel are your Pat Buchanan or G. Gordon Liddy fanatics. But for me and a few others, it was a more personal level of you got this feeling like we’re here for definitely the wrong reasons.” While Brad revealed the characteristics of what we came to call the resistant reservists, he worked side by side with those we came to call adaptive and struggling reservists. And when some resistant reservists stopped coming to drill after returning from Iraq, soldiers in the adaptive cluster respected this choice or offered up excuses for colleagues who were technically AWOL.
Q: You point out that reservists have at times been given uncomplimentary treatment by military professionals, as in being dubbed REMFs. (The first two letters stand for "Rear Echelon.") What did you learn about how that sort of tension shaped the way reservists dealt with the dual demands of the "citizen-soldier" role?
A: The members of the 893rd were keenly aware of their status as “weekend warriors” and were frequently reminded of it by members of the active-duty component, who questioned their ability to handle assigned soldiering duties, first during their stateside deployment and again in Iraq. The reservists took offense at this skepticism, particularly in light of having proven their worth to the U.S. military over the course of their deployments, and they took pride in having served in a war zone that offered no comfort in distinguishing between front and rear lines of battle.
When the 893rd deployed, its ranks held trained civilian police officers, correctional officers, computer technicians, carpenters, and students of criminal justice, and the members of the company firmly believe that their ability to draw upon both their military training and their civilian skills ultimately made them more effective soldiers.
The hope of returning home from Iraq to a hero’s welcome and wiping clean the reserve stigma was abruptly snuffed out with the April 2004 release of the photos from Abu Ghraib prison. The 893rd did not serve at that now infamous prison, but soon came to understand that they, and all members of the Reserve, would have to bear the burden of this newly formed black eye.
Q: Your project is, for the most part, descriptive. But what conclusions have you drawn for the future? Have you given any thought to what kinds of policy implications might follow?
A: While there is thick description within the book, particularly in privileging the voices of the soldiers, our work is interpretive in nature. What we present is our interpretation of what we heard as we listened to 46 citizen-soldiers who have given extraordinary service to the nation under the Bush administration’s “war on terror.” Although at first glance the 2.4 million men and women serving in the all-volunteer military may seem an impressive number, compared to a total U.S. population of over 300 million it is clear that the cost of the political decision to enter Iraq has been borne disproportionately by a very small number of Americans, many of whom live their civilian lives on the edge of an increasingly shaky economy.
Beginning with George Washington, many leaders in the U.S. have called for universal national service as part of our duties as citizens. But when push comes to shove, the raising of an army to fight a ground war has always fallen disproportionately on the less privileged of American citizens. We don’t see this changing, particularly when patriotic fervor wanes and a war becomes prolonged and less popular. American military leaders reasoned after the Vietnam War that making the Reserve integral to a ground war would sober the political leadership of this country in taking the decision to go to war. Doing this did not stop the most recent march to war by our political leadership, and so we are back to a point where the public is skeptical of our political leadership, distrustful of the media’s accounting of the lead-up to war, and more aware of the costs of protracted war.
That will probably put a brake on going into another war in the near term, but it leaves our nation vulnerable to political and media propaganda when our firsthand memories fade, and still without a solution to our longstanding struggle over how to raise an army that can fight successfully when necessary and serve as a brake on the political leadership when not. We agree with those who advocate a program of national service that provides citizens with options, including becoming citizen-soldiers. We are not so naïve as to think that the story we tell about the sacrifices of the few will turn the tide, but it may fall upon the ears of future leaders who will require more sacrifices of the many and make clear the boundaries of sacrifices required of the few.
In the late 1940s, as Richard Rorty was finishing his undergraduate studies and considering a future as a professional philosopher, his parents began to worry about him. This is not surprising. Parents worry; and the parents of philosophers, perhaps especially. But just why Rorty's parents worried – well now, that part is surprising.
They were prominent left-wing journalists. His father, James, also had some minor reputation as a poet; and his mother, Winifred, had done important work on the sociology of race relations, besides trying her hand at fiction. In a letter, James Rorty speculated that going straight into graduate work might be something Richard would later regret. His son would do well to take some time “to discover yourself, possibly through a renewed attempt to release your own creative need: through writing, possibly through poetry....”
In short, becoming an academic philosopher sounded too practical.
Not to go overboard and claim that this is the defining moment of the philosopher’s life (Rosebud!). But surely it is the kind of experience that must somehow mark one’s deepest sense of priorities. How does that inner sense of self then shape a thinker’s work?
Neil Gross’s book Richard Rorty: The Making of an American Philosopher, to be published next month by University of Chicago Press, is not exactly a biography of its subject, who died last year. Rather, it is a study of how institutional forces shape an intellectual’s sense of personal identity, and vice versa. (Gross is currently in transit from Harvard University to the University of British Columbia, where as of this summer he will be an associate professor of sociology.)
Influenced by recent work in sociological theory – but with one eye constantly on the archive of personal correspondence, unpublished writings, and departmental memoranda – Gross reconstructs how Rorty’s interests and intellectual commitments developed within the disciplinary matrix of academic philosophy. He takes the story up through the transformative and sui generis work of Rorty’s middle years, Philosophy and the Mirror of Nature (1979) and Consequences of Pragmatism (1982).
This includes a look at Rorty’s complicated and unhappy relationship with his colleagues at Princeton University in the 1970s. “I find it a bit terrifying,” he wrote in a letter at the time, “that we keep turning out Ph.D.'s who quite seriously conceive of philosophy as a discipline in which one does not read anything written before 1970, except for the purposes of passing odd examinations.” Nor did it help that Rorty felt other professors were taking his ex-wife’s side in their divorce. (What’s the difference between departmental gossip and cultural history? In this case, about 30 years.)
Gross has written the most readable of monographs; and the chapter titled “The Theory of Intellectual Self-Concept” should be of interest even to scholars who aren’t especially concerned with Rorty’s long interdisciplinary shadow. I interviewed Gross recently by e-mail, just before he headed off to Canada. The transcript of our discussion follows.
Q: You identify your work on Richard Rorty not as a biography, or even as a work of intellectual history, but rather as an empirical case study in "the new sociology of ideas." What is that? What tools does a sociologist bring to the job that an intellectual historian wouldn't?
A: Sociology is a diverse field, but if I had to offer a generalization, I'd say that most sociologists these days aim to identify the often hidden social mechanisms, or cascading causal processes, that help to explain interesting, important, or counterintuitive outcomes or events in the social world. How and why do some movements for social change succeed in realizing their goals when others fail to get off the ground? Why isn't there more social mobility? What exactly is the connection between neighborhood poverty and crime? Few sociologists think anymore that universal, law-like answers to such questions can be found, but they do think it possible to isolate the role played by more or less general mechanisms.
Sociologists of ideas are interested in identifying the hidden social processes that can help explain the content of intellectuals' ideas and account for patterns in the dissemination of those ideas. My book attempts to make a theoretical contribution to this subfield. I challenge the approaches taken by two of the leading figures in the area -- Pierre Bourdieu and Randall Collins -- and propose a new approach. I think that the best sociological theory, however, has strong empirical grounding, so I decided to develop my theoretical contribution and illustrate its value by deeply immersing myself in an empirical case: the development of the main lines of Richard Rorty's philosophy.
This entailed doing the same kind of work an intellectual historian would do: digging through archives, reading through Rorty's correspondence and unpublished manuscripts (to which he granted access), and of course trying to get a grasp on the diversity of Rorty's intellectual output for the period in question. This work is reflected in the first half of my book, which reads like an intellectual biography.
But the book isn't intended as a biography, and in the second half I try to show that thinking about Rorty's life and career in terms of the hidden social mechanisms at play offers unique explanatory leverage. I love intellectual history, but many intellectual historians are allergic to any effort at generalization. One of my aims in this book is to show them that they needn't be. The old sociology of knowledge may have been terribly reductive -- ideas are an expression of class interests or reflective of dominant cultural tendencies, etc., etc. -- but the sociology of ideas today offers much more fine-grained theoretical tools.
I only cover Rorty's life up until 1982 because by then most of the main lines of his philosophy had already been developed. After that point, he becomes for the sociologist of ideas a different kind of empirical case: an intellectual superstar and bête noire of American philosophy. It would be fascinating to write about the social processes involved with this, but that was too much for one book.
Q: This might seem like a chicken-or-egg question.... Did an interest in Rorty lead you toward this sociological approach, or vice versa?
A: When I was a graduate student in the 1990s I read quite a bit of Rorty's work, and found it both interesting and frustrating. But my interest in the sociology of ideas developed independently. For me, Rorty is just a case, and I remain completely agnostic in the book about the value of his philosophy.
Q: But isn't there something already a little bit pragmatism-minded about analyzing a philosopher's work in sociological terms?
A: It's certainly the case that there are affinities between pragmatism and the sociology of knowledge. But I'm not trying to advance any kind of philosophical theory of knowledge, pragmatist or otherwise. I believe, like every other sociologist of ideas, that intellectuals are social actors and that their thought is systematically shaped by their social experiences. Whether that has any philosophical implications is best left to philosophers to figure out.
I do think that the classical pragmatist philosophers Charles Peirce, William James, John Dewey, and George Herbert Mead had it right in their account of human social action, as Hans Joas has persuasively argued. Some of their insights do make their way into my analysis.
Q: A common account of Rorty's career has him starting out as an analytic philosopher who then undertakes a kind of "turn to pragmatism" in the 1970s, thereby reviving interest in a whole current of American philosophy that had become a preserve of specialists. Your telling is different. What is the biggest misconception embedded in that more familiar thumbnail version?
A: Rorty didn't start out as an analytic philosopher. His master's thesis at Chicago was on Whitehead's metaphysics, and while his dissertation at Yale on potentiality was appreciative in part of analytic contributions, one of its major aims was to show how much value there might be in dialogue between analytic and non-analytic approaches. As Bruce Kuklick has shown, dialogue between various philosophical traditions, and pluralism, were watchwords of the Yale department, and Rorty was quite taken with these metaphilosophical ideals.
Rorty only became seriously committed to the analytic enterprise after graduate school while teaching at Wellesley, his first job. This conversion was directly related to his interest in moving up in the academic hierarchy to an assistant professorship in a top ranked graduate program. At nearly all such programs at the time, analytic philosophy had come to rule the roost. This was very much the case at Princeton, which hired him away from Wellesley, and his commitment to analytic philosophy solidified even more during the years when he sought tenure there.
But the conventional account is flawed in another way as well. It turns out that Rorty read a lot of pragmatism at Yale -- Peirce in particular -- and one of the things that characterized his earliest analytic contributions was a consistent interest in pointing out convergences and overlaps between pragmatism and certain recent developments in analytic thought. So when he finally started calling himself a pragmatist later in his career, it was in many respects a return to a tradition with which he had been familiar from the start, however much he might have come to interpret it differently than specialists in American philosophy would.
Q: You argue for the value of understanding what you call "the intellectual self-concept." Would you explain that idea? What does it permit us to grasp about Rorty that we might not, otherwise?
A: As I've already suggested, my goal in this book was not simply to write a biography of Rorty, but also to make a theoretical contribution to the sociology of ideas. Surprising as it might sound to some, the leading figures in this area today -- to my mind Pierre Bourdieu and Randall Collins -- have tended to depict intellectuals as strategic actors who develop their ideas and make career plans and choices with an eye toward accumulating intellectual status and prestige. That kind of depiction naturally raises the ire of those who see intellectual pursuits as more lofty endeavors -- it was not for nothing that Bourdieu described his study, Homo Academicus, as a "book for burning."
I argue that intellectuals do in fact behave strategically much of the time, but that another important factor influencing their lines of activity is the specific "intellectual self-concept" to which they come to cleave. By this I mean the highly specific narratives of intellectual selfhood that knowledge producers may carry around with them -- narratives that characterize them as intellectuals of such and such a type.
In Rorty's case, one of the intellectual self-concepts that came to be terribly important to him was that of a "leftist American patriot." I argue that intellectual self-concepts, thus understood, are important in at least two respects: they may influence the kinds of strategic choices thinkers make (for example, shaping the nature of professional ambition), and they may also directly influence lines of intellectual activity. The growing salience to Rorty of his self-understood identity as a leftist American patriot, for example, was one of the factors that led him back toward pragmatism in the late 1970s and beyond -- or so I claim.
I develop in the book an account of how the intellectual self-concepts of thinkers form and change over the life course. Rorty took on the leftist American patriot self-concept pretty directly from his parents, and it became reactivated in the 1970s in response to political and cultural developments and also their deaths. My argument is that the sociology of ideas would do well to incorporate the notion of intellectual self-concept into its theoretical toolkit.
But I must say that my ambitions extend beyond this. Bourdieu and Collins are not just sociologists of ideas, but general sociological theorists who happened to have applied their models to intellectual life. Implicit in my respectful criticisms of them is a call to supplement and revise their general models as well, and to fold notions of identity and subjectivity back into sociological theory -- conceptualized in the specific way I lay out, which eclectically draws on Anglo-American social psychology, theories of narrative identity, the ego psychology of Erikson, and other sources.
Q: The philosopher's father, James Rorty, is reasonably well-known to cultural historians as one of the left-wing anti-Communist public intellectuals of the mid-20th century. Your account of his life is interesting, but I found a lot of it rather familiar. By contrast, the chapter on Richard Rorty's mother was a revelation. Winifred Rorty was clearly a remarkable person, and the question of her influence on her son seems very rich. What was it like to rediscover someone whose career might otherwise be completely forgotten?
A: It's well known that Rorty's mother, Winifred, was the daughter of social gospel theologian Walter Rauschenbusch. What's less well known is that she was a research assistant to the sociologist Robert Park at the University of Chicago. Winifred never entered academe -- she didn't formally enroll as a graduate student at Chicago, and in any event the opportunities for women on the academic labor market at the time were severely limited. Instead, after she left Chicago she worked, like her husband James, as a freelance writer and journalist. Her specialties were race riots and fashion. Very late in her life she wrote a biography of Park.
I ended up devoting one chapter each to Winifred and James because their influence on their son was profound, but also because theirs were fascinating stories that hadn't really been told before. Certainly there is no shortage of scholarship on the New York intellectuals -- a group of which they were loosely a part -- but both led remarkable and distinctive intellectual and writerly lives.
In the case of Rorty's mother I didn't set out to write about someone whose career might otherwise be forgotten, but I can say that it was a great pleasure to immerse myself in her papers and writings. Too often intellectual historians and sociologists of ideas alike focus their attention on the most prominent and "successful" thinkers, but feminist historians, among others, have helpfully reminded us that the stories of those whose careers have been stymied or blocked by discrimination or other factors can be every bit as rich and worth recovering.
Q: Suppose someone were persuaded to pursue research into Rorty's life and work after 1982, working from within the approach you call the "new sociology of ideas." What questions and problems concerning that period would you most want to see studied? What manner of archival resources or other documentary material would be most important for understanding the later phase of Rorty's career?
A: There are lots of questions about this period in Rorty's life that are worth pursuing, but I think among the most important would be to figure out why Rorty struck a chord with so many people and was vehemently hated by others, and what role exactly his scholarship played in the more general revival of interest in classical American pragmatism that has taken place over the past twenty years or so. My book focuses primarily on the development of ideas, whereas this would be a question of diffusion and reception. I don't think it's possible to give an answer to the question without doing a lot of careful empirical research.
One would want to know about the state of the various intellectual fields in which Rorty's work was received; about the self-concepts and strategic concerns of those who responded to him positively or negatively; about the role of intellectual brokers who helped to champion Rorty and translate his ideas into particular disciplinary idioms; about the availability of resources for pragmatist scholarship; about the role played by scholarly organizations, such as the Society for the Advancement of American Philosophy, in doing the kind of organizational work necessary to lay the groundwork for an intellectual revival; and so on. Here again one might use Rorty as a window into a more general social phenomenon: the emergence of what Scott Frickel and I have called "scientific/intellectual movements," in this case a movement aimed at reviving an intellectual tradition that had long been seen as moribund.
Q: Rorty gave you access to his papers. The notes to your book cite e-mail exchanges you had with him. Any personal impressions that stick with you, beyond what you've had to say in the monographic format?
A: Although Dick and I never formed a friendship, he wrote to me not long after his diagnosis to tell me about it, and to suggest that if I had any unanswered factual questions about his life, I might want to consider asking them of him sooner rather than later.
Some might see this as reflecting a concern to manage his reputation, but he read drafts of the book and -- without commenting on the plausibility of my thesis -- never asked me to change a thing. I think what it shows instead is that he was an incredibly generous, kind, and decent man, even in his final hours; he didn't want to leave a young scholar in the lurch.
Whatever one thinks of Rorty's philosophy, those are qualities all intellectuals could stand to emulate, and live by even in the midst of intense disagreement.
I am sick of reading about Malcolm Gladwell’s hair.
Sure, The New Yorker writer has funny hair. It has been big. Very big. It is audacious hair, hair that dares you not to notice it; hair that has been mentioned in far too many reviews. Malcolm Gladwell’s hair is its own thing.
Which is only appropriate, since in his writing, Gladwell has always gone his own way. But he’s been doing it long enough, and so well, and has made so much money, that some folks feel it’s time to trim him down to size. That hair is now seen as uppity.
Gladwell is a mere journalist. He’s not shy, and like many children of academics, he is not intimidated by eggheads. He does none of his own primary research, and instead scours academic journals to find interesting ideas -- he collects experiments and experimenters. He is a translator and a synthesizer, and comes up with catchy, sprightly titled theories to explain what he has seen. Some have called him a parasite. He has called himself a parasite.
It seems to me there’s always been a bit of snarkiness attached to discussions of Gladwell’s work. This is often the case for books that have become commercially successful, which is something that seems particularly to stick in the collective academic craw. There is a weird hostility in the reviews of Gladwell’s books that is directed not at the big-haired guy himself -- who, like a puppy, nips at the heels of academics and then relishes the opportunity to render their work into fluid, transparent prose -- but toward those many people who have made Gladwell famous: his readers. No one matches the caustic condescension of Richard Posner, who said, in a review of Gladwell’s Blink, that “it’s a book for people who don’t read books.”
The reviews of Outliers, Gladwell’s latest book, show that even a New Yorker writer can go too far. People are now attacking Malcolm Gladwell as a kind of brand. The critiques boil down to a few things, one of which is that he doesn’t take into account evidence that refutes his theories. In other words, he’s not doing careful scholarship. But we all know that even careful scholarship is a game of picking and choosing -- it just includes more footnotes acknowledging this. And Gladwell never pretends to be doing scholarship.
Gladwell is also accused of being too entertaining. He takes creaky academic work and breathes Frankensteinian life into it. He weaves anecdotes together, creating a tapestry that builds to an argument that seems convincing. This, some reviewers have claimed, is like perpetrating fraud on the (non-academic) reading public: because Gladwell makes it so much fun to follow him on his intellectual journey, he’s going to convince people of things that aren’t provably, academically true. He will lull the hoi polloi into thinking they’re reading something serious.
Which is, of course, the most common complaint about Gladwell: He’s not serious enough. He’s having too much fun playing with his ideas. And, really, you can’t be Serious when you’re raking in so much coin. Anyone who gets paid four million bucks for a book that mines academic work -- and not necessarily the stuff that is agreed to be Important -- is going to become a target. His speaking fees are beyond the budgets of most colleges. In this way, his career is now similar to that of David Sedaris, who can command an impressive audience and still be dissed by the literary folks. Everyone who’s anyone knows that you can’t sell a lot of books and be a serious writer. Just ask Jonathan Franzen. Or Toni Morrison.
I don’t see Gladwell as a social scientist-manqué, or a philosopher wannabe. Instead, I read him more like an essayist. I think of his books as well-written, research-packed, extended essays. Let me show you the evils of imperialism by telling you a story about the time in Burma when I was forced to shoot an elephant. Let’s look at this (bad) academic prose and think about the relationship between politics and the English language. But instead of using his own experiences, he builds on work done by others. He uses a wry, quirky approach and blithely ignores the received wisdom and pieties of academe. He doesn’t seek out the researcher who’s highly regarded within her field; he looks for people who are doing things he finds interesting.
Gladwell reminds me of the kind of student I knew in college, the nerd who takes weird and arcane courses and then rushes from the lecture hall excited about some idea the professor has mentioned in passing and goes straight to the library to pursue it himself. He stays up all night talking about it, and convincing you that even though you were in the same class, and heard the same reference, you have somehow missed something. Maybe not something big, but at least something really, really cool.
Perhaps I have too much trust in readers to believe that they can be so easily bought off by a good story. And I wish that academics, instead of pillorying Gladwell for being good at translating complicated ideas, would study the way he does it and apply some portion of his method to their own work: He makes mini trade books of monographs. Surely this is a lesson worth learning. He uses the narrative art of the magazine writer to animate ideas. He profiles theories the way Gay Talese or Joan Didion did celebrities.
The audacity Gladwell shows in his writing, connecting seemingly disparate things and working hard, yet with apparent effortlessness, to make the ideas engaging, gives me hope for the future of books. It makes me feel better to see folks buying Gladwell rather than the swimmer Michael Phelps’s memoir or vampire novels -- not that there’s anything wrong with that. Yet this same audacity is what gets Gladwell into hot water with academics. He’s not supposed to do this.
Unless you are an aged physicist, you don’t really get to write books that “purport to explain the world.” You can, of course, try to explicate tiny portions of it. Science writers like James Gleick and Jonathan Weiner can go a lot further than most scientists in terms of making arcane principles understandable to the Joe the Plumbers of the reading world and no one gets bent out of shape. Perhaps it’s because of the assumption that scientists, with a few notable (often British) exceptions, are not supposed to be able to write books that normal people can read. Social scientists and historians are, however, expected to be able to know what is interesting and important about their work and present it to the public. Brand name thinkers like Susan Sontag and Martha Nussbaum can take on big ideas. But these people are experts; journalists shouldn’t try this at home.
What I love about Gladwell is that his writing is like his hair. You can see it as arrogant or scary (he writes about being stopped more frequently by cops when he had a big afro), or you can see it as playful and audacious. This is why, of course, so many reviews mention it; he has the right hair for his work.
One final, dour complaint about Gladwell has to do with his relentless cheeriness. He thinks that people are basically good, though he understands that sometimes circumstances aren’t. I can’t abide high-brow literary novelists who trash fiction that “cops out” with a happy ending. Maybe I’m hopelessly low-brow: I still love Jane Austen and Shakespeare’s comedies. The academic response to most things is generally: it’s more complicated than that. And sure, much of the time it is. But if something’s artfully crafted, I’m willing to cut the author some slack. I don’t ever expect to be thoroughly persuaded of anything; I’m characterologically skeptical and like to do the thinking on my own. Gladwell’s books invite me into a conversation. I think that’s part of the job of a good book.
For me, reading Malcolm Gladwell’s books is like watching Frank Capra movies. Just because they make you feel good and keep you entertained doesn’t mean that they’re not doing valuable work or tackling hard and real issues and ideas. Sure, someone else could have handled it differently. George Bailey might have finally committed suicide; the bank in Bedford Falls could have asked for a government bailout. But right now, maybe it’s not such a bad thing to read books that are a little more hopeful. And yes, audacious.
Rachel Toor teaches in the MFA program at Eastern Washington University. She writes a monthly column for The Chronicle of Higher Education, and her most recent book is Personal Record: A Love Affair With Running. Her Web site is www.racheltoor.com.
You probably recall that in George Orwell’s 1984 the authorities bring Winston Smith to a torture chamber to break his loyalty to his beloved Julia. Perhaps you do not remember the room number. It is 101.
The modern university institutionalizes Orwell’s association of the number 101 with torture. Faculty and students often consider introductory courses an affliction.
I suspect that colleagues award teaching prizes to 101 instructors partly as compensation for relieving them of the agony of teaching introductory courses -- a suspicion that first occurred to me last year, when I shared an award with the University of Toronto’s Centre for the Study of Pain, much praised for its relief of suffering.
Why, then, do I teach introductory sociology? My colleagues have been too polite to remind me of the alleged downsides, but they are well known. First, teaching an introductory course is often said to be a time-consuming activity that interferes with research and writing -- the royal road to prestige, promotion, and merit pay. Second, it is reputedly boring and frustrating to recite the elementary principles of the discipline to young students, many of whom could not care less. Third, the 101 instructor performs supposedly menial work widely seen as suited only to non-tenured faculty members, advanced graduate students, and other personnel at the bottom rung of the academic ladder. Although I understand these arguments, I do not find them compelling. For me, other considerations have always far outweighed them.
In particular, teaching intro solves, for me, the much-discussed problem of public sociology. Some sociologists believe that working to improve human welfare is somehow unprofessional or unscientific. They hold that professional sociologists have no business drawing blueprints for a better future and should restrict themselves to analyzing the present dispassionately and objectively. However, to maintain that belief they must ignore what scientists actually do and why they do it. Sir Isaac Newton studied astronomy partly because the explorers and mariners of his day needed better navigational cues. Michael Faraday was motivated to discover the relationship between electricity and magnetism partly by his society’s search for new forms of power.
Today, many scientists routinely and proudly acknowledge that their job is not just to interpret the world but also to improve it, for the welfare of humanity; much of the prestige of science derives precisely from scientists’ ability to deliver the goods. Some sociologists know they have a responsibility beyond publishing articles in refereed journals for the benefit of their colleagues. One example is Michael Burawoy’s 2004 presidential address to the American Sociological Association, a gloss on Marx’s “Theses on Feuerbach,” in which Burawoy criticized professional sociologists for defining their job too narrowly and called for more public sociology. Still, many sociologists hold steadfastly to the belief that scientific research and public responsibility are at odds -- largely, I suspect, because they are insecure about whether their research is really scientific at all, and so feel they must be more papist than the pope.
Setting such anxieties aside, one is left with the question of how to combine professional pursuits with public responsibility. One option is conducting research that stimulates broad discussion of public policy. Some of my colleagues study how immigration policy limits the labour market integration and upward mobility of immigrants; others how family policy impairs child welfare; and still others how tax and redistribution policies affect inequality. To the degree they engage educated citizens in discussion and debate on such important issues, they achieve balance between their professional and public roles.
I have chosen a different route to public responsibility. I have conducted research and published for a professional audience, but I have also enjoyed the privilege of addressing hundreds of thousands of members of the public over the years by teaching Sociology 101 in large lecture halls and by writing textbooks for intro students in several countries. As Orwell wrote, communicating effectively to a large audience may be motivated by aesthetic pleasure and egoistic impulses. Who among us does not want to write clear and compelling prose and to be thought clever for doing so? But in addition, one may want to address a large audience for what can only be deemed political reasons.
In 1844, Charles Dickens read his recent Christmas composition, The Chimes, to his friend William Charles Macready, the most famous Shakespearean actor of the day. Dickens later reported the reading to another friend as follows: “If you had seen Macready last night -- undisguisedly sobbing, and crying on the sofa, as I read -- you would have felt (as I did) what a thing it is to have Power.” I understand Dickens. I, too, relish the capacity to move and to sway a large audience to a desired end because it signifies that my influence will not be restricted to a few like-minded academics and that I may have at least some modest and positive impact on the broader society. I find most students burn with curiosity about the world and their place in it, and I am delighted when they tell me that a lecture helped them see how patterned social relations shape what they can become in this particular historical context. On such occasions I know that I have taught them something about limits and potential -- their own and that of their society. Teaching intro thus allows me to discharge the public responsibility that, according to Burawoy and others, should be part of every sociologist’s repertoire.
In Marx’s words, “it is essential to educate the educators” -- especially those who persist in believing that teaching intro bores, frustrates, interferes, and suits only the academic proletariat.
Robert Brym is professor of sociology at the University of Toronto. A version of this essay first appeared in Academic Matters, which is published by the Ontario Confederation of University Faculty Associations.
Fifty years ago next month, C. Wright Mills published The Sociological Imagination, a classic critique of the field that includes, as an appendix, "On Intellectual Craftsmanship." The essay is part profession of faith and part practical handbook -- full of good advice, and not just for social scientists. "Scholarship is a choice of how to live as well as a choice of career," wrote Mills; "whether he knows it or not, the intellectual workman forms his own self as he works towards the perfection of his craft...."
I've lauded the piece here before, and was glad to see it in the table of contents for The Politics of Truth: Selected Writings of C. Wright Mills, a volume published last year by Oxford University Press and edited by John H. Summers, a visiting scholar at the Boisi Center for Religion and American Public Life at Boston College. But on closer examination, I saw that the editor hadn't simply reprinted the appendix. This version of "On Intellectual Craftsmanship" was rather different: it was taken from the text that Mills had mimeographed for his students in the mid-1950s.
This evidence of digging around in the archives left me eager to read more of the editor's own writings about Mills, listed in the bibliography, to see what insights he might have reached while excavating. And as luck would have it, we were introduced a short time later by a mutual friend. This somewhat expedited things, since Summers was just about to publish Every Fury on Earth (The Davies Group), a far-ranging collection of essays, including several on Mills.
Something of the maverick sociologist's feeling for intellectual craftsmanship runs throughout Summers' work. I don't recall the last time I read anything so ardent about scholarship as a means to soul-making -- or, for that matter, so angry at how academic life can distort that process. One of the remarkable things about Summers as a writer is that his frustration never runs to sarcasm -- no small accomplishment.
We recently exchanged a few rounds of e-mail about his work. A transcript of that exchange follows.
Q: You identify yourself as an anarchist and quote passages in which both James Agee and C. Wright Mills did, too. But it's not clear from your work (or theirs, for that matter) just how much this is a matter of feeling an affiliation with some strand of the anarchist movement, and how much it is a matter of personal temperament. What sort of anarchist are you?
A: May I split the difference between temperament and historical exemplars? Politically, anarchism is a democratic method for criticizing power; philosophically, a rough synonym for pragmatism, especially in William James's effort to defend the creativity of perception against the lure of abstraction and intellectualism.
Several years ago I began to notice writers and scholars whom I admired calling themselves anarchists; not only James, Mills, and Agee, but Dwight Macdonald, who called himself a conservative anarchist. What I did not notice, and still have not found, was a serious discussion of these impulses. (As Macdonald said, most educated Americans mistakenly believe anarchism means chaos.) So I was drawn to anarchism out of frustrated curiosity. Sensibility had something to do with it, but that's only to say the same thing twice: I don't discover such things about myself except by reading.
Q: Dwight Macdonald edited and contributed to little political magazines -- as did Mills -- but also wrote for large-circulation publications. A couple of essays in your book were first published in The Journal of American History, and another appeared in an edited collection of papers. But the rest were written for magazines, newspapers, and Web sites. That sort of thing is normally just tolerated, though not encouraged. Aren't you worried that being "public" means you aren't "professional"? Isn't that the kiss of death on the tenure track?
A: The University of Rochester never asked me to make an invidious distinction between the public and the professional, but taught history as a form of criticism. If that sounds amateurish, as if critics are less serious than bibliomaniacs, then consider a short list of distinguished students and graduates from the Rochester history department and marvel at the blend of scholarly erudition and public commitment animating their work: Chris Lehmann, Kevin Mattson, Christopher Phelps, Rochelle Gurstein, Casey Nelson Blake, Cathy Tumber, Russell Jacoby. Has any small history department in recent memory made a comparable contribution?
I've put in my professional time -- editing a column for the American Historical Association's newsletter and publishing refereed articles in Intellectual History Review and Left History, where I have a 40-page, 100-footnote article forthcoming that would arouse any tenure committee, were I to make it that far. As things stand, I see no special reason to worry. Are there any tenure-track jobs left to lose?
Q: The pages discussing your academic career, so far, are marked by frustration with the university as an institution shaped by "the downsizing and outsourcing techniques perfected by the corporations." If history is a craft, you write, then historians should be organized into guilds -- a medieval notion, as was Paul Goodman's understanding of the university as "a community of scholars." But how do you create that ethos? Isn't the whole culture set up to teach people that they are monads of self-interest who need to learn to manipulate the system to get ahead?
A: Although the university hosts conflicting voices, it rarely gives us an effectual debate about the ends of education. The profession, likewise, includes many perspectives while controlling them within a methodological straitjacket. If the ethos should precede the institution, as you rightly suggest by your question, then it is up to the individual scholar to self-organize.
How should I know? Paul Goodman, Lewis Mumford, and C. Wright Mills answered by telling us to study the gamut of social forms through which modern cultural history has transmitted itself, looking for links in a model of exemplary characters, images, events, and ideas. Christopher Lasch urged us to ask ourselves whether we possessed the moral resources implied by our cultural criticism. James Agee said we must be faithful to our perceptions wherever they may lie.
I think the question of how to live as a scholar or writer is personal, inescapably so, in the exact sense that society forbids us to acknowledge. (Many more people have done much worse by taking things impersonally than by being sensitive to personal meaning.) Almost everyone acknowledges that our system of graduate education is obsolete, yet there is not a single serious proposal for reforming the profession. Linger on that failure for a moment. In a crumbling system, self-organization is less a matter for utopian speculation than survival.
Q: Your first major undertaking as an apprentice scholar in the 1990s was a critical analysis of Dale Carnegie's "How to Win Friends and Influence People" -- done, it sounds like, in the approved cultural-studies manner of the period. It's kind of disappointing that you didn't include that paper in the collection. But maybe it's there between the lines? It sounds like one of your criticisms of academic life is, so to speak, its rampant if unacknowledged Carnegie-ism. Would you say more about your interest in his most famous text?
A: I grew up in a conservative family in rural Pennsylvania as the son and grandson of small businessmen. To them, How To Win Friends and Influence People contained nothing but common sense. I declined their offer to enroll in the Carnegie seminar during high school. Not until I enrolled in the master’s program at George Mason University in 1994 did I begin to understand the sources of the book’s cynicism, the elision of sincerity and its performance.
Carnegie, training his readers to detect weakness in others, undermined the possibility of a social gospel in Christian ethics. But my father and grandfather were not notably religious, and I sighted the irony of their devotion from another direction. Both of them are tall, tough men -- no metrosexuals here. Yet they esteemed Carnegie, a mousy Methodist who told men to suppress their instinct for conflict behind a plastic smile. Early on, I decided I would not suppress myself in this way.
The paper itself, though not worth publishing, gave me a short course on the therapeutic idiom in the business culture of the 1930s. I still find it curious that Carnegie, along with the period's self-help gurus such as Walter Pitkin, Dorothea Brande, and Norman Vincent Peale (“positive thinkers” all), cited the philosophical psychology of John Dewey and William James repeatedly and enthusiastically. James’s essay, “The Energies of Men,” was the point of departure for Pitkin’s book, More Power To You! A Working Technique for Making the Most of Human Energy (1934). Carnegie called James “the most distinguished psychologist and philosopher America ever produced” and Dewey “America’s most profound philosopher.” In Think and Grow Rich (1937) Napoleon Hill gave one of his chapters a title that could have appeared in a Mills book: “Imagination: The Workshop of the Mind.” Thus is one returned to the discomfiting paradox that major currents in American radical thought have not differed radically from the society they have criticized.
The most valuable part of my master’s degree from George Mason University was the chance to study with Roy Rosenzweig, one of the best men I have known.
Q: The admiration you express for Roy Rosenzweig was one of the things that surprised me the most about your book. Rosenzweig was the father of digital history. By contrast, you seem...well, not quite a Luddite, perhaps, but not an "early adopter."
A: Roy was easy to admire. I worked for his Center for History and New Media on projects such as History Matters: The U.S. Survey Course on the Web and the CD-Rom version of Who Built America? Under his direction, moreover, I published one of the first essays about labor history on the web. Roy worked in collaboration with Steve Brier and Josh Brown of the American Social History Project.
The instances of kindness, instruction, and encouragement I received from Roy, Josh, and Steve have made me wonder -- to return to your earlier question about organization of scholarly work -- whether Centers or Projects are more conducive to cooperative learning than Departments. My experience this year at Alan Wolfe’s Boisi Center for Religion and American Public Life, at Boston College, suggests all over again that this may be so.
Q: Last year, Oxford University Press published your edition of selected writings by C. Wright Mills, including a series of lectures from 1959 derived from an unpublished manuscript called "The Cultural Apparatus." By that title, Mills says he means "all of the organizations and milieux in which artistic, intellectual, and scientific work goes on, and by which entertainment and information are produced and distributed." Why do we need this 50-year-old analysis today? I mean, we're downstream from Habermas and Foucault now. Doesn't that pretty much cover it for (respectively) hope and fear in regard to the cultural apparatus?
A: Everyone can learn something from Mills’s “natural history of culture.” I say so confident in the knowledge of the reception accorded these lectures in 1959 -- the thousand or so letters on file at the University of Texas -- as well as recent experience, having taught them last week in my history of radicalism course at The Cooper Union. The students there got it.
Can one say the same for Habermas? I have found him damnably difficult to teach. At the end of The Structural Transformation of the Public Sphere, he copied out a section of The Power Elite that reappears in The Cultural Apparatus: the idea of self-renewing publics, which located the meaning of the intellectual vocation in the continual search for them.
I agree that we face a mode of cultural production, distribution, and consumption unlike the factory-style conditions Mills addressed. Do I diminish these lectures by saying their value is primarily historical? The history of ideas can be valuable without being practically useful, all the more so in the case of old ideas lightly printed on sketch paper, unrealized rather than outworn, forgotten. James Agee, who loved to play the church organ, often spoke of his idea to write a history of the United States through the religious hymn music echoing in America’s vast land of small towns, hamlets, and churches. In what sense do we “need” to know all about Agee’s impossible idea?
Q: Another notion in Mills that interests you is his idea of the New Man. What's that all about?
A: Daniel Bell was right to discern an "end of ideology." Mills, in his "Letter to the New Left," did not deny that social reality had exhausted modern ideology. But Mills praised ideological striving while Bell refused to mourn its passing. Accordingly, many commentators on Mills have been tempted to find ideological motives in his thought, insisting that he was really a Trotskyist, a Marxist, a Deweyan, a Weberian, a Shachtmanite, and so on. Mills himself insisted he was “neither a con-former nor a re-former.” I think one way to understand his ideological striving without tripping over a label is to consult the long line of New Men in Europe and America. From his first book, The New Men of Power (“of power,” not “in power”) to his defense of Cuba’s “new men” in Listen Yankee!, Mills let this idea guide his work.
The idea of the New Man puts biography at the center of the history of radicalism, which has been preoccupied with victimized social movements and which, in conception and method, looks like the historiography it claims to subvert. Why should biography sit on the sidelines of monographic scholarship when the New Man once dominated liberal and radical thought, showing up in Emerson’s “over-soul,” Nietzsche’s “over-man,” Weber’s “new prophets,” and Adorno’s “New Type of Human Being” before he showed up in Mills's Havana?
The New Man stands beyond alienation, feels, in his spiritual independence, determined to make intelligible the mysterious processes of history. “We know that the new form of social production, to achieve the good life, needs only new men,” Marx wrote in 1856. The Soviets found their New Man in Ostrovsky’s How the Steel Was Tempered -- with Gladkov’s Cement featuring Dasha as the New Woman -- while in America Alain Locke claimed the creation of The New Negro as a greater achievement than any one work of art or literature so produced. Closer to Mills’s time, Frantz Fanon, in The Wretched of the Earth, said decolonization “brings a natural rhythm into existence, introduced by new men, and with it a new language and a new humanity. Decolonization is the veritable creation of new men.” And while Mills was hailing the decolonizers in Cuba, Arthur Schlesinger Jr. (in A Thousand Days) was exulting over the mood of Kennedy’s Washington, “the excitement which comes from an injection of new men and new ideas, the release of energy which occurs when men with ideas have a chance to put them into practice.”
Human character, so conceived by biographers of power, is an independent agent of political change, evidence of the plasticity of nature in the freedom of revolutionary spirit. It was Crevecoeur in his epistolary novel Letters from an American Farmer who asked “What then is the American, this new man?” and answered that he lived in “the most perfect society now existing in the world.”
Q: You are working on a biography of Mills -- whose death in his mid-40s seems hard to believe, given the extent of his work and his influence. How is the project going? How far along are you?
A: A decade of research has turned up several thousand letters, poems, photographs, manuscripts, audio recordings, and autobiographical writings, including a 101-page life-history Mills wrote in college. Other new material I have discovered includes over 85 interviews: multiple sessions with his widow and two ex-wives (all three women died last summer), and with Columbia colleagues such as Daniel Bell, Lewis Coser, Nathan Glazer, Seymour Martin Lipset, Robert Merton, and Immanuel Wallerstein. Many had never discussed their relationship with Mills; of those living abroad, most had never been asked. From sessions with the Mexican novelist Carlos Fuentes, the Polish philosopher Leszek Kolakowski, and the British historian Dorothy Thompson I learned perspectives sharply at odds with American views.
The thrill of such research and the logistics of such interviews, plus the daunting complexity of the task itself, equal a book long in the making. It almost seems appalling to finish it, yet I expect to do so later this year. In the meantime, I'll complete a number of minor projects, including an introduction to social theory for Continuum’s “guide to the perplexed” series, and a pamphlet of my writings on higher education to be issued under the title Eternal Teacher.
Q: You haven't started emulating Mills by eating gigantic, heart-clogging steak dinners, have you?
A: Steak dinners? With my wife, Anna, I am living on the subsistence wages accorded adjunct faculty. There is hope yet. Four months ago Anna gave birth to our daughter, Niusha, who has been proving by sublime action what our education taught by pale precept: that our nature is innocent, intelligent, spontaneous, and, on occasion, quite capable of making a fuss. Another child in the world, another born anarchist.
Laid low with illness -- while work piles up, undone and unrelenting -- you think, “I really couldn’t have picked a worse time to get sick.”
It’s a common enough expression to pass without anyone ever pausing to draw out the implied question: Just when would you schedule your symptoms? Probably not during a vacation....
It’s not like there is ever a good occasion. But arguably the past few days have been the worst time ever to get a flu. Catching up with a friend by phone on Saturday, I learned that he had just spent several days in gastrointestinal hell. The question came up -- half in jest, half in dread -- of whether he’d contracted the swine variety.
Asking this was tempting fate. Within 24 hours, I started coughing and aching and in general feeling, as someone put it on "Deadwood," “pounded flatter than hammered shit.” This is not a good state of mind in which to pay attention to the news. It is not reassuring to know that the symptoms of swine flu are far more severe than those of the garden-variety bug. You try to imagine your condition getting exponentially worse, and affecting everyone around you -- and everyone around them....
So no, you really couldn’t pick a worse time to get sick than right now. On the other hand, this is a pretty fitting moment for healthy readers to track down The Monster at Our Door: The Global Threat of Avian Flu, by Mike Davis, a professor of history at the University of California at Irvine. It was published four years ago by The New Press, in the wake of Severe Acute Respiratory Syndrome (SARS), which spread to dozens of countries from China in late ‘02 and early ‘03.
The disease now threatening to become a pandemic is different. For one thing, it is less virulent -- so far, anyway. And its proximate source was pigs rather than birds.
But Davis’s account of “antigenic shift” -- the mechanism by which flu viruses constantly reshuffle their composition -- applies just as well to the latest developments. A leap across the species barrier results from an incessant and aleatory process of absorbing genetic material from host organisms and reconfiguring it to avoid the host’s defense systems. The current outbreak involves a stew of avian, porcine, and human strands. “Contemporary influenza,” writes Davis, “like a postmodern novel, has no single narrative, but rather disparate storylines racing one another to dictate a bloody conclusion."
Until about a dozen years ago, the flu virus circulating among pigs “exhibited extraordinary genetic stability,” writes Davis. But in 1997, some hogs on a “megafarm” in North Carolina came down with a form of human flu. It began rejiggering itself with genetic material from avian forms of the flu, then spread very rapidly across the whole continent.
Vaccines were created for breeding sows, but that has not kept new strains of the virus from emerging. “What seems to be happening instead,” wrote Davis a few years ago, “is that influenza vaccinations -- like the notorious antibiotics given to steers -- are probably selecting for resistant new viral types. In the absence of any official surveillance system for swine flu, a dangerous reassortant could emerge with little warning.” An expert on infectious diseases quoted by CNN recently noted that avian influenza never quite made the leap to being readily transmitted between human beings: "Swine flu is already a man-to-man disease, which makes it much more difficult to manage, and swine flu appears much more infectious than SARS."
There is more to that plot, however, than perverse viral creativity. Davis shows how extreme poverty and the need for protein in the Third World combine to form an ideal incubator for a global pandemic. In underdeveloped countries, there is a growing market for chicken and pork. The size of flocks and herds grows to meet the demand -- while malnutrition and slum conditions leave people more susceptible to infection.
Writing halfway through the Bush administration, Davis stressed that the public-health infrastructure had been collapsing even as money poured into preparations to deal with the bioterrorism capabilities of Iraq’s nonexistent weapons of mass destruction. The ability to cope with a pandemic was compromised: “Except for those lucky few -- mainly doctors and soldiers -- who might receive prophylactic treatment with Tamiflu, the Bush administration had left most Americans as vulnerable to the onslaught of a new flu pandemic as their grandparents or great-grandparents had been in 1918.”
The World Health Organization began stockpiling Tamiflu in 2006, with half of its reserve of five million doses now stored in the United States, according to a recent New York Times article. The report stressed that swine flu is driving up the value of the manufacturer’s stocks -- in case you wondered where the next bubble would be.
But don't expect to see comparable growth in the development of vaccines. As Davis wrote four years ago, “Worldwide sales for all vaccines produced less revenue than Pfizer’s income from a single anticholesterol medication. ... The giants prefer to invest in marketing rather than research, in rebranded old products rather than new ones, and in treatment rather than prevention; in fact, they currently spend 27 percent of their revenue on marketing and only 11 percent on research.”
The spread of SARS was contained six years ago -- a good thing, of course, but also a boon to the spirit of public complacency, which seems as tireless as the flu virus in finding ways to reassert itself.
And to be candid, I am not immune. A friend urged me to read The Monster at Our Door not long after it appeared. It sat on the shelf until a few days ago.
Now the book seems less topical than prophetic -- particularly when Davis draws out the social consequences of his argument about the threat of worldwide pandemics. If the market can’t be trusted to develop vaccines and affordable medications, he writes, “then governments and non-profits should take responsibility for their manufacture and distribution. The survival of the poor must at all times be accounted a higher priority than the profits of Big Pharma. Likewise, the creation of a truly global public-health infrastructure has become a project of literally life-or-death urgency for the rich countries as well as the poor.”
There is an alternative to this scenario, of course. The word "disaster" barely covers it.
MORE: Mike Davis discusses the swine flu outbreak in an article for The Guardian. He also appeared recently on the radio program Beneath the Surface, hosted by Suzi Weissman, professor of politics at St. Mary's College of California, available as a podcast here.
About a year ago, one of my distant relatives found himself in trouble with the law, and not for the first time. He had allegedly stabbed somebody in the course of a dispute over certain business matters, and so had to go on the run. The police had a thorough description of him (from sustained acquaintance) that they provided to local newspapers -- including the memorable detail that he had numerous tattoos, among them the ones on his forehead over each eye.
He was eventually tracked down in a nearby state. The stab-ee declined to press charges, and everyone lived happily ever after.
As events unfolded, I kept thinking: "There is a valuable lesson here. If you are planning on a life of crime, it is probably best not to get tattoos on your forehead. There are bound to be times when you will need to remain inconspicuous, and having a tattoo over each eye really won't help with that." Then again, career guidance for criminals is probably not what it could be.
Or is it? I have been reading Diego Gambetta's new book Codes of the Underworld: How Criminals Communicate, just published by Princeton University Press. The author, a professor of sociology at Oxford University, notes that senior convicts in Folsom State Penitentiary, including its "honorable" tattoo artists, strongly discourage young and unmarked felons from getting inked. Gambetta, who has also published a study of the Sicilian Mafia, takes a transnational approach in his new book. He cites a report on the attitude found within a South African prison: "Facial tattoos are the ultimate abandonment of all hope of a life outside."
On the other hand, it certainly shows a certain commitment to one's chosen career. It's also a way around the inconvenient fact that nowadays movie stars and accountants and writing-program administrators are sporting bitchin' 'tats. A generalized social destigmatization of body art ups the ante for people whose livelihood comes from projecting an aura of menace. In some lines of work, the forehead is a perfectly good place for one's CV. It may even qualify as proof of ambition.
Gambetta's study also looks at such modes of underworld communication as nicknames, slang, and such "trademarks" as the little logos on bags of heroin, or a gang's preferred means of executing a traitor. How absorbing readers may find Codes of the Underworld is very much a matter of taste. (Every time GoodFellas runs on cable, I end up watching, while my spouse refuses to sit through it a second time.) But morbid fascination aside, the book is interesting for how its method may apply to other forms of interaction -- and other career paths.
Surprisingly, none of the familiar theoretical apparatus of semiology is wheeled onstage. Gambetta's approach is an economic analysis of how various modes of underworld communication function.
This doesn't mean simply treating tattoos, nicknames, fish wrapped in newspaper, etc., as components of certain kinds of economic exchange. Rather, Gambetta looks at the life of crime itself as shaped by a traffic in signals of professional competence. There is a market of sorts involved in accumulating a stock of reputational "capital" -- as well as the incidental expenses that must be paid to maintain it. Not only do police and FBI agents spend a great deal of effort learning to mimic the lingo and gestures of the underworld, but so do wannabes and fashionistas. It is a constant struggle to update the code and proof-check the credentials.
Because the activities involved are illegal, the more familiar possibilities of accreditation are just not available. It is not like there is a licensing agency for counterfeiters. Anyway, how could you trust its certificates?
That is my example, not Gambetta's. But one incident he recounts may suggest how difficult things can get, at least for potential consumers of underworld services. A woman in Canada learned that there was a business in the American Southwest called Guns for Hire. She did not realize that it was a theatrical group that specialized in reenactments of Old Western shoot-outs and the like. She called its office to try to arrange the disposal of her husband. (This is an example of what is sometimes called "an imperfect market created by differences of information.")
But such problems do not emerge only along the boundary separating civilians and professional hoods. "Criminals embody homo economicus at his rawest," writes Gambetta, "and they know it. In keeping with the evidence that people who are untrustworthy are also likely to think that others are untrustworthy, criminals are more inclined to distrust each other than ordinary people do." In a subculture where dishonesty is the norm and participants have no recourse to mediation by the state, it is especially difficult to communicate trustworthiness and reliability to one's potential peers or clients.
On that score, Gambetta makes a fascinating and rather counterintuitive argument about the role that gross incompetence plays in organized crime -- and also, as a brief discussion in one chapter suggests, in academic life, at least in Italy.
"An unexpected result of my research on the mafia," he writes, "was to find out that mafiosi are quite incompetent at doing anything" other than shaking down legitimate businesses and enforcing trade agreements among smaller-scale hoodlums. "Mafiosi are good at intimidation and stick to it.... They let the professionals and the entrepreneurs take care of the actual business operations."
Rather than getting involved in running a restaurant or dealing drugs, they joke about their cluelessness in such matters and simply collect payment for "protection." But this professed incompetence (evidently quite well-demonstrated on the rare occasions that a mafioso tries to go legit) makes them strangely "trustworthy" to those using their services: "If [mobsters] showed any competence at it, their clients would fear that they might just take over."
Gambetta argues that something similar takes place among the baroni (barons) who oversee the selection committees involved in Italian academic promotions. While some fields are more meritocratic than others, the struggle for advancement often involves a great deal of horse trading. "The barons operate on the basis of a pact of reciprocity, which requires a lot of trust, for debts are repaid years later. Debts and credits are even passed on from generation to generation within a professor's 'lineage,' and professors close to retirement are excluded from the current deals, for they will not be around long enough to return favors."
The most powerful figures in this system, says Gambetta, tend to be the least intellectually distinguished. They do little research, publish rarely, and at best are derivative of "some foreign author on whose fame they hope to ride.... Also, and this is what is the most intriguing, they do not try to hide their weakness. One has the impression that they almost flaunt it in personal contacts."
Well, one also has the impression that the author is here on the verge of writing a satirical novel. But a friend who is interested in both the politics and academic life of Italy tells me that this account is all too recognizably accurate, in some fields anyway. Gambetta calls the system "an academic kakistocracy, or government by the worst," which is definitely an expression I can see catching on.
This may seem like a tangent from comparative criminology. But Gambetta argues that the cheerful incompetence of the baroni is akin to the mafioso's way of signaling that he can be "trusted" within his narrowly predatory limits.
"Being incompetent and displaying it," he writes, "conveys the message I will not run away, for I have no strong legs to run anywhere else. In a corrupt academic market, being good at and interested in one's own research, by contrast, signals a potential for a career independent of corrupt reciprocity.... In the Italian academic world, the kakistocrats are those who best assure others by displaying, through lack of competence and lack of interest in research, that they will comply with the pacts."
It would be shocking, simply shocking, however, if anyone suggested this was not strictly an Italian problem.
Shortly after last week’s column appeared, I headed out to Iowa City to attend -- and, as the occasion required, to pontificate at -- a gathering called Platforms for Public Scholars. Sponsored by the Obermann Center for Advanced Studies at the University of Iowa, it drew somewhere between 100 and 150 participants over three days.
This was the latest round in an ongoing conversation within academe about how to bring work in the humanities into civic life, and vice versa. The discussion goes back almost a decade now, to the emergence of the Imagining America consortium, which fosters collaboration between faculty at research universities and partners in community groups and nonprofit organizations.
That effort often runs up against institutional inertia. You sense this from reading "Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University" (the report of the consortium's Tenure Team Initiative, released last year). Clearly there is a long way to go before people in the humanities can undertake collaborative, interdisciplinary, and civic-minded work without fearing that they are taking a risk.
Even so, the presentations delivered in Iowa City reported on a variety of public-scholarship initiatives -- local history projects, digital archives, a festival of lectures and discussions on Victorian literature, and much else besides. Rather than synopsize, let me recommend a running account of the sessions live-blogged by Bridget Draxler, a graduate student in English at the University of Iowa. It is available at the Web site of the Humanities, Arts, Sciences, and Technology Advanced Collaboratory (better known as HASTAC, usually pronounced “haystack”).
Word went around of plans to publish a collection of papers from the gathering. I asked Teresa Mangum, a professor of English at U of I, who organized and directed the event, if that was in the cards. She “built the platform,” as someone put it, and presided over all three days with considerable charm -- intervening in the discussion in ways that were incisive while also tending to foster the collegiality that can be elusive when people come from such different disciplinary and professional backgrounds.
“My goal is to have some kind of ‘artifact’ of the conference,” she told me, “but I'm trying to think more imaginatively what it might be ... possibly a collection of essays with a Web site. We definitely want to produce an online bibliography but maybe trying to use the Zotero exhibition approach there.”
It was a symposium in the strict sense, in that food was involved. Also, beverages. On the final day, a roundtable assessment of the whole event was the last item on the agenda -- only for this discussion to be bumped into the farewell dinner when things ran long.
Unfortunately I was unable to attend, for fear that a persistent hacking cough was turning me into a pandemic vector. Instead, I retired to the hotel to scribble out some thoughts that might have been worth taking up at the roundtable. Here they are -- afterthoughts, a little late for the discussion.
Most people who attended were members of the academic community, whether from Iowa or elsewhere, and most of the sessions took place in university lecture halls. But the first event on the first day was held at the Iowa City Public Library. This was a panel on new ways of discussing books in the age of digital media -- recounted here by Meena Kandasamy, a young Tamil writer and translator whose speech that evening rather stole the show.
Holding the event at the public library opened the proceedings up somewhat beyond the usual professorial demographic. At one point, members of the panel watched as a woman entered with her guide dog, stretched out on the ground at the back of the room, and closed her eyes to listen. At least we hoped she was listening. I think there is an allegory here about the sometimes ambiguous relationship between public scholarship and its audience.
In any case, the venue for this opening session was important. Public libraries were once called “the people’s universities.” The populist impulse has fallen on some scurvy times, but this trope has interesting implications. The public library is an institution that nobody would be able to start now. A place where you can read brand-new books and magazines for free? The intellectual property lawyers would be suing before you finished the thought.
So while musing on collaborative and civic-minded research, it is worth remembering the actually existing public infrastructure that is still around. Strengthening that infrastructure needs to be a priority for public scholarship -- at least as much, arguably, as "the production of knowledge." (This phrase, repeated incessantly in some quarters of the humanities, has long since slipped its original moorings, and owes more to American corporate lingo than to Althusser.)
Institutions can be narcissistic; and one symptom of this is a certain narrowly gauged conception of professionalism, often indistinguishable in demeanor from garden-variety snobbery. Any real progress in consolidating the practice of public scholarship has to involve a strengthening of ties with people in the public sector -- especially librarians and teachers.
It is not that scholars exist over here while something called “the public” is over there -- off in the distance. Rather, people are constituted as a public in particular spaces and activities. The university is one such site, at least sometimes. But it isn’t the only one, and public scholarship needs to have moorings in as many such venues as possible.
The problem being that it is often hard enough to drop an anchor in academe, let alone in the wide Sargasso Sea of civil society. I am not a professor and have no advice to give on that score. But it seems important to pass along the comments of someone attending Platforms for Public Scholars who confided some thoughts to me during some downtime. I will pass them along by permission, but without giving away anything about this person's identity.
During one panel, a couple of tenured professors mentioned being concerned that their civically engaged scholarship might not count for promotion. One even noted that people who had done collaborative work in the humanities tended to discount it as part of a tenure file -- saying, “Well, I did mine without getting credit for it, so why should you?”
At the time, I raised an eyebrow, but didn’t really think much about it. Later, though, someone referred back to the session in tones that suggested chagrin and longstanding doubts about having a career in the humanities.
“These are people who actually are established, who have some power in their institutions," this individual told me. "I don’t have that. I don’t even have a job yet. And I want them to show some courage. If you really have a conviction that collaboration and public engagement are important, then do it without worrying so much. And support it. Make it possible for someone like me to make doing public work part of my scholarship. Otherwise, what are we even talking about?”
The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that's just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.
Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looks like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.
I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives) First has reinvented itself as an annual anthology. First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.
The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is winning the support of a public – or rather, creating one.
In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side-by-side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating "perspective by incongruity." It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.
One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it "the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”
Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned – a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.
In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.
At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.
The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tsetung Thought) concur that the 2008 election was a transformative moment. This is, let's say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of the vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”
It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.
I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning – the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.
But one point he made did stick. It was a consideration on holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn't about policy, it's about movement.
That does not mean celebration is always the order of the day. The indulgence of academic hip-hop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.
“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby – but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap – a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”
Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.
Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.
Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, who was a professor of English at Amherst College. He was a remarkable essayist and social critic.
It is time someone published a volume of DeMott senior's selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call "the will to sophistication," and its hard clarity is bracing:
"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it.... This is the best that life can offer. And it’s better than it sounds.”
This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the website and the series of anthologies will continue and prosper as new readers and writers join its public.