
Why I Teach Intro

You probably recall that in George Orwell’s 1984 the authorities bring Winston Smith to a torture chamber to break his loyalty to his beloved Julia. Perhaps you do not remember the room number. It is 101.

The modern university institutionalizes Orwell’s association of the number 101 with torture. Faculty and students often consider introductory courses an affliction.

I suspect that colleagues award teaching prizes to 101 instructors partly as compensation for being relieved of the agony of teaching introductory courses themselves -- a suspicion that first occurred to me last year, when I shared an award with the University of Toronto’s Centre for the Study of Pain, much praised for its relief of suffering.

Why, then, do I teach introductory sociology? My colleagues have been too polite to remind me of the alleged downsides, but they are well known. First, teaching an introductory course is often said to be a time-consuming activity that interferes with research and writing -- the royal road to prestige, promotion, and merit pay. Second, it is reputedly boring and frustrating to recite the elementary principles of the discipline to young students, many of whom could not care less. Third, the 101 instructor performs supposedly menial work widely seen as suited only to non-tenured faculty members, advanced graduate students, and other personnel at the bottom rung of the academic ladder. Although I understand these arguments, I do not find them compelling. For me, other considerations have always far outweighed them.

In particular, teaching intro solves, for me, the much-discussed problem of public sociology. Some sociologists believe that working to improve human welfare is somehow unprofessional or unscientific. They hold that professional sociologists have no business drawing blueprints for a better future and should restrict themselves to analyzing the present dispassionately and objectively. However, to maintain that belief they must ignore what scientists actually do and why they do it. Sir Isaac Newton studied astronomy partly because the explorers and mariners of his day needed better navigational cues. Michael Faraday was motivated to discover the relationship between electricity and magnetism partly by his society’s search for new forms of power.

Today, many scientists routinely and proudly acknowledge that their job is not just to interpret the world but also to improve it, for the welfare of humanity; much of the prestige of science derives precisely from scientists’ ability to deliver the goods. Some sociologists know they have a responsibility beyond publishing articles in refereed journals for the benefit of their colleagues. One example is Michael Burawoy’s 2004 presidential address to the American Sociological Association, a gloss on Marx’s “Theses on Feuerbach,” in which Burawoy criticized professional sociologists for defining their job too narrowly and called for more public sociology. Still, many sociologists hold steadfastly to the belief that scientific research and public responsibility are at odds -- largely, I suspect, because they are insecure about whether their research is really scientific at all, and so feel they must be more papist than the pope.

Setting such anxieties aside, one is left with the question of how to combine professional pursuits with public responsibility. One option is conducting research that stimulates broad discussion of public policy. Some of my colleagues study how immigration policy limits the labour market integration and upward mobility of immigrants; others how family policy impairs child welfare; and still others how tax and redistribution policies affect inequality. To the degree they engage educated citizens in discussion and debate on such important issues, they achieve balance between their professional and public roles.

I have chosen a different route to public responsibility. I have conducted research and published for a professional audience, but I have also enjoyed the privilege of addressing hundreds of thousands of members of the public over the years by teaching Sociology 101 in large lecture halls and by writing textbooks for intro students in several countries. As Orwell wrote, communicating effectively to a large audience may be motivated by aesthetic pleasure and egoistic impulses. Who among us does not want to write clear and compelling prose and to be thought clever for doing so? But in addition, one may want to address a large audience for what can only be deemed political reasons.

In 1844, Charles Dickens read his recent Christmas composition, The Chimes, to his friend William Charles Macready, the most famous Shakespearean actor of the day. Dickens later reported the reading to another friend as follows: “If you had seen Macready last night -- undisguisedly sobbing, and crying on the sofa, as I read -- you would have felt (as I did) what a thing it is to have Power.” I understand Dickens. I, too, relish the capacity to move and to sway a large audience to a desired end because it signifies that my influence will not be restricted to a few like-minded academics and that I may have at least some modest and positive impact on the broader society. I find most students burn with curiosity about the world and their place in it, and I am delighted when they tell me that a lecture helped them see how patterned social relations shape what they can become in this particular historical context. On such occasions I know that I have taught them something about limits and potential -- their own and that of their society. Teaching intro thus allows me to discharge the public responsibility that, according to Burawoy and others, should be part of every sociologist’s repertoire.

In Marx’s words, “it is essential to educate the educators” -- especially those who persist in believing that teaching intro bores, frustrates, interferes, and suits only the academic proletariat.


Robert Brym is professor of sociology at the University of Toronto. A version of this essay first appeared in Academic Matters, which is published by the Ontario Confederation of University Faculty Associations.

Every Fury on Earth

Fifty years ago next month, C. Wright Mills published The Sociological Imagination, a classic critique of the field that includes, as an appendix, "On Intellectual Craftsmanship." The essay is part profession of faith and part practical handbook -- full of good advice, and not just for social scientists. "Scholarship is a choice of how to live as well as a choice of career," wrote Mills; "whether he knows it or not, the intellectual workman forms his own self as he works towards the perfection of his craft...."

I've lauded the piece here before, and was glad to see it in the table of contents for The Politics of Truth: Selected Writings of C. Wright Mills, a volume published last year by Oxford University Press and edited by John H. Summers, a visiting scholar at the Boisi Center for Religion and American Public Life at Boston College. But on closer examination, I saw that the editor hadn't simply reprinted the appendix. This version of "On Intellectual Craftsmanship" was rather different: it was taken from the text that Mills had mimeographed for his students in the mid-1950s.

This evidence of digging around in the archives left me eager to read more of the editor's own writings about Mills, listed in the bibliography, to see what insights he might have reached while excavating. And as luck would have it, we were introduced a short time later by a mutual friend. This somewhat expedited things, since Summers was just about to publish Every Fury on Earth (The Davies Group), a far-ranging collection of essays, including several on Mills.

Something of the maverick sociologist's feeling for intellectual craftsmanship runs throughout Summers' work. I don't recall the last time I read anything so ardent about scholarship as a means to soul-making -- or, for that matter, so angry at how academic life can distort that process. One of the remarkable things about Summers as a writer is that his frustration never runs to sarcasm -- no small accomplishment.

We recently exchanged a few rounds of e-mail about his work. A transcript of that exchange follows.

Q: You identify yourself as an anarchist and quote passages in which both James Agee and C. Wright Mills did, too. But it's not clear from your work (or theirs, for that matter) just how much this is a matter of feeling an affiliation with some strand of the anarchist movement, and how much it is a matter of personal temperament. What sort of anarchist are you?

A: May I split the difference between temperament and historical exemplars? Politically, anarchism is a democratic method for criticizing power; philosophically, a rough synonym for pragmatism, especially in William James's effort to defend the creativity of perception against the lure of abstraction and intellectualism.

Several years ago I began to notice writers and scholars whom I admired calling themselves anarchists; not only James, Mills, and Agee, but Dwight Macdonald, who called himself a conservative anarchist. What I did not notice, and still have not found, was a serious discussion of these impulses. (As Macdonald said, most educated Americans mistakenly believe anarchism means chaos). So I was drawn to anarchism out of frustrated curiosity. Sensibility had something to do with it, but that's only to say the same thing twice: I don't discover such things about myself but by reading.

Q: Dwight Macdonald edited and contributed to little political magazines -- as did Mills -- but also wrote for large-circulation publications. A couple of essays in your book were first published in The Journal of American History, and another appeared in an edited collection of papers. But the rest were written for magazines, newspapers, and Web sites. That sort of thing is normally just tolerated, though not encouraged. Aren't you worried that being "public" means you aren't "professional"? Isn't that the kiss of death on the tenure track?

A: The University of Rochester never asked me to make an invidious distinction between the public and the professional, but taught history as a form of criticism. If that sounds amateurish, as if critics are less serious than bibliomaniacs, then consider a short list of distinguished students and graduates from the Rochester history department and marvel at the blend of scholarly erudition and public commitment animating their work: Chris Lehmann, Kevin Mattson, Christopher Phelps, Rochelle Gurstein, Casey Nelson Blake, Cathy Tumber, Russell Jacoby. Has any small history department in recent memory made a comparable contribution?

I've put in my professional time -- editing a column for the American Historical Association's newsletter and publishing refereed articles in Intellectual History Review and Left History, where I have a 40-page, 100-footnote article forthcoming that would arouse any tenure committee, were I to make it that far. As things stand, I see no special reason to worry. Are there any tenure-track jobs left to lose?

Q: The pages discussing your academic career, so far, are marked by frustration with the university as an institution shaped by "the downsizing and outsourcing techniques perfected by the corporations." If history is a craft, you write, then historians should be organized into guilds -- a medieval notion, as was Paul Goodman's understanding of the university as "a community of scholars." But how do you create that ethos? Isn't the whole culture set up to teach people that they are monads of self-interest who need to learn to manipulate the system to get ahead?

A: Although the university hosts conflicting voices, it rarely gives us an effectual debate about the ends of education. The profession, likewise, includes many perspectives while controlling them within a methodological straitjacket. If the ethos should precede the institution, as you rightly suggest by your question, then it is up to the individual scholar to self-organize.

Q: Okay, how?

A: How should I know? Paul Goodman, Lewis Mumford, and C. Wright Mills answered by telling us to study the gamut of social forms through which modern cultural history has transmitted itself, looking for links in a model of exemplary characters, images, events, and ideas. Christopher Lasch urged us to ask ourselves whether we possessed the moral resources implied by our cultural criticism. James Agee said we must be faithful to our perceptions wherever they may lie.

I think the question of how to live as a scholar or writer is personal, inescapably so, in the exact sense that society forbids us to acknowledge. (Many more people have done much worse things by taking things impersonally than those who have been sensitive to personal meaning). Almost everyone acknowledges that our system of graduate education is obsolete, yet there is not a single serious proposal for reforming the profession. Linger on that failure for a moment. In a crumbling system, self-organization is less a matter for utopian speculation than survival.

Q: Your first major undertaking as an apprentice scholar in the 1990s was a critical analysis of Dale Carnegie's "How to Win Friends and Influence People" -- done, it sounds like, in the approved cultural-studies manner of the period. It's kind of disappointing that you didn't include that paper in the collection. But maybe it's there between the lines? It sounds like one of your criticisms of academic life is, so to speak, its rampant if unacknowledged Carnegie-ism. Would you say more about your interest in his most famous text?

A: I grew up in a conservative family in rural Pennsylvania as the son and grandson of small businessmen. To them, How To Win Friends and Influence People contained nothing but common sense. I declined their offer to enroll in the Carnegie seminar during high school. Not until I enrolled in the master’s program at George Mason University in 1994 did I begin to understand the sources of the book’s cynicism, the elision of sincerity and its performance.

Carnegie, training his readers to detect weakness in others, undermined the possibility of a social gospel in Christian ethics. But my father and grandfather were not notably religious, and I sighted the irony of their devotion from another direction. Both of them are tall, tough men -- no metrosexuals here. Yet they esteemed Carnegie, a mousy Methodist who told men to suppress their instinct for conflict behind a plastic smile. Early on, I decided I would not suppress myself in this way.

The paper itself, though not worth publishing, gave me a short course on the therapeutic idiom in the business culture of the 1930s. I still find it curious that Carnegie, along with the period's self-help gurus such as Walter Pitkin, Dorothea Brande, and Norman Vincent Peale (“positive thinkers” all), cited the philosophical psychology of John Dewey and William James repeatedly and enthusiastically. James’s essay, “The Energies of Men,” served as the point of departure for Pitkin’s book, More Power To You! A Working Technique for Making the Most of Human Energy (1934). Carnegie called James “the most distinguished psychologist and philosopher America ever produced” and Dewey “America’s most profound philosopher.” In Think and Grow Rich (1937) Napoleon Hill gave one of his chapters a title that could have appeared in a Mills book: “Imagination: The Workshop of the Mind.” Thus is one returned to the discomfiting paradox that major currents in American radical thought have not differed radically from the society they have criticized.

The most valuable part of my master’s degree from George Mason University was the chance to study with Roy Rosenzweig, one of the best men I have known.

Q: The admiration you express for Roy Rosenzweig was one of the things that surprised me the most about your book. Rosenzweig was the father of digital history. By contrast, you seem...well, not quite a Luddite, perhaps, but not an "early adopter."

A: Roy was easy to admire. I worked for his Center for History and New Media on projects such as History Matters: The U.S. Survey Course on the Web and the CD-ROM version of Who Built America? Under his direction, moreover, I published one of the first essays about labor history on the Web. Roy worked in collaboration with Steve Brier and Josh Brown of the American Social History Project.

The instances of kindness, instruction, and encouragement I received from Roy, Josh, and Steve have made me wonder -- to return to your earlier question about organization of scholarly work -- whether Centers or Projects are more conducive to cooperative learning than Departments. My experience this year at Alan Wolfe’s Boisi Center for Religion and American Public Life, at Boston College, suggests all over again that this may be so.

Q: Last year, Oxford University Press published your edition of selected writings by C. Wright Mills, including a series of lectures from 1959 derived from an unpublished manuscript called "The Cultural Apparatus." By that title, Mills says he means "all of the organizations and milieux in which artistic, intellectual, and scientific work goes on, and by which entertainment and information are produced and distributed." Why do we need this 50-year-old analysis today? I mean, we're downstream from Habermas and Foucault now. Doesn't that pretty much cover it for (respectively) hope and fear in regard to the cultural apparatus?

A: Everyone can learn something from Mills’s “natural history of culture.” I say so confident in the knowledge of the reception accorded these lectures in 1959 -- the thousand or so letters on file at the University of Texas -- as well as recent experience, having taught them last week in my history of radicalism course at The Cooper Union. The students there got it.

Can one say the same for Habermas? I have found him damnably difficult to teach. At the end of The Structural Transformation of the Public Sphere, he copied out a section of The Power Elite that reappears in The Cultural Apparatus: the idea of self-renewing publics, which implied that the meaning of the intellectual vocation lies in the continual search for them.

I agree that we face a mode of cultural production, distribution, and consumption unlike the factory-style conditions Mills addressed. Do I diminish these lectures by saying their value is primarily historical? The history of ideas can be useful without being practically useful, all the more so in the case of old ideas lightly printed on sketch paper, unrealized rather than outworn, forgotten. James Agee, who loved to play the church organ, often spoke of his idea to write a history of the United States through the religious hymn music echoing in America’s vast land of small towns, hamlets, and churches. In what sense do we “need” to know all about Agee’s impossible idea?

Q: Another notion in Mills that interests you is his idea of the New Man. What's that all about?

A: Daniel Bell was right to discern an "end of ideology." Mills, in his "Letter to the New Left," did not deny that social reality had exhausted modern ideology. But Mills praised ideological striving while Bell refused to mourn its passing. Accordingly, many commentators on Mills have been tempted to find ideological motives in his thought, insisting that he was really a Trotskyist, a Marxist, a Deweyan, a Weberian, a Shachtmanite, and so on. Mills himself insisted he was “neither a con-former nor a re-former.” I think one way to understand his ideological striving without tripping over a label is to consult the long line of New Men in Europe and America. From his first book, The New Men of Power (“of power,” not “in power”) to his defense of Cuba’s “new men” in Listen, Yankee, Mills let this idea guide his work.

The idea of the New Man puts biography at the center of the history of radicalism, which has been preoccupied with victimized social movements and which, in conception and method, looks like the historiography it claims to subvert. Why should biography sit on the sidelines of monographic scholarship when the New Man once dominated liberal and radical thought, showing up in Emerson’s “over-soul,” Nietzsche’s “over-man,” Weber’s “new prophets,” and Adorno’s “New Type of Human Being” before he showed up in Mills's Havana?

The New Man stands beyond alienation and feels, in his spiritual independence, determined to make intelligible the mysterious processes of history. “We know that the new form of social production, to achieve the good life, needs only new men,” Marx wrote in 1856. The Soviets found their New Man in Ostrovsky’s How the Steel Was Tempered -- with Gladkov’s Cement featuring Dasha as the New Woman -- while in America Alain Locke claimed the creation of The New Negro as a greater achievement than any one work of art or literature so produced. Closer to Mills’s time, Frantz Fanon, in The Wretched of the Earth, said decolonization “brings a natural rhythm into existence, introduced by new men, and with it a new language and a new humanity. Decolonization is the veritable creation of new men.” And while Mills was hailing the decolonizers in Cuba, Arthur Schlesinger Jr. (in A Thousand Days) was exulting over the mood of Kennedy’s Washington, “the excitement which comes from an injection of new men and new ideas, the release of energy which occurs when men with ideas have a chance to put them into practice.”

Human character, so conceived by biographers of power, is an independent agent of political change, evidence of the plasticity of nature in the freedom of revolutionary spirit. It was Crevecoeur in his epistolary novel Letters from an American Farmer who asked “What then is the American, this new man?” and answered that he lived in “the most perfect society now existing in the world.”

Q: You are working on a biography of Mills. Given the extent of his work and his influence, it seems hard to believe that he died in his mid-40s. How is the project going? How far along are you?

A: A decade of research has turned up several thousand letters, poems, photographs, manuscripts, audio recordings, and autobiographical writings, including a 101-page life-history Mills wrote in college. I have also conducted more than 85 interviews, including multiple sessions with his widow and two ex-wives (all three women died last summer) and with Columbia colleagues such as Daniel Bell, Lewis Coser, Nathan Glazer, Seymour Martin Lipset, Robert Merton, and Immanuel Wallerstein. Many had never discussed their relationship with Mills; of those living abroad, most had never been asked. From sessions with the Mexican novelist Carlos Fuentes, the Polish philosopher Leszek Kolakowski, and the British historian Dorothy Thompson I learned perspectives sharply at odds with American views.

The thrill of such research and the logistics of such interviews, plus the daunting complexity of the task itself, equal a book long in the making. It almost seems appalling to finish it, yet I expect to do so later this year. In the meantime, I'll complete a number of minor projects, including an introduction to social theory for Continuum’s “guide to the perplexed” series, and a pamphlet of my writings on higher education to be issued under the title Eternal Teacher.

Q: You haven't started emulating Mills by eating gigantic, heart-clogging steak dinners, have you?

A: Steak dinners? With my wife, Anna, I am living on the subsistence wages accorded adjunct faculty. There is hope yet. Four months ago Anna gave birth to our daughter, Niusha, who has been proving by sublime action what our education taught by pale precept: that our nature is innocent, intelligent, spontaneous, and, on occasion, quite capable of making a fuss. Another child in the world, another born anarchist.

Scott McLemee

The Monster at Our Door

Laid low with illness -- while work piles up, undone and unrelenting -- you think, “I really couldn’t have picked a worse time to get sick.”

It’s a common enough expression to pass without anyone ever having then to draw out the implied question: Just when would you schedule your symptoms? Probably not during a vacation....

It’s not like there is ever a good occasion. But arguably the past few days have been the worst time ever to get the flu. Catching up with a friend by phone on Saturday, I learned that he had just spent several days in gastrointestinal hell. The question came up -- half in jest, half in dread -- of whether he’d contracted the swine variety.

Asking this was tempting fate. Within 24 hours, I started coughing and aching and in general feeling, as someone put it on "Deadwood," “pounded flatter than hammered shit.” This is not a good state of mind in which to pay attention to the news. It is not reassuring to know that the symptoms of swine flu are far more severe than those of the garden-variety bug. You try to imagine your condition getting exponentially worse, and affecting everyone around you -- and everyone around them....

So no, you really couldn’t pick a worse time to get sick than right now. On the other hand, this is a pretty fitting moment for healthy readers to track down The Monster at Our Door: The Global Threat of Avian Flu, by Mike Davis, a professor of history at the University of California at Irvine. It was published four years ago by The New Press, in the wake of Severe Acute Respiratory Syndrome (SARS), which spread to dozens of countries from China in late ‘02 and early ‘03.

The disease now threatening to become a pandemic is different. For one thing, it is less virulent -- so far, anyway. And its proximate source was pigs rather than birds.

But Davis’s account of “antigenic drift” -- the mechanism by which flu viruses constantly reshuffle their composition -- applies just as well to the latest developments. A leap across the species barrier results from an incessant and aleatory process of absorbing genetic material from host organisms and reconfiguring it to avoid the host’s defense systems. The current outbreak involves a stew of avian, porcine, and human strands. “Contemporary influenza,” writes Davis, “like a postmodern novel, has no single narrative, but rather disparate storylines racing one another to dictate a bloody conclusion."

Until about a dozen years ago, the flu virus circulating among pigs “exhibited extraordinary genetic stability,” writes Davis. But in 1997, some hogs on a “megafarm” in North Carolina came down with a form of human flu. It began rejiggering itself with genetic material from avian forms of the flu, then spread very rapidly across the whole continent.

Vaccines were created for breeding sows, but that has not kept new strains of the virus from emerging. “What seems to be happening instead,” wrote Davis a few years ago, “is that influenza vaccinations -- like the notorious antibiotics given to steers -- are probably selecting for resistant new viral types. In the absence of any official surveillance system for swine flu, a dangerous reassortant could emerge with little warning.” An expert on infectious diseases quoted by CNN recently noted that avian influenza never quite made the leap to being readily transmitted between human beings: "Swine flu is already a man-to-man disease, which makes it much more difficult to manage, and swine flu appears much more infectious than SARS."

There is more to that plot, however, than perverse viral creativity. Davis shows how extreme poverty and the need for protein in the Third World combine to form an ideal incubator for a global pandemic. In underdeveloped countries, there is a growing market for chicken and pork. The size of flocks and herds grows to meet the demand -- while malnutrition and slum conditions leave people more susceptible to infection.

Writing halfway through the Bush administration, Davis stressed that the public-health infrastructure had been collapsing even as money poured into preparations to deal with the bioterrorism capabilities of Iraq’s nonexistent weapons of mass destruction. The ability to cope with a pandemic was compromised: “Except for those lucky few -- mainly doctors and soldiers -- who might receive prophylactic treatment with Tamiflu, the Bush administration had left most Americans as vulnerable to the onslaught of a new flu pandemic as their grandparents or great-grandparents had been in 1918.”

The World Health Organization began stockpiling Tamiflu in 2006, with half of its reserve of five million doses now stored in the United States, according to a recent New York Times article. The report stressed that swine flu is driving up the value of the manufacturer’s stocks -- in case you wondered where the next bubble would be.

But don't expect to see comparable growth in the development of vaccines. As Davis wrote four years ago, “Worldwide sales for all vaccines produced less revenue than Pfizer’s income from a single anticholesterol medication. ... The giants prefer to invest in marketing rather than research, in rebranded old products rather than new ones, and in treatment rather than prevention; in fact, they currently spend 27 percent of their revenue on marketing and only 11 percent on research.”

The spread of SARS was contained six years ago -- a good thing, of course, but also a boon to the spirit of public complacency, which seems as tireless as the flu virus in finding ways to reassert itself.

And to be candid, I am not immune. A friend urged me to read The Monster at Our Door not long after it appeared. It sat on the shelf until a few days ago.

Now the book seems less topical than prophetic -- particularly when Davis draws out the social consequences of his argument about the threat of worldwide pandemics. If the market can’t be trusted to develop vaccines and affordable medications, he writes, “then governments and non-profits should take responsibility for their manufacture and distribution. The survival of the poor must at all times be accounted a higher priority than the profits of Big Pharma. Likewise, the creation of a truly global public-health infrastructure has become a project of literally life-or-death urgency for the rich countries as well as the poor.”

There is an alternative to this scenario, of course. The word "disaster" barely covers it.

MORE: Mike Davis discusses the swine flu outbreak in an article for The Guardian. He also appeared recently on the radio program Beneath the Surface, hosted by Suzi Weissman, professor of politics at St. Mary's College of California, available as a podcast here.

Scott McLemee

Criminal Incompetence

About a year ago, one of my distant relatives found himself in trouble with the law, and not for the first time. He had allegedly stabbed somebody in the course of a dispute over certain business matters, and so had to go on the run. The police had a thorough description of him (from sustained acquaintance) that they provided to local newspapers -- including the memorable detail that he had numerous tattoos, among them the ones on his forehead over each eye.

He was eventually tracked down in a nearby state. The stab-ee declined to press charges, and everyone lived happily ever after.

As events unfolded, I kept thinking: "There is a valuable lesson here. If you are planning on a life of crime, it is probably best not to get tattoos on your forehead. There are bound to be times when you will need to remain inconspicuous, and having a tattoo over each eye really won't help with that." Then again, career guidance for criminals is probably not what it could be.

Or is it? I have been reading Diego Gambetta's new book Codes of the Underworld: How Criminals Communicate, just published by Princeton University Press. The author, a professor of sociology at Oxford University, notes that senior convicts in Folsom State Penitentiary, including its "honorable" tattoo artists, strongly discourage young and unmarked felons from getting inked. Gambetta, who has also published a study of the Sicilian Mafia, takes a transnational approach in his new book. He cites a report on the attitude found within a South African prison: "Facial tattoos are the ultimate abandonment of all hope of a life outside."

On the other hand, a facial tattoo certainly shows commitment to one's chosen career. It's also a way around the inconvenient fact that nowadays movie stars and accountants and writing-program administrators are sporting bitchin' 'tats. A generalized social destigmatization of body art ups the ante for people whose livelihood comes from projecting an aura of menace. In some lines of work, the forehead is a perfectly good place for one's CV. It may even qualify as proof of ambition.

Gambetta's study also looks at such modes of underworld communication as nicknames, slang, and such "trademarks" as the little logos on bags of heroin, or a gang's preferred means of executing a traitor. How absorbing readers may find Codes of the Underworld is very much a matter of taste. (Every time GoodFellas runs on cable, I end up watching, while my spouse refuses to sit through it a second time.) But morbid fascination aside, the book is interesting for how its method may apply to other forms of interaction -- and other career paths.

Surprisingly, none of the familiar theoretical apparatus of semiology is wheeled onstage. Gambetta's approach is an economic analysis of how various modes of underworld communication function.

This doesn't mean simply treating tattoos, nicknames, fish wrapped in newspaper, etc., as components of certain kinds of economic exchange. Rather, Gambetta looks at the life of crime itself as shaped by a traffic in signals of professional competence. There is a market of sorts involved in accumulating a stock of reputational "capital" -- as well as the incidental expenses that must be paid to maintain it. Not only do police and FBI agents spend a great deal of effort learning to mimic the lingo and gestures of the underworld, but so do wannabes and fashionistas. It is a constant struggle to update the code and proof-check the credentials.

Because the activities involved are illegal, the more familiar possibilities of accreditation are just not available. It is not like there is a licensing agency for counterfeiters. Anyway, how could you trust its certificates?

That is my example, not Gambetta's. But one incident he recounts may suggest how difficult things can get, at least for potential consumers of underworld services. A woman in Canada learned that there was a business in the American Southwest called Guns for Hire. She did not realize that it was a theatrical group that specialized in reenactments of Old Western shoot-outs and the like. She called its office to try to arrange the disposal of her husband. (This is an example of what is sometimes called "an imperfect market created by differences of information.")

But such problems do not emerge only along the boundary separating civilians and professional hoods. "Criminals embody homo economicus at his rawest," writes Gambetta, "and they know it. In keeping with the evidence that people who are untrustworthy are also likely to think that others are untrustworthy, criminals are more inclined to distrust each other than ordinary people do." In a subculture where dishonesty is the norm and participants have no recourse to mediation by the state, it is especially difficult to communicate trustworthiness and reliability to one's potential peers or clients.

On that score, Gambetta makes a fascinating and rather counterintuitive argument about the role that gross incompetence plays in organized crime -- and also, as a brief discussion in one chapter suggests, in academic life, at least in Italy.

"An unexpected result of my research on the mafia," he writes, "was to find out that mafiosi are quite incompetent at doing anything" other than shaking down legitimate businesses and enforcing trade agreements among smaller-scale hoodlums. "Mafiosi are good at intimidation and stick to it.... They let the professionals and the entrepreneurs take care of the actual business operations."

Rather than getting involved in running a restaurant or dealing drugs, they joke about their cluelessness in such matters and simply collect payment for "protection." But this professed incompetence (evidently quite well-demonstrated on the rare occasions that a mafioso tries to go legit) makes them strangely "trustworthy" to those using their services: "If [mobsters] showed any competence at it, their clients would fear that they might just take over."

Gambetta argues that something similar takes place among the baroni (barons) who oversee the selection committees involved in Italian academic promotions. While some fields are more meritocratic than others, the struggle for advancement often involves a great deal of horse trading. "The barons operate on the basis of a pact of reciprocity, which requires a lot of trust, for debts are repaid years later. Debts and credits are even passed on from generation to generation within a professor's 'lineage,' and professors close to retirement are excluded from the current deals, for they will not be around long enough to return favors."

The most powerful figures in this system, says Gambetta, tend to be the least intellectually distinguished. They do little research, publish rarely, and at best are derivative of "some foreign author on whose fame they hope to ride.... Also, and this is what is the most intriguing, they do not try to hide their weakness. One has the impression that they almost flaunt it in personal contacts."

Well, one also has the impression that the author is here on the verge of writing a satirical novel. But a friend who is interested in both the politics and academic life of Italy tells me that this account is all too recognizably accurate, in some fields anyway. Gambetta calls the system "an academic kakistocracy, or government by the worst," which is definitely an expression I can see catching on.

This may seem like a tangent from comparative criminology. But Gambetta argues that the cheerful incompetence of the baroni is akin to the mafioso's way of signaling that he can be "trusted" within his narrowly predatory limits:

"Being incompetent and displaying it," he writes, "conveys the message I will not run away, for I have no strong legs to run anywhere else. In a corrupt academic market, being good at and interested in one's own research, by contrast, signal a potential for a career independent of corrupt reciprocity.... In the Italian academic world, the kakistrocrats are those who best assure others by displaying, through lack of competence and lack of interest in research, that they will comply with the pacts."

It would be shocking, simply shocking, however, if anyone suggested this was not strictly an Italian problem.

Scott McLemee

The Public Option

Shortly after last week’s column appeared, I headed out to Iowa City to attend -- and, as the occasion required, to pontificate at -- a gathering called Platforms for Public Scholars. Sponsored by the Obermann Center for Advanced Studies at the University of Iowa, it drew somewhere between 100 and 150 participants over three days.

This was the latest round in an ongoing conversation within academe about how to bring work in the humanities into civic life, and vice versa. The discussion goes back almost a decade now, to the emergence of the Imagining America consortium, which fosters collaboration between faculty at research universities and partners in community groups and nonprofit organizations.

That effort often runs up against institutional inertia. You sense this from reading "Scholarship in Public: Knowledge Creation and Tenure Policy in the Engaged University" (the report of the consortium's Tenure Team Initiative, released last year). Clearly there is a long way to go before people in the humanities can undertake collaborative, interdisciplinary, and civic-minded work without fearing that they are taking a risk.

Even so, the presentations delivered in Iowa City reported on a variety of public-scholarship initiatives -- local history projects, digital archives, a festival of lectures and discussions on Victorian literature, and much else besides. Rather than synopsize, let me recommend a running account of the sessions live-blogged by Bridget Draxler, a graduate student in English at the University of Iowa. It is available at the Web site of the Humanities, Arts, Sciences, and Technology Advanced Collaboratory (better known as HASTAC, usually pronounced “haystack”).

Word went around of plans to publish a collection of papers from the gathering. I asked Teresa Mangum, a professor of English at U of I, who organized and directed the event, if that was in the cards. She “built the platform,” as someone put it, and presided over all three days with considerable charm -- intervening in the discussion in ways that were incisive while also tending to foster the collegiality that can be elusive when people come from such different disciplinary and professional backgrounds.

“My goal is to have some kind of ‘artifact’ of the conference,” she told me, “but I'm trying to think more imaginatively what it might be ... possibly a collection of essays with a Web site. We definitely want to produce an online bibliography but maybe trying to use the Zotero exhibition approach there.”

It was a symposium in the strict sense, in that food was involved. Also, beverages. On the final day, a roundtable assessment of the whole event was the last item on the agenda -- only for this discussion to be bumped into the farewell dinner when things ran long.

Unfortunately I was unable to attend, for fear that a persistent hacking cough was turning me into a pandemic vector. Instead, I retired to the hotel to scribble out some thoughts that might have been worth taking up at the roundtable. Here they are -- afterthoughts, a little late for the discussion.

Most people who attended were members of the academic community, whether from Iowa or elsewhere, and most of the sessions took place in university lecture halls. But the first event on the first day was held at the Iowa City Public Library. This was a panel on new ways of discussing books in the age of digital media -- recounted here by Meena Kandasamy, a young Tamil writer and translator whose speech that evening rather stole the show.

Holding the event at the public library opened the proceedings up somewhat beyond the usual professorial demographic. At one point, members of the panel watched as a woman entered with her guide dog, stretched out on the ground at the back of the room, and closed her eyes to listen. At least we hoped she was listening. I think there is an allegory here about the sometimes ambiguous relationship between public scholarship and its audience.

In any case, the venue for this opening session was important. Public libraries were once called “the people’s universities.” The populist impulse has fallen on some scurvy times, but this trope has interesting implications. The public library is an institution that nobody would be able to start now. A place where you can read brand-new books and magazines for free? The intellectual property lawyers would be suing before you finished the thought.

So while musing on collaborative and civic-minded research, it is worth remembering the actually existing public infrastructure that is still around. Strengthening that infrastructure needs to be a priority for public scholarship -- at least as much, arguably, as "the production of knowledge." (This phrase, repeated incessantly in some quarters of the humanities, has long since slipped its original moorings, and owes more to American corporate lingo than to Althusser.)

Institutions can be narcissistic, and one symptom of this is a certain narrowly gauged conception of professionalism, often indistinguishable in demeanor from garden-variety snobbery. Any real progress in consolidating the practice of public scholarship has to involve a strengthening of ties with people in the public sector -- especially librarians and teachers.

It is not that scholars exist over here while something called “the public” is over there -- off in the distance. Rather, people are constituted as a public in particular spaces and activities. The university is one such site, at least sometimes. But it isn’t the only one, and public scholarship needs to have moorings in as many such venues as possible.

The problem being that it is often hard enough to drop an anchor in academe, let alone in the wide Sargasso Sea of civil society. I am not a professor and have no advice to give on that score. But it seems important to pass along the comments of someone attending Platforms for Public Scholars who confided some thoughts to me during some downtime. I will pass them along by permission, but without giving away anything about this person's identity.

During one panel, a couple of tenured professors mentioned being concerned that their civically engaged scholarship might not count for promotion. One even noted that people who had done collaborative work in the humanities tended to discount it as part of a tenure file -- saying, “Well, I did mine without getting credit for it, so why should you?”

At the time, I raised an eyebrow, but didn’t really think much about it. Later, though, someone referred back to the session in tones that suggested chagrin and longstanding doubts about having a career in the humanities.

“These are people who actually are established, who have some power in their institutions," this individual told me. "I don’t have that. I don’t even have a job yet. And I want them to show some courage. If you really have a conviction that collaboration and public engagement are important, then do it without worrying so much. And support it. Make it possible for someone like me to make doing public work part of my scholarship. Otherwise, what are we even talking about?”

Scott McLemee

The First of the Year

The First of the Month is a cultural and intellectual publication that is singularly lively, and no less strange. It started out in 1998, in tabloid format, as a “newspaper of the radical imagination” published in Harlem. First has been compared to Partisan Review, the legendary magazine of the New York Intellectuals that began during the Depression. But honestly, that's just lazy. Any time a bunch of smart people start a magazine, somebody ends up comparing it to Partisan Review, especially if it is published in New York; but First took its name from a song by Bone Thugs-n-Harmony, and while I’d like to picture Delmore Schwartz doing a little freestyle rapping over scotch at the White Horse Tavern, it’s a stretch.

Following what has become the contemporary routine, the paper gave birth to a Web site; this then replaced the print edition. An anthology culled from its first decade appeared last year as The First of the Year: 2008, published by Transaction. On first approach, the book looks like a memorial service for the whole project. And an impressive one: the roster of contributors included (to give a very abbreviated and almost random roll-call) Amiri Baraka, Greil Marcus, Lawrence Goodwyn, Grace Lee Boggs, Adolph Reed, Russell Jacoby, Armond White, Kurt Vonnegut, Kate Millett, Richard Hoggart, and Ellen Willis.

I meant to give the volume a plug when it appeared; so much for the good intention. But happily, my initial impression was totally wrong. While continuing to function online (and to have its world headquarters in Harlem, where editorial collective member and impresario Benj DeMott lives), First has reinvented itself as an annual anthology. The First of the Year: 2009 has just been published, which seems worth noting here, in this first column of the year.

The viability of any small-scale, relatively unprofitable cultural initiative is a function of two forces. One is the good will of the people directly involved. The other is getting support from the public -- or rather, creating one.

In this case, the process is made more difficult by the fact that First is sui generis. Which is putting it politely. My own response upon first encountering it about 10 years ago involved a little cartoon balloon forming over my forehead containing the letters “WTF?” It is not simply that it is hard to know what to expect next; sometimes it is hard to say what it was you just read. In First, political commentary, cultural analysis, and personal essays sit side-by-side. But at times, all three are going on at once, within the same piece. Kenneth Burke used to refer to such jostlings of the coordinate system as creating "perspective by incongruity." It signals a breakdown of familiar formats -- a scrambling of routine associations. This is stimulating, if perplexing. The confusion is not a bug but a feature.

One familiar description of the journal that I have come to distrust treats First as a bridge between popular culture and the ivory tower. An often-repeated blurb from some years ago calls it "the only leftist publication [one] could imagine being read at both Columbia University and Rikers.”

Good advertising, to be sure. But the better the ad, the more its presumptions need checking. The whole “building a bridge” trope implies that there is a distance to be spanned -- a connection between enclaves to be made. (The ideas are over here, the masses over there.) But reading First involves getting oriented to a different geography. Some academics do write for it, but they do not have pride of place among the other contributors, who include poets and musicians and journalists, and people who might best just be called citizens. The implication is not that there is distance to be crossed, but that we're all on common ground, whether we know it, or like it, or not.

In the wake of 9/11, some writers for First (not all of them) rallied to the call for a war, and at least one endorsed George W. Bush during the 2004 campaign. Does that mean that First is actually “the only ‘neoconservative’ publication read in both academe and prisons”? Well, no, but funny you should ask, because it underscores the convenience made possible by pre-gummed ideological labels.

At times they are useful (I tend to think "social-imperialist" is a pretty good label for the idea that "shock and awe" was necessary for historical progress in Iraq) but not always.

The discussion of Obama in the new volume is a case in point. Both Paul Berman (a Clintonian liberal who supported the Iraq War) and Amiri Baraka (who takes his political bearings from Mao Tsetung Thought) concur that the 2008 election was a transformative moment. This is, let's say, an unanticipated convergence. Meanwhile, Charles O’Brien (an editorial collective member who endorsed Bush in ‘04, on more or less populist grounds) treats Obama as a short-circuit in the creation of a vigorous radicalism-from-below needed for social change. “Of the Obama campaign, what endures?” he asks. “The new Pepsi ad.”

It would be wrong to see First as yet another wonk magazine with some cultural stuff in it. Nor is it one of those journals (edited on the bridge, so to speak) in which the latest reality-TV show provides the excuse for yet another tour of Foucault’s panopticon. Politics and culture come together at odd angles in the pages of First -- or rather, each spins out from some vital center that proves hard to pin down. Margin and mainstream are configured differently here.

I tried to get a handle on First's particularity by talking to Benj DeMott, who edited the two anthologies and is now working on the third. We spoke by phone. Taking notes did not seem like a plausible endeavor on my part, because DeMott's mind moves like greased lightning -- the ideas and references coming out in arpeggios, rapid-fire and sometimes multitrack.

But one point he made did stick. It concerned holding together a project in which the contributors do not share a party line, and indeed sometimes only just barely agree to disagree. It sounds complicated and precarious. Often, he said, it comes down to sharing a passion for music -- for sensing that both democracy and dancing ought to be in the streets. Politics isn't about policy, it's about movement.

That does not mean celebration is always the order of the day. The indulgence of academic hip-hop fans is legendary, but if you want to see what tough-minded cultural analysis looks like, check out the African-American film critic Armond White's reflections on white rapper Eminem in The First of the Year: 2009. The essay can be recommended even if its subject is now shrinking in pop culture’s rearview mirror.

“Rather than a symbol of cultural resistance,” writes White, “he’s the most egregious symbol of our era’s selfish trends. With his bootstrap crap and references to rugged individualism reminiscent of the 80s, he’s a heartless Reagan-baby -- but without the old man’s politesse.... His three albums of obstinate rants culminate in the egocentric track ‘Without Me,’ making him the Ayn Rand of rap -- a pop hack who refuses to look beyond himself.... Minus righteousness, angry rap is dismissible. Rap is exciting when it voices desire for social redress; the urge toward public and personal justice is what made it progressive. Eminem’s resurrected Great White Hope disempowers hip hop’s cultural movement by debasing it.”

Now, if you can imagine such thoughts ever appearing in an essay by Irving Howe -- let alone Irving Kristol -- then we can go ahead and describe First as inheriting the legacy of the New York Intellectuals.

Otherwise, it may be time to recognize and respect First for what it is in its own right: a journal of demotic intelligence, alive to its own times, with insights and errors appropriate to those times, making it worth the price of perplexity.

Only after talking to Benj DeMott did I read what seems, with hindsight, like the essay that best explains what is going on with the whole project. This is a long tribute -- far more analytical than sentimental -- to his father, the late Benjamin DeMott, who was a professor of English at Amherst College. He was a remarkable essayist and social critic.

It is time someone published a volume of DeMott senior's selected writings. Meanwhile, his influence on First seems pervasive. The younger DeMott quotes a letter written in his father’s final years -- a piece of advice given to a friend. It offers a challenge to what we might call "the will to sophistication," and its hard clarity is bracing:

"Study humiliation. You have nothing ahead of you but that. You survive not by trusting old friends. Or by hoping for love from a child. You survive by realizing you have nothing whatever the world wants, and that therefore the one course open to you is to start over. Recognize your knowledge and experience are valueless. Realize the only possible role for you on earth is that of a student and a learner. Never think that your opinions – unless founded on hard work in a field that is totally new to you – are of interest to anyone. Treat nobody, friend, co-worker, child, whomever, as someone who knows less than you about any subject whatever. You are an Inferior for life. Whatever is left of it....This is the best that life can offer. And it’s better than it sounds.”

This amounts, finally, to a formulation of a democratic ethos for intellectual life. It bends the stick, hard, against the familiar warp. So, in its own way, does First, and I hope the Web site and the series of anthologies will continue and prosper as new readers and writers join its public.

Scott McLemee

George Clooney Meets Max Weber

Spoiler alert: Max Weber’s life is an open book, thanks in part to Joachim Radkau’s wonderful new 700-page biography, so nothing to spoil there. But this essay does reveal the ending of Jason Reitman’s new film.

Thoughtful, intellectual movies are produced each year in the United States and abroad -- open texts rich with meaning, understood by critics or not. Some writers and directors begin with a premise, others stumble into one, and still others capture the zeitgeist and hit a chord, even if we cannot articulate precisely what it is. For me, not much of a moviegoer and certainly not a film critic, Up in the Air, the highly acclaimed new movie directed by Jason Reitman (he also directed Juno), and written with Sheldon Turner, resonates powerfully with some of my challenging student conversations of late.

There are no ground-breaking paradigms about human nature introduced in Up in the Air, just as we’ve not seen many of those in academic circles in recent times. But by trying to keep us engaged, Reitman manages to come face to face with the very best of 19th and early 20th century philosophy and sociology. It was during this period that the great theorists of industrialization and technology emerged with force – Marx of course, then Max Weber, Ferdinand Tönnies, and Emile Durkheim among others – exploring the relationships among rationality, morality, community, and the acceleration of technological change in all aspects of life.

By the end of the 19th century, the horrors of progress began to take hold in the sociological imagination, a theme that persisted into the 20th century through Foucault and his contemporaries. There are the cheerleaders for sure: Marshall McLuhan – brilliant as he was – could see very little dark side to the explosion of electronic media, for instance. And it is difficult to locate the downsides of advances in medicine or public health technologies without undue stretching. But Reitman is some sort of original and popular voice, struggling anew with the complex interface between rapidly-evolving technology (communication, transportation, finance) and human relations. It’s not a bad addition to a syllabus.

Let's start with Weber, the wildly abbreviated version: With regard to technology, progress, and capitalism, Weber detected a linear trend toward increasing rationalization, systematization, and routinization. In all aspects of life -- from the nature of organizations to the structure of religions -- we seek efficiency and system, in order to gain profit, leisure time, and fulfillment. This drive toward increasing organization, in all its manifestations, is too powerful to fight, given its clear appeal and "scientific" grounding.

Yet, Weber notes, all of this seeming success ultimately makes us less human: With increasing rationalization, we lose our collective spirit. He said, famously, that "each man becomes a little cog in the machine and, aware of this, his one preoccupation is whether he can become a bigger cog," a set of insights that drove him to despair. There are, Weber argued, occasional charismatic leaders who shake up our tidy world of rational calculation. But charismatic movements and people succumb to the inevitability of rationalization, crushed by a culture driven to success, results, and materialism. With no way out, Weber posits, we shall find ourselves in an "iron cage" of rationality, and the human heart will be impossible to locate.

To the film: Ryan Bingham (Clooney) is a consultant who spends most of his life on planes and in airports, traveling the nation as a professional terminator. He is with a firm hired by companies to lay off workers face-to-face (so the employer doesn’t have to), hand them a packet with their severance details, and deliver banal bits of inspiration intended to contain any extreme emotional reaction on the part of the unlucky employee. It’s a perfect Weberian job: Bingham produces nothing truly meaningful, keeps the wheels of capitalism spinning, has no personal relations muddying up his work, and makes good money for the firm.

This all goes well for Bingham; he has no interest in settling down (at least at the start of the film), and being in the air keeps his adrenaline pumping. But his firm has even higher ambitions to rationalize its business model and, with the help of a naïve 20-something M.B.A. type, moves to a system where professionals like Bingham can fire people over videoconference, thereby saving millions in travel costs. At the end of the film, after some unhappy results, the new system is pulled back for more study, and Bingham and colleagues get back on the road to once again fire people in person -- a method with more heart than firing by videoconference.

A victory against the forces of rationalization? After all, when Bingham fires people in person, there is something of a human touch. But the film undercuts that thesis as well, through another character: Alex Goran (played by Vera Farmiga), a professional woman and fellow road warrior. Goran is attractive and warm, but at base she is even more mercenary than Bingham: She too lives in the air, has impressive frequent-flyer tallies, and belongs to every premium class one can aspire to (car rental, hotel, airline, and so forth).

Bingham is impressed, having finally met his female match (she quips: “I’m you with a vagina”). He meets her in hotels across the country for sex appointments, falls in love with her, finds his heart, and is badly jilted in the end (Goran is married, though she never revealed this to Bingham). And while he may be badly hurt, she is sincerely puzzled that he failed to understand their unspoken contract: Why, he was part of her rationalized system -- husband and family in Chicago, fulfilling career, and Bingham for pleasure on the road.

One of the nice twists of the film is that the female character is a more highly evolved Weberian being than are the men: She has a seemingly happy life – she is content, not alienated or complaining – while Bingham struggles with the rationalization of love, the one aspect of human interaction he apparently thought could not succumb to a culture of calculation. He wasn’t paying for the sex after all; he actually liked her.

While Goran’s character -- a Weberian monster of sorts -- might worry us, she underscores a central problem with the rationalization thesis in an age of social networking, texting, and air travel. Weber and his followers did not foresee the humanization of technology that we see now, and I too have been slow to come to this. For years I taught my students about Weber’s iron cage; they understood it and they appreciated it. They understood how the ATM – for all its efficiencies – lessens human interaction (you’ll not meet anyone in a long bank line these days). They understood what is lost when poll results stand in for qualitative opinion expression, or how a phone call is essentially less human than a face-to-face interaction. The tension between progress and human connectedness – that it was a tradeoff, in fact – seemed to make good sense.

But I struggle to hold up my side of the argument these days. Students insist that their connectedness with friends and strangers, through communication technology, is real, fulfilling, fun, sincere, and intimate (e.g., “sexting”). Weber and I are dinosaurs who have no room in our theorizing for the highly social, extraordinarily human interaction that the Internet has enabled. Technology itself, the force we feared would crush the human spirit, turns out to enhance it.

Or so our students argue. We go round and round on this. And perhaps even those of us who have wrapped much of our intellectual existence around theorists like Weber will see the light, and treat those theories as important, but entirely historically-bound. Up in the Air passes no judgment on Goran’s lifestyle, and in fact, she may be the Übermensch. She controls her destiny and she directs the rationalization of her emotional life. While world-weary (a lot of airport bars, a lot of men), she has found her happiness, while Bingham remains a pathetic, troubled amateur.

Up in the Air encourages a revision of some Weberian views, but it also takes on some of our mid-20th-century sociological giants. Robert Merton, working in the tradition of Tönnies and Weber, argued that the dominant medium of his day – radio – had produced what he called pseudo-Gemeinschaft, or the "feigning of personal concern and intimacy with others in order to manipulate them the better," typically for profit. Whether it’s selling war bonds (he wrote on Kate Smith’s campaign) or the perpetual fake-friendly "it’s a pleasure to serve you" we hear constantly, Merton was bothered by the niceties layered atop brute business motive. Is it their pleasure or not? Do they sincerely like to serve us, or do they get points for it on a performance review?

In Up in the Air, our protagonist – thanks to his frequent flying – gets the special "personal" treatment from airline professionals and others. He knows it’s fake, but it is still a pleasurable and valued aspect of daily life. When I raise the old Merton argument with my students these days, they are not bothered by it at all, and Reitman sees the niceties much the same way -- as the state of nature in contemporary capitalism, not a repulsive, slavish persona designed by corporate headquarters. When Bingham finally gets his reward for traveling an extraordinary number of miles on the airline – a personal meeting with the top pilot – he is at a loss for words, after imagining the moment a hundred times in his fantasies. Even when we’ve survived the countless niceties and earned the real human touch, it’s not that great after all -- another puzzle for our backward hero.

It is far too generous to say that McLuhan was right, that technology has made us more human, brought us together in a global village of understanding, encouraged tolerance of difference, and connected us to our essential, spiritual, primitive and fuller selves. He slips and slides, preaches a bizarre narrative of human history, and ignores social structure and power dynamics as much as possible. But he did, and profoundly so, foresee something of the social networking of today -- how light might shine through what looks like a mechanical, calculating, and cold world of technological progress. Up in the Air sides with McLuhan and with my students: The film gives one answer to a depressed Weber, but my generation -- at least -- feels empty at the end, as we go back up in the air with Clooney.

Susan Herbst

Susan Herbst is chief academic officer of the University System of Georgia and professor of public policy at Georgia Tech.

The Turbulent Years

In October, the U.S. Department of Labor announced a fine of more than $87.4 million against BP North America Inc. for "failure to correct [the] potential hazards faced by employees" that had been uncovered by the Occupational Safety and Health Administration. It was an all-time record for an OSHA penalty against any company -- dwarfing the previous record, a mere $21 million levied in 2005 after an explosion at a BP refinery killed 15 people and injured 170 others.

Since last fall, BP has gone on to bigger things. A tone of moral indignation has been heard lately (on Capitol Hill, for instance) regarding those OSHA violations. But why the outrage? It’s just business. As long as risk to the company's workers can be translated into a calculable expense, decisions will be made on a rational basis. With an eye on the bottom line, the company can decide whether or not to install adequate equipment to protect either workers or the environment.

Or not to protect them, as the case may be. Profit is profit, and the ocean has no lawyer. Let's not pretend otherwise.

Of course, events might have unfolded very differently if the people working on the offshore rig had decided to shut production down when the company pushed them (once again) to cut corners and ignore danger signs. Every time I see a picture from the Gulf of Mexico, I wonder about that. But when politicians or people in the mass media discuss the situation, work stoppage by BP's employees is one possibility that never comes up.

The very idea seems almost unthinkable. It is easier to get mad at how flagrantly BP ignored safety violations than to imagine labor acting outside the established framework of government regulation and corporate decision making. Maybe BP can afford this failure of the imagination -- but I doubt the planet can, at least not forever.

So it’s a good time to have a new edition of Irving Bernstein’s two studies The Lean Years (1960) and The Turbulent Years (1969). Originally published by Houghton Mifflin, they have just been reissued in paperback by Haymarket Books and offer, between them, a classic survey of how American workers fared during the 1920s and ‘30s. SPOILER ALERT: They tended to do best when they had the confidence and the willingness to challenge their employers -- and not just over wages. Bernstein, who at the time of his death in 2001 was an emeritus professor of political science at the University of California at Los Angeles, makes clear that control over working conditions was usually also at stake.

What set Bernstein's work apart from the usual run of scholarship on American labor history at mid-century was his strong interest in the life and activity of non-unionized people -- including those working in agriculture, or leaving it behind for new kinds of employment, in the case of African-Americans leaving the South. And Bernstein wrote with grace. He had a knack for the thumbnail biography of ordinary people: There are numerous miniature portraits embedded in the epic. He was sensitive to the changes in mood among workers as they faced the boom of the 1920s (which passed most of them by) and the agony of the Depression (which hit them hardest). In many cases, they blamed themselves for their misery. The possibility of joining forces with others to change anything took a while to sink in.

The new paperback editions come with introductions by Frances Fox Piven, a professor of sociology and political science at the City University of New York Graduate Center, who draws out Bernstein's argument on this point: "The train of developments that connects changes in social conditions to a changed consciousness is not simple. People ... harbor somewhere in their memories the building blocks of different and contradictory interpretations of what it is that is happening to them, of who should be blamed, and what can be done about it. Even the hangdog and ashamed unemployed worker who swings his lunch box and strides down the street so the neighbors will think he is going to a job can also have other ideas that only have to be evoked, and when they are, make it possible for him on another day to rally with others and rise up in anger at his condition."

Quoting that passage gives me pause -- for Piven, a former president of the American Sociological Association, has in recent months been the focus of intricate theories about how Barack Obama was using ACORN to impose martial law on gated communities. Or perhaps ACORN was using Barack Obama to that end. I must admit some difficulty in reading the pertinent diagram. But in short, she has been involved in some quite nefarious activity, such as encouraging poor people to vote.

No doubt this will make Piven's endorsement of Irving Bernstein's two books seem particularly worrying. Only someone in the Tea Party (a well-funded movement organized by professional lobbyists) is supposed to "rally with others and rise up in anger at his condition" -- not an unemployed person who wants work and decent health care. Furthermore, protesters ought to direct their rage strictly at the government, and never at private enterprise.

I suppose the late Irving Bernstein will end up as a box in the big flow chart of cyclothymic, pseudopopulist political discourse. It seems like only a matter of time. But if you read his books, something eventually becomes clear. He thought the New Deal had saved capitalism and made it more fair. He was not fond of the Communists, who expected the Depression to work to their advantage. Before writing his labor histories, Bernstein specialized in collective bargaining. (Aside from publishing books on the subject, he served as an arbitrator in labor disputes.) The Turbulent Years is dedicated to Clark Kerr -- the president of the University of California system and a major target of the radical student movement in the 1960s.

In short, when Bernstein wrote with sympathy about the strikes and street fighting of the 1930s, it was not out of an instinctive combativeness but from a sense that people do these things because they have been left no choice by "an unbalanced society" (to borrow an expression he used to describe the United States on the eve of the crash of 1929). If his book sounds almost revolutionary now, that is a sign that the ordinary frame of reference for political judgment has skewed so far to the right that reality is standing sideways.

I contacted Frances Fox Piven to ask her opinion of this assessment.

"Bernstein definitely thought of himself as a centrist, but a reformer," she told me. "He was quite contemptuous, for example, of ideologues on the Left in the 1930s. But he was never contemptuous of workers themselves, and his respect and empathy for workers forced him to pay attention, even respectful attention, to the strikes and sit-downs and demonstrations they undertook during the 1930s. One of the consequences of the rise of a turbulent and aggressive labor movement was to open up normal politics, to move the political culture to the Left. The Civil Rights movement had a similar consequence thirty years later. It is chastening to observe that in the absence of mass movements from the bottom (and the Tea Party is not a movement from the bottom) that our politics reverts to a kind of default position in which business interest groups have outsized influence."

If a sufficiently "turbulent and aggressive" spirit had prevailed among the people working for BP just a couple of months ago, there might not now be one hundred thousand barrels of crude oil (by the company's own estimate) surging into the ocean every day -- with no end in sight.

Scott McLemee

Poisonous Knowledge

Every so often a thinker will earn a place in history through the force of a single really bad idea. Cesare Lombroso (1832-1909) was such a figure. Examining the physiognomy of known felons, living and dead, the pioneering Italian criminologist concluded that some people were organically predisposed to breaking the law. It was just in their nature. They were degenerates, in the strictest sense: biological throwbacks from civilized humanity to something lower on the evolutionary scale.

Various physical traits signaled the regression. This was the bright side of Lombroso’s theory, since it told you what to watch out for. Rapists tended to have abnormally round heads. Women with masculine faces and excessive body hair were a menace to society; a lack of maternal instinct made them capable of acts more vicious and depraved than male offenders. Left-handed men were closer to the state of "women and savage races," thus more prone to crime or lunacy than we law-abiding right-handers.

All of this proves less amusing given how influential Lombroso’s books remained into the early 20th century. Somebody probably went to jail for having a sloping forehead and asymmetrical ears.

A few years back, Duke University Press brought out translations of a couple of Lombroso’s works, which, apart from their historical significance, are fascinating for the images the esteemed researcher used to demonstrate his argument. They are haunting, especially the photographs. The faces wear various expressions: hardened, hungry, bitter, confused, terrified. Each evokes a long story of bad choices or bad luck, or both. I’m not sentimental enough to believe that all of them, or even most, were innocent. There are some tough customers who look ready to stick to their story, no matter what. (“That guy was already dead when I got there.”) But the crimes are long forgotten. What remains now is the trace of misery, caught in the gaze of a criminologist who has reduced them to specimens.

On page 78 of William Garriott’s Policing Methamphetamine: Narcopolitics in Rural America, published by New York University Press, there is the reproduction of a poster called “A Body on Drugs.” The author, who is an assistant professor of justice studies at James Madison University, found it taped to the wall of a sheriff’s office in “Baker County” -- the name he has given to an area in West Virginia where he did ethnographic fieldwork in the mid-2000s.

Garriott calls the poster “reminiscent of the catalogs of criminals from which the 19th-century criminologist Cesare Lombroso sought to discern the distinctive features of congenital criminality.” I will return to this idea later, but first should describe the poster itself. Because it has been reduced to the dimensions of a single page in a book, the text is almost impossible to read, but you can still make out the photographs, which show the long-term effects of methamphetamine use on the body through a combination of mug shots and close-ups, plus brain scans.

All of it is ghastly. “The arms and legs had open sores,” recalls Garriott, “the hands were scabbed and bandaged, the mouth was missing teeth, the brains showed signs of malfunction, and the faces were prematurely aged.” If anything, the images may understate the impact of meth. The festering sores result from an accumulation of toxins in the addict’s body; the toxins can also induce psychosis. “Cooking” meth in improvised labs, besides running the risk of explosion, generates extremely dangerous contaminants.

The social profile of crack cocaine, 20 years or so back, was black and urban, while meth’s “brand identity” tends to be white and (especially in recent years) rural. Garriott initially went to West Virginia as a cultural anthropologist to study “the treatment experiences of addicts working to overcome their addiction to meth,” he writes, “what I thought of as the ‘therapeutic trajectory of their recovery process.' ” The focus of the project shifted as Garriott noticed how often “drug problems generally, and the methamphetamine problem specifically, were framed locally as matters for the criminal justice system,” rather than as a medical issue.

To describe the relationship between addict and community, then, was impossible without assessing the role of the police. This is hardly surprising, and perhaps least of all in rural areas, where state and civil society tend to meet at the same diner and church socials. But Garriott’s analysis leaps from the ethnographic particulars to broad claims about what he calls the “narcopolitics” of meth. The term is modeled on Michel Foucault’s concept of biopolitics, which covers a host of ways the modern state seeks to monitor, classify, regulate, and control the population of human organisms within its territory. (However dubious Lombroso’s Darwinism, for example, his work is the perfect instance of a biopolitical strategy: identifying a defective and dangerous human subspecies enhances the power of the authorities over the social order. That was the plan, anyway.)

Once, the narcopolitical imperative was summed up in the slogan “War on Drugs,” which you don’t hear invoked much anymore. (To quote Detective Ellis from "The Wire": “You can’t even call this shit a war… Wars end.”) Yet the constant mobilization against illegal drugs not only continues but blurs the line between narcopolitics and the “normal” functioning of the state – including, in Garriott’s catalog, “the election of officials, the administration of justice, the practice of law enforcement and the formation of public policy (both foreign and domestic), the allocation of social services, the use of military force, the interpretation of law, and the behavior of the judiciary.”

And because the prosecution and incarceration of drug offenders is one of the few areas of governmental action with broad public support, narcopolitics serves to legitimate the state itself. Policing the availability of illegal drugs and the behavior of their users becomes a means through which the authorities establish and maintain public order -- or can at least be seen trying.

These tendencies become self-reinforcing. Drug abuse ceases to be a social problem. Rather, social problems, including violence and poverty, look like effects of criminal drug enterprises – which means resources should be channeled towards interdiction and incarceration.

With this notion of narcopolitical power -- as with just about any schema derived from Foucault’s work -- you soon get the sense of a juggernaut rolling over the landscape, flattening everything in its path, with nobody resisting because nobody can, and you’d pity the fool who tried.

In an epilogue, Garriott takes up the question of what reforms of the system his analysis might suggest -- then admits that none really follow. I respect his candor. If you can’t change the world, might as well interpret it, not that doing so makes much difference. But there is at times a strange disconnection between his analytic framework and his descriptions of life in Baker County.

The narcopolitical imagination, by Garriott’s account, “maps” social space according to its own imperative to track and control illegal substances. The community learns to define itself in opposition to the menace of drug dealers and addicts. Social anxieties become focused around them. The preferred response is punitive. Therapeutic treatment for meth abuse is something prescribed by the legal system; it is part of a continuum, with prison at the other end. And all of this functions in a closed loop -- with the problem always finally defined as a matter of criminality, thereby reinforcing narcopolitical power.

But Garriott’s fieldwork shows a community with every reason to regard meth as a real menace – not because it is a convenient explanation for social disorder, but because every phase of its existence creates actual dangers. (The author does not mention it, but cooking one pound of meth creates six pounds of toxic byproduct.) Recovery from addiction is difficult and rarely lasts for very long. Nor does the accumulation of narcopolitical power by the state generate confidence in its authority. Garriott notes rumors that local officials are failing to deal with the meth problem because they are somehow involved in trafficking. And while Foucault's thinking about biopower treated certain new disciplines (criminology, for instance) as modes of domination over the social field, the knowledge gained by the police and citizens clearly has the very opposite effect. Garriott quotes one officer saying, "Sometimes I wish I was more naive." The awareness that a trash bag on the side of the road might be filled with deadly chemicals from a meth lab is itself a kind of "poisonous knowledge," as the author puts it.

"A Body on Drugs," the poster mentioned earlier, is a concentrated bit of such poisonous knowledge. Garriott borrowed its title for the dissertation later revised as this book. His commentary treats the images as a contemporary narcopolitical variant of Lombroso's work, "drawing attention to a generic type of criminal and the signs by which they could be identified." Recognizing the open wounds, rotting teeth, and emaciation "made possible ... understanding both their physical appearance and their criminality as symptoms of their addiction." The poster did not say they were "born criminals," as Lombroso might. But the narcopolitical gaze was linking their biology and their criminality just as closely.

Having finished reading Policing Methamphetamine, I used a magnifying glass to examine the poster closely. You could see, for example, the little jar that one woman used to collect the imaginary bugs she felt crawling under her skin and removed with a knife. So I learned from the captions. There were some mug shots taken of people who had been arrested before becoming addicted to meth and then afterward. Garriott calls them "a concrete means of imagining the temporality of the relationship between drugs, addiction, and criminality," which is certainly one way of putting it. But in spite of prolonged squinting, I never saw any mention of criminality on the poster. That was not its point. It was about suffering.

Scott McLemee

Transforming Terror

Looking back at the early 21st century in their seminar rooms, somewhere down the road, historians might spare a few minutes to consider a short video shot, and posted to YouTube, on the day after Osama bin Laden was killed. It records an attempt by someone in a crowded New York City subway car to lead the other passengers in a triumphant chant of “USA! USA!”

The first half of the clip is evidence of the famous wall of indifference encasing each New Yorker while occupying public space -- and especially while riding the subway, where your car may be invaded at any moment by a roving mariachi band or someone delivering a loud sermon.

Having documented this familiar demeanor, the man with the camera expects to break it down by reminding everyone that Osama is now dead. The effort misfires. Nobody responds. The news is not cathartic. The anthropologist Victor Turner used the term communitas to name the state of collective intimacy, a collapse of social distance, accompanying certain kinds of rituals or festivals. The aftermath of disaster can create it, too -- and as the 10th anniversary of 9/11 approaches, there will no doubt be more and more tributes to the spirit of communitas that emerged then, for a while. Our would-be cheerleader expected it to be churning beneath the surface, now that 9/11 was avenged. But the faces he filmed say otherwise. For them it was just another commute, on just another Monday.

The little drama of awkwardness captured in this slice-of-life video is interesting because it embodies a real conflict over how to respond to an act of violence. One way is to celebrate it -- in this case, it seems, by assuming that revenge brings something to an end. ("We killed Osama, so now we're even.")

The other response proves more ambiguous and harder to characterize, though “resignation laced with dread” might be about right. That is the shade of my own ambivalence, at least. As someone living near enough to the White House to take the mission of the Flight 93 hijackers rather personally, I did not wish Osama bin Laden well. But celebrating his execution as a rite of closure seems both barbarous and bad magic; the spirit of revenge, once summoned, is hard to control. If the people in the subway car don’t start giving each other high-fives, that’s because some are already preparing themselves for the worst.

A new anthology called Transforming Terror: Remembering the Soul of the World, published by the University of California Press, strives for higher ground than either the “USA! USA!” camp or Team Stoic Pessimism. Edited by Karin Lofthus Carrington and Susan Griffin, it comes with a foreword by Desmond Tutu, who calls the volume “a path through which we might one day meet the challenge of terrorism and bring peace to our troubled world.” (Carrington has served as an adjunct professor of depth psychology at the Pacifica Graduate Institute, and Griffin is the author of several books, including A Chorus of Stones, a finalist for the Pulitzer Prize.)

Writing as “a witness to many beautiful and unexpected acts of courage and generosity,” the South African archbishop testifies to the possibility of redeeming the world -- something he understands as a human task as well as a theological doctrine. The editors share his vision, which they cast in terms of a cosmopolitan spirituality with psychotherapeutic overtones. That does not mean withdrawing from the world’s violence to contemplate higher things. Parts of the book constitute a tour of hell on earth: there are excerpts from accounts of lynching, car bombing, nuclear destruction, civil war, and genocide. “We are looking,” the editors write, “at the way terror damages the human psyche or, as the ancient Greeks called it, soul, and how it is through this damage that the world enters seemingly endless cycles of violence.”

They define terrorism to include “acts of violence against unarmed civilians, no matter who perpetrates them” or whether “purposeful or labeled as collateral damage.” This casts the net more widely than usual, of course. Grabbing the closest suitable reference book, I find that the 2000 edition of the Collins Web-Linked Dictionary of Sociology calls terrorism “a form of politically motivated action combining psychological (fear inducing) and physical (violent action) components carried out by individuals or small groups with the aim of inducing communities or states to meet the terrorists’ demands.”

Carrington and Griffin are much less concerned with the political motivation or consequences of terrorism than its defining effect: fear, trauma, powerlessness … in short, terror itself. The cycles of wounding, retribution, and brutalization pay no heed to the distinction between terrorism and war (which is, from this perspective, semantic).

Mixed in with the reports on violence and atrocity are excerpts from poetry (Federico Garcia Lorca, Theodore Roethke, Taha Muhammad Ali) and scores of essays, along with the occasional prayer or sermon from figures in various world religions. The material is divided into thematic clusters, with one chapter on “trauma, violence, and memory,” for example, and another on “gender and violence.” The first half of the book covers the psychic damage done by terror, including the pathological dimensions of the desire to strike back. The second considers various modes of nonviolent “paths to transformation,” including citizen diplomacy (informal exchanges among ordinary people from adversarial countries) and truth-and-reconciliation efforts in places where a state or powerful group has terrorized a population.

Even with this structure, though, the book is a bit of a dog’s breakfast. Some benefit might follow from reading things in sequence, but I found this impossible. Excerpts range from a few lines to several pages in length; the effect is jolting, and the attention wanders. And some of the editorial choices are unfortunate. Placing a traditional Buddhist prayer (translated by the Dalai Lama) right across from a passage by St. Thomas Aquinas isn’t a problem. A little like something the hip young clergyman in a romantic comedy might do? Sure. Otherwise it’s unobjectionable. But turning the Aquinas text into a piece of free verse that sounds like Kahlil Gibran is not much of a contribution to peace and justice.

On the other hand, the world is such a mess that it makes no sense to get too irritable with anybody trying to mend it. Transforming Terror has its heart in the right place, even if it does include the wisdom of Deepak Chopra. Treating war and terror as maladies of the soul can be reductive. But ignoring the wounds they leave means letting them fester, and it makes for a kind of madness.

Scott McLemee
