Anthropology

It's All Geek to Me

We have, by contemporary standards, a mixed marriage, for I am a nerd, while my wife is a geek. A good thing no kids are involved; we’d argue about how to raise them.

As a nerd, my bias is towards paper-and-ink books, and while I do indeed use information technology, asking a coherent question about how any of it works is evidently beyond me. A geek, by contrast, knows source code ... has strong opinions about source code ... can talk to other geeks about source code, and at some length. (One imagines them doing so via high-pitched clicking noises.) My wife understands network protocols. I think that Network Protocols would be a pretty good name for a retro-‘90s dance band.

This is more than a matter of temperament. It is a cultural difference that makes a difference. The nerd/geek divide manifested itself at the recent meeting of the Association of American University Presses, for example. Most people in scholarly publishing are nerds. But they feel like people now want them to become geeks, and this is not an expectation likely to yield happiness.

Christopher M. Kelty’s Two Bits: The Cultural Significance of Free Software, just published in dead-tree format by Duke University Press, might help foster understanding between the tribes. The book itself is available for free online. (The author also contributes to the popular academic group-blog Savage Minds.)

Kelty, an assistant professor of anthropology at Rice University, has done years of fieldwork among geeks, but Two Bits is not really a work of ethnography. Instead of describing geek life at the level of everyday experience or identity-shaping rituals, Kelty digs into the history and broader implications of one core element of geek identity and activity: the question of “open source” or “free” software. Those terms are loaded, and not quite equivalent, even if the nuance tends to be lost on outsiders. At issue, in either case, is not just the availability to users of particular programs, but full access to their inner workings -- so that geeks can tinker, experiment, and invent new uses.

The expression “Free Software,” as Kelty capitalizes it, has overtones of a social movement, for which openness and transparency are values that can be embedded in technology itself, and then spread throughout institutions that use it. By contrast, the newer usage “open source” tends to be used when the element of openness is seen as a “development methodology” that is pragmatically useful without necessarily having major consequences. “Free software” dates to the mid-1980s; “open source” was coined only in 1998. The fact that the terms are identical in reference yet point to a substantial difference of perspective is important. “It was in 1998-99,” writes Kelty, “that geeks came to recognize that they were all doing the same thing and, almost immediately, to argue about it.”

Much of Two Bits is devoted to accounts of how such arguments unfolded amidst the development of particular digital projects, with the author as a participant observer in one of them, Connexions (an online resource for the collaborative production of textbooks and curricular materials, previously discussed here). A merely nerdish reader may find some of this tough going. But the upshot of Two Bits is that geekery has constituted itself – through freeware, or whatever you want to call it – as what Kelty calls a “recursive” public sphere, with important implications for cultural life outside its borders.

Any strong notion of a public sphere is going to see the public as, in Kelty’s words, “a collective that asserts itself as a check on other constituted forms of power – like states, the church, and corporations – but which remains independent of those domains of power.”

The hard question, most of the time, is whether or not such a public actually exists. The journalist and social thinker Walter Lippmann considered the health of the public in three books he wrote during the 1920s, each volume gloomier than the last. And when Jürgen Habermas revisited the concept in the early 1960s, he concluded that the public sphere as a space of debate and rational deliberation had been at its most robust in the 18th century. More recently, Americans have made a hit out of a game show called “Are You Smarter Than a Fifth Grader?” in which adult contestants routinely prove that they are not, in fact, smarter than a fifth grader. All things considered, the idea of the public as a force that “asserts itself as a check on other constituted forms of power ... but which remains independent of those domains of power” does not seem to have much traction.

But geekdom (in Kelty’s analysis anyway) fosters a much more engaged ethos than that associated with earlier forms of mass media. This is not simply a matter of the well-known penchant for libertarianism in the tech world, about which there is probably not much new worth saying. (If consenting adults want to talk about Ayn Rand, that’s OK as long as I don’t have to listen.) Rather, the whole process of creating and distributing free software is itself, to borrow a programming term, recursive.

Per the OED, recursivity involves “a repeated procedure such that the required result at each step except the last is given in terms of the result(s) of the next step, until ... a terminus is reached with an outright evaluation of the result.”
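
To see what that looks like in practice, here is a minimal sketch in Python -- my illustration, not Kelty’s -- of the textbook case, the factorial function. Each step’s result is given in terms of the result of the next step, until a terminus (n equal to zero) is reached with an outright evaluation:

    def factorial(n):
        # Terminus: an outright evaluation of the result.
        if n == 0:
            return 1
        # Each step's result is given in terms of the result of the next step.
        return n * factorial(n - 1)

    print(factorial(5))  # prints 120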

Something like that dynamic -- the combination of forward motion, regressive processing, and cumulative feedback -- is found in geekdom’s approach to collaboration and evaluation. The discussions involved are not purely technical, but involve arguments over questions of transparency and the ethical implications of software.

“A recursive public,” writes Kelty, “is a public that is vitally concerned with the material and practical maintenance and modification of the technical, legal, practical, and conceptual means of its own existence as a public; it is a collective independent of other forms of constituted power and is capable of speaking to existing forms of power through the production of actually existing alternatives.” (Those alternatives take the form of technology that the rest of us use, whether we understand it or not.)

Two Bits is an effort to analyze the source code, so to speak, of geekdom itself. How the larger culture interacts with it, and is shaped by it, is a subject for another study. Or for quite a few of them, rather, in due course. For now, I think Kelty’s book deserves a wide readership -- especially among nerds trying to make sense of the past decade, let alone to prepare for the next one.

By Scott McLemee

Fear and Humiliation as Legitimate Teaching Methods

A psychoanalytically inclined friend of mine once told me that you can tell the important dreams not because you know what they mean, but because you can't get them out of your head. As an anthropologist I've noticed something similar about ethnographic fieldwork: You live through moments that immediately seem important to you, but it is only after chewing them over that you realize why. I had one such moment recently that taught me, deep down, that I firmly believe in the power of fear and humiliation as teaching methods. This insight came to me late last month in the course of having my ass kicked repeatedly by Kael'thas Sunstrider, son of Anasterian, prince of Quel'Thalas, and servant of Kil'jaeden the Deceiver.

This high valuation of fear and humiliation is not the sort of thing that you hear at the pep talks organized at your campus teaching and learning center. Perhaps this is not surprising given the non-traditional subject which provoked it. I study people who play World of Warcraft. Warcraft is one of the world's most popular videogames, home to over 10 million people who enter its high-fantasy world to become murloc-slaying gnomes and felsteel-smelting blacksmiths.

As players slay monsters and explore dungeons, their characters progress, become more powerful, and accumulate an inventory of ever more powerful gear. There are lots of things you can do in-game, from player-versus-player battlefields reminiscent of arcade shoot ‘em ups to obsessive hoarding of gold earned by, for instance, picking rare herbs and selling them to other players.

People play Warcraft for many reasons, but the guild that I am studying plays it to raid. Four times a week we get a posse of 25 people together to spend four hours exploring the most inaccessible, difficult dungeons in the game, finding computer-controlled “bosses” of ever-increasing difficulty, and slaying them. Of all of the things to do in World of Warcraft, raiding is the hardest and most intense. It requires powerful characters and careful planning. Of the 10 million people who play Warcraft, 9 million have never even set foot inside the places we have been, much less kicked the ass of the bad guys that we found there. We have a Web site, we have headsets, and we are serious. I don't study “the video game as genre.” I study the way American cultures of teamwork and achievement shape online interaction. As an observer my mind boggles at the 20-80 hours my guildies spend in-game every week. As a participant I'm super proud of our accomplishments.

Enough exposition. In late September our target was Kael'thas Sunstrider, the blood elf prince who broods in the floating Naaru citadel of Tempest Keep. The fight against Kael is legendary for its intricacy: First the raid must defeat each of his four advisors in turn. Then his arsenal of magic weapons must be overcome and turned against the advisors, whom Kael resurrects. Finally the raid has the opportunity to fight Kael and his pet phoenix. In the final stage of the fight, the raid must struggle to down Kael as he removes the gravity from the room and leaves the raid hanging, literally, in mid-air. Whole guilds have broken up in rancorous self-hatred after struggling unsuccessfully to down him.

Recently we tried to get some help by inviting to our raid members of another guild, which had already downed Kael. Almost immediately I could see why its members were successful -- their raid leader did not pull his punches. In the middle of a fight I would hear him saying things like "Xibby, don't think I don't see you healing melee -- please do your job and focus on the tank." At times -- like when our Paladin failed repeatedly to engage Thaladred the Darkener, who responded by repeatedly blowing up our warlocks -- voices were raised.

I was impressed by their professionalism, their commitment to high standards, and their leader's willingness to call people out when they made mistakes, but most of my guildmates didn't feel that way when we chatted after the raid in our online guild chat.

"i’m sorry but my husband dosen’t curse at me and no guy on wow will either" said Darkembrace, a shadowpriest who was also a stay-at-home mom in Virginia with a 3 year old daughter and a 75 pound rottweiler in the IM discussion.

"yeah," said our 18 year old tree druid Algernon, summing up the mood succinctly. "fuk them please never invite them back lol"

That raid passed into the guild's collective memory without further ado but, like an important dream, it kept running through my head. I had always known that raiding is a form of learning. It takes weeks of time and dozens of deaths before a guild-first boss kill, and even more time until a boss is so routinely killable that he is, as we say, “on farm.” But it wasn't until those Kael attempts that I realized just how similar raiding and teaching are.

A 25-person raid is the same size as a class, and like a class its leader can only take it to places that it is willing to go. Teaching, like learning to down a boss, is about helping people grow their comfort zone by getting them to spend time outside of it. The question is how to push people so that they will be ready to learn, instead of ready to tear their hair out.

Raiding has taught me that being a good teacher requires laying down strict guidelines while simultaneously demonstrating real care for your students. The stronger the ties of trust and respect between teacher and student, the more weight they will bear. In the past I've cringed when my raid leaders cheerfully announced that we would spend the next four hours dying over, and over, and over again to a boss who seemed impossible to defeat. But I've trusted them, done my job, and ultimately we have triumphed because they insisted on perseverance. The visiting raid leader who took us through the Kael raid lacked that history with us -- he was too much of a stranger to ask us to dig deep and give big.

A willingness to take risks can also be shored up by commitment and drive. Our guest leader drove my guildies nuts, but impressed me with his professionalism. Does this mean that after graduate school even generous doses of sadism seem unremarkable? Perhaps. But it also indicates that I was willing to work hard to see Kael dead, even if it meant catching some flak. For them, it was a game, and when it stopped being fun they lost interest.

What I learned that night was that I believe in the power of fear and humiliation as teaching methods. Obviously, I don't think they are teaching methods that should be used often, or be at the heart of our pedagogy. But I do think that there are occasions when it is appropriate to let people know that there is no safety net. There are times -- not all the time, or most of the time, but occasionally and inevitably -- when you have to tell people to shut up and do their job. I’m not happy to discover that I believe this, and in some ways I wish I didn’t. But Warcraft has taught me that there is a place for "sink or swim" methods in teaching.

We never did get Kael down. Shortly after our shared guild run, the powers that rule the World of Warcraft decided that the Kael fight was too hard and "nerfed" it -- made him lighter, fluffier, and easier to kill. We’re headed back in on Thursday, but our victory now seems as hollow as it will be inevitable. My guildies will take the nerf and love it, because burning down a boss that used to wipe them out will make them feel like gods. To me it will be a disappointment, because their pleasure in victory will be proof that we were never willing to do what we had to in order to become the kind of people who didn’t need the nerf.

Teaching is about empowering students, and Warcraft has taught me that there is a difference between being powerful and feeling powerful. We had a chance to grow as a guild, but in the end we just couldn't hack it. In the course of all this I learned that I am a person who believes that there are some things in life too important for us to give up just because achieving them might make us uncomfortable.

Anthropologists love to tell stories of their emotional communion with the people they study. This story ends on a darker note, because what I learned from my attempts to kill Kael'thas Sunstrider was that I was not the same kind of person as my guildies -- a fact made even more disconcerting by the fact that we are supposed to be members of the "same" culture. My fieldwork has not taught me to find commonality across cultures, but to see diversity within my own. Playing Warcraft has taught me that I have a dark side when it comes to pedagogy which I wish I didn't have -- I’ve realized that a seam of commitment that surfaced in one place in my biography lies hidden in another. Does this mean my guildies need to care more, or that I need to learn to care less? It’s a question that I try not to ask, because I’m afraid I might not like the answer.

By Alex Golub

Alex Golub is an assistant professor of anthropology at the University of Hawaii at Manoa who blogs at Savage Minds.

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: The Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis to work in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem which have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly from researchers who fear that scholarship will be placed in the service of war and counterinsurgency in Iraq and Afghanistan, yielding ideologically distorted work.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies,” according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative and block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, in much-reformed form, represents a model upon which future university-government interaction might be built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration take? There are several immediate steps that could be taken. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to the production of more effective policies. Special care must be taken to ensure that scholarly standards are not compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy-relevant, though still rigorous, scholarship. Fourth, university presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on and contribute to the solution of massive public conundrums that the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is an unaffordable luxury for universities at the moment. The present conjuncture requires enhanced public engagement; the stakes are too high to stand aside.

By Gabriel Paquette

Gabriel Paquette is a lecturer in the history department at Harvard University.

The Hope of Audacity

I am sick of reading about Malcolm Gladwell’s hair.

Sure, The New Yorker writer has funny hair. It has been big. Very big. It is audacious hair, hair that dares you not to notice it; hair that has been mentioned in far too many reviews. Malcolm Gladwell’s hair is its own thing.

Which is only appropriate, since in his writing, Gladwell has always gone his own way. But he’s been doing it long enough, and so well, and has made so much money, that some folks feel it’s time to trim him down to size. That hair is now seen as uppity.

Gladwell is a mere journalist. He’s not shy, and like many children of academics, he is not intimidated by eggheads. He does none of his own primary research, and instead scours academic journals to find interesting ideas -- he collects experiments and experimenters. He is a translator and a synthesizer, and comes up with catchy, sprightly titled theories to explain what he has seen. Some have called him a parasite. He has called himself a parasite.

It seems to me there’s always been a bit of snarkiness attached to discussions of Gladwell’s work. This is often the case for books that have become commercially successful, which is something that seems particularly to stick in the collective academic craw. There is a weird hostility in the reviews of Gladwell’s books that is directed not at the big-haired guy himself who, like a puppy, nips at the heels of academics and then relishes the opportunity to render their work into fluid, transparent prose, but toward those many people who have made Gladwell famous: his readers. No one matches the caustic condescension of Richard Posner, who said, in a review of Gladwell’s Blink, that “it’s a book for people who don’t read books.”

The reviews of Outliers, Gladwell’s latest book, show that even a New Yorker writer can go too far. People are now attacking Malcolm Gladwell as a kind of brand. The critiques boil down to a few things, one of which is that he doesn’t take into account evidence that refutes his theories. In other words, he’s not doing careful scholarship. But we all know that even careful scholarship is a game of picking and choosing -- it just includes more footnotes acknowledging this. And Gladwell never pretends to be doing scholarship.

Gladwell is also accused of being too entertaining. He takes creaky academic work and breathes Frankensteinian life into it. He weaves anecdotes together, creating a tapestry that builds to an argument that seems convincing. This, some reviewers have claimed, is like perpetrating fraud on the (non-academic) reading public: because Gladwell makes it so much fun to follow him on his intellectual journey, he’s going to convince people of things that aren’t provably, academically true. He will lull the hoi polloi into thinking they’re reading something serious.

Which is, of course, the most common complaint about Gladwell: He’s not serious enough. He’s having too much fun playing with his ideas. And, really, you can’t be Serious when you’re raking in so much coin. Anyone who gets paid four million bucks for a book that mines academic work -- and not necessarily the stuff that is agreed to be Important -- is going to become a target. His speaking fees are beyond the budgets of most colleges. In this way, his career is now similar to that of David Sedaris, who can command an impressive audience and still be dissed by the literary folks. Everyone who’s anyone knows that you can’t sell a lot of books and be a serious writer. Just ask Jonathan Franzen. Or Toni Morrison.

I don’t see Gladwell as a social scientist manqué, or a philosopher wannabe. Instead, I read him more like an essayist. I think of his books as well-written, research-packed, extended essays. Let me show you the evils of imperialism by telling you a story about the time in Burma when I was forced to shoot an elephant. Let’s look at this (bad) academic prose and think about the relationship between politics and the English language. But instead of using his own experiences, he builds on work done by others. He uses a wry, quirky approach and blithely ignores the received wisdom and pieties of academe. He doesn’t seek out the researcher who’s highly regarded within her field; he looks for people who are doing things he finds interesting.

Gladwell reminds me of the kind of student I knew in college, the nerd who takes weird and arcane courses and then rushes from the lecture hall excited about some idea the professor has mentioned in passing and goes straight to the library to pursue it himself. He stays up all night talking about it, and convincing you that even though you were in the same class, and heard the same reference, you have somehow missed something. Maybe not something big, but at least something really, really cool.

Perhaps I have more trust in readers than to believe that they can be so easily bought off by a good story. And I wish that academics, instead of pillorying Gladwell for being good at translating complicated ideas, would study the way he does it and apply some portion of his method to their own work: He makes mini trade books of monographs. Surely this is a lesson worth learning. He uses the narrative art of the magazine writer to animate ideas. He profiles theories the way Gay Talese or Joan Didion did celebrities.

The audacity Gladwell shows in his writing, connecting seemingly disparate things and working hard, yet with apparent effortlessness, to make the ideas engaging, gives me hope for the future of books. It makes me feel better to see folks buying Gladwell rather than the swimmer Michael Phelps’s memoir or vampire novels -- not that there’s anything wrong with that. Yet this same audacity is what gets Gladwell into hot water with academics. He’s not supposed to do this.

Unless you are an aged physicist, you don’t really get to write books that “purport to explain the world.” You can, of course, try to explicate tiny portions of it. Science writers like James Gleick and Jonathan Weiner can go a lot further than most scientists in terms of making arcane principles understandable to the Joe the Plumbers of the reading world and no one gets bent out of shape. Perhaps it’s because of the assumption that scientists, with a few notable (often British) exceptions, are not supposed to be able to write books that normal people can read. Social scientists and historians are, however, expected to be able to know what is interesting and important about their work and present it to the public. Brand name thinkers like Susan Sontag and Martha Nussbaum can take on big ideas. But these people are experts; journalists shouldn’t try this at home.

What I love about Gladwell is that his writing is like his hair. You can see it as arrogant or scary (he writes about being stopped more frequently by cops when he had a big afro), or you can see it as playful and audacious. This is why, of course, so many reviews mention it; he has the right hair for his work.

One final, dour complaint about Gladwell has to do with his relentless cheeriness. He thinks that people are basically good, though he understands that sometimes circumstances aren’t. I can’t abide high-brow literary novelists who trash fiction that “cops out” with a happy ending. Maybe I’m hopelessly low-brow: I still love Jane Austen and Shakespeare’s comedies. The academic response to most things is generally: it’s more complicated than that. And sure, much of the time it is. But if something’s artfully crafted, I’m willing to cut the author some slack. I don’t ever expect to be thoroughly persuaded of anything; I’m characterologically skeptical and like to do the thinking on my own. Gladwell’s books invite me into a conversation. I think that’s part of the job of a good book.

For me, reading Malcolm Gladwell’s books is like watching Frank Capra movies. Just because they make you feel good and keep you entertained doesn’t mean that they’re not doing valuable work or tackling hard and real issues and ideas. Sure, someone else could have handled it differently. George Bailey might have finally committed suicide; the bank in Bedford Falls could have asked for a government bailout. But right now, maybe it’s not such a bad thing to read books that are a little more hopeful. And yes, audacious.

By Rachel Toor

Rachel Toor teaches in the MFA program at Eastern Washington University. She writes a monthly column for The Chronicle of Higher Education, and her most recent book is Personal Record: A Love Affair With Running. Her Web site is www.racheltoor.com.

The Why and How of Human Terrain Teams

Inside Higher Ed recently published an interview with Roberto González, an associate professor of anthropology at San Jose State University, on the Human Terrain System (HTS), a U.S. Army program in which social scientists are embedded with military units. The questions were thoughtful and well asked, but the answers bear little resemblance to the work I conducted as a field social scientist deployed by HTS. I would like to explain what the goals of the program are, what we do, and why we do it, as well as try to clarify misperceptions that arise from unfamiliarity with military culture, terminology, planning and practice.

My job in Iraq was to represent the population in order to promote nonlethal planning and operations. When a mission is conceptualized, when course of action recommendations have to be made, when decisive points are identified for the commander, my job is to present what the population wants and expects, how it will react, and at all times to promote nonlethal options.

This last portion, the promotion of nonlethal options, is of exceeding importance for two reasons. The first is the nature of my mission, and the overall mission of the HTS -- we have an ethical responsibility to bring quality socio-cultural information and nonlethal possibilities to the commander’s attention. This is related to the second imperative, which goes to the heart of counterinsurgency (COIN) doctrine. The three most important elements of COIN are 1) to empower the lowest level (the population), 2) to work from the bottom up (the population), and 3) to recognize that nonlethal operations accomplish more than lethal ones. In a nutshell, my job is to keep the population, the effects of military operations on the population, and nonlethal options front and center in the commander and command staff’s awareness.

There are a number of ways that an HTT can keep the population and nonlethal options on the front burner. In the case of my team, we used very standard research and analysis methods to get at both primary and secondary open source data. At all times we endeavored to engage in best practices, both in terms of methodology and ethics. We essentially used four basic methods of collection: archival, process observation, participant observation, and semi-structured elite level interviews.

Our archival research had three different purposes. The first was to do our homework about our brigade’s operating environment before we deployed with them to Iraq. The second was to go through the information on the population already archived by the brigade that we were replacing. The final component was to keep abreast of political, social, religious, and economic events in our operating environment, Iraq, the Middle East, and in some cases, the U.S., which could affect the host nation population that we, and the Army, had to interact with on a daily basis. We also conducted process and participant observation at a wide variety of meetings and events. At all of these we identified ourselves fully, explained who we were and what we were doing (serving as socio-cultural advisors for the Army), and asked for permission to ask questions and to attribute or not. At all times we used standard, basic protocols for conducting process and participant observation.

When conducting our elite level interviews, part of a four-month-long tribal study and history, we used formal, documented informed consent. The documents were prepared in English and translated into Arabic; the interview subject retained one copy and I, as research director, retained the other. When requested, anonymity was granted. The Army personnel we worked with never had access to these, to the internal ethical review process of the team, or to the raw information of someone’s identity when anonymity was requested. In fact, because of the social science backgrounds of many of the officers we dealt with daily, they not only understood the protocols, but respected them. Moreover, on one occasion the protocols actually allowed me to provide necessary information to a battalion commander. The sheikh I had just interviewed had consented to my attributing his information, which allowed me to answer the commander’s questions without feeling like I was boxed in. Ethical and methodological best practices actually enabled me to properly do my job. On another occasion, information that I collected was useful in helping the battalion commander, as I provided information that presented a set of nonlethal options for resolving a problem regarding a local mosque.

The results of this four-month study, in combination with data acquired from participant observation among everyday Iraqis, as well as internally displaced persons, provided very important insights and findings regarding Iraqi tribal behavior, Iraqi politics, religion, rule of law, and the stabilization and reconstruction work being undertaken. The results are being prepared for peer review and publication.

The information we obtained was also packaged and provided to our brigade, the battalions, maneuver companies, as well as the embedded Provincial Reconstruction Team and the U.S. Department of State/U.S. Embassy. Had this information been available when Operation Iraqi Freedom was conceptualized, there would have been a greater chance of the initial stabilization and reconstruction being done in a better informed, more productive, and less lethal manner.

One of the other important points raised by Dr. González -- one that I would like everyone to understand -- has to do with Army terminology. I went out on patrol as often as I could. Going on patrol means going out with a combat element, but it does not automatically mean going out to engage in combat or lethal operations. I went out on every mission I could that involved taking humanitarian assistance to the local Iraqis. And here’s the thing to remember -- most of these involved going door to door. That’s right: The Army sends soldiers to towns, villages, and settlements to go door to door to deliver food, water, water purifiers, dental prophylaxis, toys, and other items on a regular basis. I also accompanied Civil Affairs teams to conduct assessments of infrastructure, attend meetings, and engage in medical operations among the local population.

In fact, while out on patrol my teammates and I were able to identify several archaeological sites. We brought this to the attention of brigade and battalion staffs, as well as the Cultural Heritage Officer at the U.S. Embassy and the head of the U.S. Army’s Archaeological Unit. We were able to preserve one site that was slated for development. And through collaboration with archaeologists at Penn State, University of Chicago, Harvard, the Army, and State Department, we created a comprehensive list and maps of all the sites in our operating environment so that the Army would know where construction could and could not take place.

The hallmark of good human terrain fieldwork lies in the reduction in the number of lethal operations and casualties inflicted and received. By doing our research, both primary and secondary, we were able to directly or indirectly conceptualize and influence virtually all of our brigade’s problem sets and provide nonlethal options to resolve them. My teammates and I were heavily involved with helping to write the brigade’s campaign plan. Every session began with the Plans Officer and/or the Line of Effort (LOE) Chief asking, “What does right look like for the Iraqis in our OE [operating environment], and how do we get them there?” Our job was to answer that question by taking our research and packaging it in a way that military personnel could easily and quickly digest. When we did this, we were able to ensure that the Army focused on the three most important aspects of COIN that I outlined above. This all translates into fewer injured or killed locals and, of course, fewer injured or killed American and Coalition Forces.

We do not do targeting or intelligence collection, nor do we engage in any part of lethal and kinetic operations, although we do, like everyone, retain the right to self-defense. Contrary to the program’s most vocal critics, we are not using social science methodology to enable the Army to kill more Iraqis and Afghans. In fact, one of our biggest successes was getting the Shriners Hospital in Boston, as well as a local Boston charity, to agree to treat a burned Iraqi boy and house and feed his family pro bono. When our Commander decided it was better for Iraqis to treat him, we worked with a sister team in another OE to facilitate his access to treatment within the Iraqi Ministry of Health system.

This goes right to another point on terminology: The Army calls everything they engage in “targeting.” For instance, when the Commander goes to have dinner with a sheikh, that is referred to as targeting. This can easily lead to confusion among those who do not work with the military, so we have been encouraging military personnel to use the terms “engage” and “engagement” instead of “target” and “targeting” for nonlethal operations. This is, actually, more than just a matter of semantics. By changing the way the military talks about nonlethal operations, we change the way they think about them, which further promotes nonlethal options.

In a nutshell, we are using our methodological skills to help the Army learn how to achieve their goals without having to use force. As someone with extensive methods training in five different disciplines, and who has taught research methods, I can think of no more noble use than to use these skills to preserve life whenever possible. How many research and teaching academics can say the same about how they use their skills?

There is one set of related items that Dr. González mentions in his interview answers that I would like to address here. Despite what some personnel from the Foreign Military Studies Office wrote, we are not a “CORDS for the twenty-first century.” CORDS, a Vietnam-era initiative, was a full-fledged counterinsurgency program, utilizing both military and civilian advisors who lived with the local populations that they were working with and trained them on all aspects of government and governance. Moreover, they were training these populations with regard to stabilization and reconstruction. Importantly, because the CORDS personnel actually lived among the host nation population, they lived and died with them, so, when necessary, they fought with them. Human Terrain personnel do not live with the host nation population, nor do we fight with them. Rather we live on the military bases, go out with a military security escort, and return to base after our engagements. We also are not involved with training the population, and we do not engage in stabilization or reconstruction projects. We are enabling advisors, not actors. The Provincial Reconstruction Teams, which are a State Department initiative, are the closest thing we have today to CORDS. The article that Dr. González mentions was published in the September/October 2006 issue of Military Review. As the first HTT did not deploy until February 2007, it was prepared well in advance of HTS becoming operational, and therefore cannot be construed as an accurate representation of HTS or its mission.

Project Phoenix, a separate Vietnam-era program, which too often is confused with, or mistakenly rolled into, CORDS, is also not an applicable historical analog to HTS. This was a program advised by the Central Intelligence Agency, and it largely involved Vietnamese personnel trying to root out Viet Cong political cadres with the help of a small number of civilian advisors -- mostly law enforcement personnel, not researchers. Unlike Project Phoenix, HTS is not engaged in the identification and neutralization of targets.

I also want to make it very clear: The U.S. Army’s Human Terrain System is not connected or affiliated with other programs that have adopted the terminology of human terrain. This is important, as Dr. González conflates HTS with these other initiatives; it is both inaccurate and unfair to HTS to paint us all with the same brush.

While it is absolutely right to be concerned about learning the lessons of the past, the simple truth is I have yet to see or experience any evidence of the neo-colonial counterinsurgency that Dr. González describes. Regardless of whether you supported the politics and/or policies that led us into our current conflicts, as Americans we have a moral responsibility to leave Iraq and Afghanistan in as functional and stable a state of existence as possible.

Regardless of your politics regarding the war, if one has the skills and knowledge to help out, even a little bit, and one chooses not to, what does that say about that individual or organization? This is the question that the many academics who have found it easy to criticize the Human Terrain System, either from ignorance, misinformation, or political opposition to the policy decisions that led us into the war in Iraq, need to ask themselves.

By Adam L. Silverman

Adam L. Silverman holds a doctorate in political science and criminology, master's degrees in religion and international relations, and a bachelor's in Middle Eastern studies. He was the 2nd Brigade Combat Team/1st Armored Division field social scientist and socio-cultural advisor assigned to HTT IZ6 and is currently a social science advisor with the Human Terrain System. The ideas and opinions expressed in this essay are his alone and do not necessarily reflect the opinions of the brigade, division, U.S. Army, or the Human Terrain System.

Kass Backwards

Last week Leon Kass, chairman of the President's Council on Bioethics under George W. Bush, took to the podium to deliver the Jefferson Lecture of the National Endowment for the Humanities -- an event I did not go to, though it was covered by one of IHE's intrepid reporters.

My reluctance to attend suggests that, without noticing it, I have come to accept Kass’s best-known idea, “the wisdom of repugnance.” There is, alas, all too little evidence I am getting any wiser with age -- but my visceral aversion to hearing a Bush appointee talk about human values is inarguable.

As you may recall, Kass wrote in the late 1990s that the repugnance many people feel toward biotechnological developments such as cloning is “the emotional expression of deep wisdom, beyond reason’s power fully to articulate it.” In our rising gorge, he insisted, “we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear.... Shallow are the souls that have forgotten how to shudder.”

Judged simply as an argument, this is not, let’s say, apodictically persuasive. Anyone who has ever taken an introductory anthropology course, or read Herodotus -- or gone to a different part of town -- will have learned that different groups feel disgust at different things. The affect seems to be hard-wired into us, but the occasions provoking it are varied.

Kass invoked the “wisdom of repugnance” a few years before he joined an administration that treated the willingness to torture as a great moral virtue -- meanwhile coddling bigots for whom rage at gay marriage was an appropriate response to “the violation of things that we rightfully hold dear.”

Now, as it happens, some of us do indeed feel disgust at one of these practices, and not at the other. We also suspect that Kass’s aphorism about the shallowness of souls that have forgotten how to shudder would make a splendid epigraph for the chapter in American history that has just closed.

In short, disgust is not quite so unambiguous and inarguable an expression of timeless values as its champion on the faculty of the University of Chicago has advertised. Given a choice between “deep wisdom” and “reason’s power fully to articulate,” we might do best to leave the ineffable to Oprah.

There is no serious alternative to remaining within the limits of reason. Which means argument, and indeed the valuing of argument -- however frustrating and inconclusive -- because even determining where the limits of reason lie tends to be very difficult.

Welcome to modernity. It’s like this pretty much all the time.

The account of Kass's speech in IHE -- and the text of it, also available online -- confirmed something that I would have been willing to wager my paycheck on, had there been a compulsive gambler around to take the bet. For I felt certain that Kass would claim, at some point, that the humanities are in bad shape because nobody reads the “great works” because everybody is too busy with the “deconstruction.”

It often seems like the culture wars are, in themselves, a particularly brainless form of mass culture. Some video game, perhaps, in which players keep shooting at the same zombies over and over, because they never change and just keep coming -- which is really good practice in case you ever have to shoot at zombies in real life, but otherwise is not particularly good exercise.

The reality is that you encounter actual deconstructionists nowadays only slightly more often than zombies. People who keep going on about them sound (to vary references a bit) like Grandpa Simpson ranting about the Beatles. Reading The New Criterion, you'd think that Derrida was still giving sold-out concerts at Che Stadium. Sadly, no.

But then it never makes any difference to point out that the center of gravity for argumentation has shifted quite a lot over the past 25 years. What matters is not actually knowing anything about the humanities in particular -- just that you dislike them in general.

The logic runs something like: “What I hate about the humanities is deconstructionism, because I have decided that everything I dislike should be called ‘deconstructionism.’ ” Q.E.D.!

Kass complained that people in the humanities fail to discuss the true, the good, and the beautiful; or the relationships between humanity, nature, and the divine; or the danger that comes from assuming that technical progress implies the growth of moral and civic virtue. Clearly this is a man who has not stopped at the new books shelf in a library since the elder George Bush was Vice President.

And so last week’s Jefferson lecture was, perhaps, an encouraging moment, in spite of everything. With it, Leon Kass was saying farewell to Washington for, with any luck, a good long while. Maybe now he can spend some time catching up with the range of work people in the humanities have actually been doing. At the very least he could read some Martha Nussbaum.

Then he might even pause to reflect on his own role as hired philosopher for an administration that revived one of the interrogation techniques of the Khmer Rouge. The wisdom of repugnance begins at home.

By Scott McLemee

Midsummer Miscellany

The fall books have already started piling up. There are titles I’ve asked the publishers to send, and the ones volunteered by eager publicists, and the ceaseless influx of small books of poetry, which fill me with guilt for watching “Law & Order” reruns instead of reading them. (But verily, man cannot live by print alone.)

Before the new publishing season begins and they are lost in the flood, let me take a quick look here at a few recent titles – books I have found absorbing and rewarding, but not had a chance to discuss in this column. The list is miscellaneous, and the tip of an iceberg. I doubt they have much in common. But each title is a reminder of the fine and irreplaceable work that university presses do with no fanfare, and seldom much recognition. And let’s not even talk about profit.

The arrest of Henry Louis Gates Jr. (a.k.a. Gates-gate) has generated great heat but not much light. Various media loudmouths have been outdoing themselves in portraying the Harvard professor as some kind of wild-eyed radical. This is, of course, a matter of ignorance, robust and unashamed. Gates is in reality the most anodyne of centrists. But at least the furious fantasies he has provoked should put to rest for good any notion that the United States has lately turned into a “postracial” society.

It seems like a good moment to recommend Pulitzer Prize-winner Steven Hahn’s new book The Political Worlds of Slavery and Freedom, a compact but challenging volume published by Harvard University Press. The author is a professor of history at the University of Pennsylvania. His three chapters -- each a miniature monograph -- are based on a series of lectures at Harvard, given at Gates’s invitation.

Hahn looks at the complex way the African-American struggle for emancipation took shape both under slavery and in the wake of its abolition. This process involved the creation of institutions for self-governance, as seen in “the efforts of newly freed people to reconstitute their kinship groups, to form squads and other family-based work groups, to pool community resources, and, of course, to acquire land.”

These weren’t just social movements. They contained, argues Hahn, a political element. Hahn considers whether the activity of black Southerners during the Civil War amounted to a variety of slave revolt, and he sketches aspects of the political life of Marcus Garvey’s pan-Africanist group in the United States in the early part of the 20th century. Only the most small-minded conception of American life would assume that these are matters of interest only to black readers. In a healthy culture, this little book would be a best-seller.

A few months ago, an editor asked me to review Adina Hoffman’s biography of Taha Muhammad Ali, My Happiness Bears No Relation to Happiness: A Poet’s Life in the Palestinian Century, published by Yale University Press. To tell the truth, my heart did not initially leap at the opportunity. For I had never read any of his poetry, and rather feared that it might be full of slogans -- that, indeed, the poet’s own life might be one long slogan.

This proves that I am an idiot. A couple of sessions with his selected works revealed Ali to be a wry, ambivalent, and often understated lyricist. (In translation, at least, he seems a little bit like Edgar Lee Masters.) The figure who emerges from Hoffman’s biography is that of a quiet shopkeeper in Nazareth who carefully studied Arabic literary tradition, and also absorbed the influence of the Palestinian nationalist “resistance literature” – then created his own distinctive style: one stripped-down and unrhetorical, but sensitive as a burn.

One of the remarkable things about this biography, as indicated in my review, is that it evokes not only the political and historical context of Ali’s work, but also how his poetry took shape. Its quietness and simplicity are hard-won.

At the other extreme from Ali, perhaps, is Walt Whitman, whose poetic voice is booming, and whose persona always seems a couple of sizes too large for the North American continent. A couple of years back, Duke University Press reprinted his one and only novel: a cautionary tale of the perils of strong drink called Franklin Evans, or The Inebriate. I have somehow never gotten around to reading it, and probably never will. But it is impressive to think that Whitman grew to his familiar cosmic dimensions while stone cold sober.

His poetry certainly intoxicated the readers portrayed in Michael Robertson’s Worshipping Walt: The Whitman Disciples, published by Princeton University Press. The noun in its subtitle is no exaggeration. The disciples found in Whitman’s work something akin to a new scripture -- nearly as much as followers of Joseph Smith or Mary Baker Eddy did in the Book of Mormon or Science and Health.

You can still find R.M. Bucke’s Cosmic Consciousness (1901) -- where Whitman is identified as “the best, most perfect example the world has so far had of the Cosmic Sense" -- in New Age shops. Other disciples took his “chants democratic” as hymns for a worldwide socialist commonwealth. And his invocations of manly “adhesiveness” were understood by a few readers as a call for what later generations would term gay liberation. Whitman insisted that his homophile readers had misunderstood him, and that when not writing poetry he had been busy fathering illegitimate children all over these United States. The biographers will continue to hash that one out -- though it’s clear that his literary persona, at least, is ready to couple with anything that moves, regardless of gender.

Whitman’s work gave some of his Victorian readers a vision of the world extending far beyond the horizon of the familiar and the acceptable. No surprise that they revered him as a prophet. Robertson, a professor of English at the College of New Jersey, tells the story of his steadily expanding circle of enthusiasts (which at one point aspired to become a global movement) with due appreciation for how profound the literary experience can be, when the right book falls into the right person’s hands.

Of course there are times when reading is nothing but a guilty pleasure. So to go from the sublime to the sleazy, I have to recommend Jack Vitek’s The Godfather of Tabloid: Generoso Pope Jr. and the National Enquirer, published by the University Press of Kentucky late last year.

Not that the book itself is sleazy. The Enquirer may specialize in celebrity gossip, horrific crimes, UFO abductions, and Elvis Presley's posthumous itinerary, but the author -- an associate professor of English and journalism at Edgewood College, in Madison, Wisc. -- is serious and measured in his approach. Vitek tackles his subject with all due awareness of its lingering cultural relevance. Pope modeled himself on newspaper tycoons such as William Randolph Hearst and Joseph Pulitzer.

Pope also happened to have family “connections” (as the preferred expression has it) with what its members do not call the Mafia, and he spent about a year working for the Central Intelligence Agency. This makes it especially interesting to consider the mission statement he released when he bought a local tabloid called the New York Enquirer in 1953. “In an age darkened by imperialist tyranny and war,” it said, “the New York Enquirer will fight for the rights of man, the rights of the individual, and will champion human decency and dignity, freedom and peace.”

Any biography moving between lofty rhetoric and very low company is bound to be pretty absorbing. The Enquirer, after it went national, reached a peak circulation of 6.7 million copies per issue in the late 1970s, with Pope playing an aggressive role in crafting its distinctive strain of populist sensationalism.

In a footnote, Vitek points out that Fredric Jameson’s analysis of postmodernism somehow overlooked Pope’s role as formative influence within what Jameson calls "the degraded landscape of schlock and kitsch." Quite right -- and it is good to have this oversight finally corrected.

Scott McLemee

A Peek Behind the Veil

OK, so, into a bar walk an Anglican priest, a Muslim imam, a Jewish rabbi and an atheist. Sounds like a ramp to a punch line, right? No. That was my panel last month at the 20th anniversary of the Oxford Round Table, at the University of Oxford, England.

Apparently, a peek behind the veil of ORT is needed. Recent posts in the academic blogosphere about this invitation-only academic symposium feature adulation for the intelligentsia it attracts and castigation of Oxford for trading on its name for summer business, like some sort of pedagogical Judas.

Fact is, they’re both right. Mind, matter and merger summarize why the event both enchanted and irritated me.

Mind Over Matter

First, pundits need not dismiss its scholarly heft. Formidable participants do darken the doors. My symposium -- “Religion and Science After Darwin: Effects on Christians and Muslims” -- featured sessions with distinguished thinkers in physics, biology, religion and law from all the intellicrat schools you might imagine: Oxford, Harvard, Boston U., UNC-Chapel Hill, Rutgers, etc. It’s not every day you spend time with David Browning (an icon of Christian-Islamic comity), Robert Neville (23 books and counting), Amedee Turner (a member of the European Parliament when the euro was established), or the ardent atheist Richard Dawkins (The God Delusion).

Further stamps of legitimacy on the program include ORT trustee Charles Mould, former secretary of Oxford’s 400-year-old, 11-million-volume Bodleian Library, and a 16-member advisory committee of university presidents and rectors from eight countries. The manuscripts in its blind-reviewed journal, Forum on Public Policy, also bear the marks of quality.

However, since the program began in 1989 with ministers of education from 20 countries, the invitation system has eroded to include mid-level researchers and engaged academics like me from teaching institutions -- from ministers of education to an educator with ministerial credentials (and a few relevant publications). Try to tame the jokes about Darwinian devolution.

The intellectual temperature was warm, not hot. This is where I’m supposed to say, “but all were meaningful contributors.” Truth is, some members of our panel were alien to the work, setting more than one head scratching. The good news is that neither title, institution type, nor academic discipline was the indicator. Candid confrontation carried the day, based on the quality of ideas. I’m the better for hearing it all. (I’m supposed to say that, too.)

As for how aliens gather: one event organizer candidly confessed that the University of Oxford bills the ORT organization heavily for the use of its facilities, and that, like most universities in the modern economy, Oxford depends on summer conference “hotel” business to get by.

The ORT itself is, of course, a business (albeit a nonprofit one), which explains why it folded two smaller symposia into one fumbling theme. That irked me. It was like bringing a fruitcake to a wine and cheese party. I came dressed for interfaith democracy since 9/11. Others came with erudite philosophies of science.

Most organizations can’t get away with last-minute theme mergers, but a week of collective transfixion at the oldest university in the English-speaking world seems to push otherwise central concerns, like the event’s purpose (!), out of mind for most participants.

Matter Over Mind: Pub and Pulpit

Oh, but the place is intoxicating, and place matters. If space inspires thought or ambition, the ORT venue should produce the most luminous luminaries on the planet. I’ll spare you predictable fawning over this medieval city, where every castle and cathedral shows such artisan care that the place is fabled as “the city of dreaming spires.” The point: ORT wouldn’t work in Albuquerque.

It’s not intention that the American Southwest lacks, but history, deep academic history, and the continuity one feels holding forth at an ancient lectern presided over by 800 years of political, scientific and religious savants.

Both pubs and pulpits have nurtured greatness here for centuries. Understated six-inch plaques across the city commemorate such landmarks with a density of meaning only Oxford could afford.

To the pub: on one side of town is a tiny booth in The Eagle and Child tavern where C.S. Lewis and J.R.R. Tolkien met every Tuesday for 25 years -- “the conversations that have taken place here," its plaque reads, "have profoundly influenced the development of 20th century literature,” from The Chronicles of Narnia to The Lord of the Rings.

And to the pulpit: across town is an unassuming though well-crafted podium in a Gothic cathedral from which John Wesley preached his conversion story and launched the Methodist movement that, in part, propelled my own institution into being. There his brother Charles penned hymns now sung in every Christian church on the planet. In a word, cool.

The significance of location fits ORT, as described by the French philosopher Gaston Bachelard in The Poetics of Space: certain places reduce us to silence; they contain more than their objectivity. Sometimes you feel “inside an essential impression seeking expression.” And the recall of such spaces, as I perform it for you now, becomes not history -- “I was there” -- but a kind of poetry that memorializes moments. Bachelard says, “The great function of poetry is to shelter dreams.”

For too many academics, dreams of significance are extinguished in a chemical bath of routine responsibilities (e.g., recommendation letters, grading, meetings). But such dreams require opportunities to perform. The University of Oxford’s space holds sufficient cachet to revive academic dreams, requiting a love for elevated and sublime learning.

Mind and Matter Merger: Leaving in Tension

Alas, learning without tension is entertainment. Mind and matter merged for me during one session in the Victorian-era Oxford Union debate chamber -- affectionately called “the last bastion of free speech in the world.” Recently the Holocaust denier David Irving and the “sex-positive” community builder Joani Blank spun yarns there; the likes of Yasser Arafat, Desmond Tutu, and a Kennedy or two get tossed in here and there. In that space all the tensions of the Oxford Round Table, real and symbolic, came together for me.

Standing at the podium was Dawkins. I’ve never been insulted with such kindness. He artfully mixed wink-and-smile sarcasm with bald jabs at theist stupidity, and appeared to relish the provocation. Had I not read some of his work, I would’ve thought it mere gamesmanship, superficial wordplay for positions not fully held.

Yet there’s a likability in him somehow, a most unexpected thing for me to feel as an evangelical Christian. I wished I had more time with him, but not in the way that morphed middle-aged scientists into giddy children after the Q&A, lining up hurriedly with the front flaps of their Dawkins books in one hand and an autograph pen in the other. Here was an orgy of secularism, loud and proud, baby.

Seated next to him in poetic paradox, head in hand, was the veteran vicar Brian Mountford of the ancient University Church of St. Mary’s, the original site of Oxford coursework and the physical and spiritual hub of a city and campus with 40 chaplains. Twice per term, in fact, the “university sermon” is delivered here, dignitaries in tow.

Not only does this priest share the platform with Dawkins, shepherding souls in a landscape of logical positivism, but imagine this: he’s also Dawkins’s neighbor. What a delicious irony! That’s better than McDonald’s and Burger King on the same corner.

Mountford reconciles this tension, in part, through self-described liberal theology. Our talk, his spring sermons, and his book, Perfect Freedom: Why Liberal Christianity Might Be The Faith You’re Looking For, express a “low view of the church” (it institutionalizes discipleship, stripping salvation of its freedoms); an “embrace of the secular” (the Church should not assume society is ethically less sophisticated than itself); soft judgment (“God would not condemn his creatures to eternal torment”); and the “championing of doubting Thomases on the fringe.” He sees this as being “more evangelical than the evangelicals” -- courting scoffers almost Socratically while provoking believers (“sermons send us to sleep because they are totally uncontroversial”).

But for me, a theological conservative, here strikes another strand of tension, beyond the ridiculing atheist “neighbor” we’re charged to love. Here is faith diverging between two likable people -- a theological gap Mountford once described as “chalk and cheese,” things that just don’t go well together.

Such was ORT for me: enchantment and irritation, the merger of chalk and cheese.

En route to the airport, I had two books under my arm: Dawkins’s The God Delusion and Antony Flew’s There Is a God: How the World’s Most Notorious Atheist Changed His Mind.

Agitations get me thinking. I’m the better for it, remember?

Gregg Chenoweth

Gregg Chenoweth is vice president for academic affairs at Olivet Nazarene University and a practicing journalist for a variety of magazines and newspapers.

Engaging the Military

Over the past two years, there has been considerable controversy over attempts by the Pentagon to recruit anthropologists and other social scientists to assist in counterinsurgency operations in Iraq, Afghanistan, and elsewhere in the “global war on terror.” Like the American Psychiatric Association and American Medical Association, which banned members’ participation in torture and interrogation, anthropologists have widely criticized the use of anthropology in counterinsurgency as unethical.

Of particular concern has been the U.S. Army’s “Human Terrain Team” program under which (sometimes armed) social scientists are embedded in brigades deployed in Iraq and Afghanistan to provide cultural knowledge that assists with combat operations. Many anthropologists agree that the Human Terrain program and other counterinsurgency activities violate the American Anthropological Association’s code of ethics, which commits members to do no harm to the people with whom they work, prohibits covert research, and requires researchers to obtain informed consent and to avoid doing things that could endanger the work of future anthropologists. Many have likewise criticized the recruitment of anthropologists as an effort to forestall bringing troops home from Iraq and Afghanistan, continuing the policies that have left the United States mired in deadly, unpopular wars.

Spurred by such concerns, in October 2007, the executive board of the American Anthropological Association (AAA) called the Human Terrain program “an unacceptable application of anthropological expertise.” Between 2007 and 2008, more than 1,000 anthropologists agreed to boycott the program, signing a pledge of non-participation in counterinsurgency as part of a campaign organized by the Network of Concerned Anthropologists (I am a member of the steering committee).

Supporters of the Human Terrain program have often claimed that those opposed to working in the wars are advocating total academic disengagement from the military and a retreat to the ivory tower. This could not be further from the truth. Most opponents of the Human Terrain program, myself included, are not categorically opposed to work and engagement with the military. To the contrary, many believe that anthropologists can ethically teach soldiers in classrooms, train peacekeepers, or consult with military and other government officials about cultural, social, historical, and political-economic issues.

Indeed, the campaign against anthropological collaboration in counterinsurgency has coincided with and helped fuel a recent efflorescence of research and work on an expanding array of issues related to the military and foreign policy. Far from calling for a retreat to the ivory tower, a growing number of anthropologists are actively involved in research both with and about the U.S. and other militaries, foreign policymaking and policymakers, war, conflict, and militarization.

Inspired by anthropologists like Laura Nader, Kathleen Gough, Mina Davis Caulfield, Marshall Sahlins and Eric Wolf, anthropologists have studied topics as diverse as nuclear weapons policy, the training of foreign military personnel at the School of the Americas, the shadowy world of the global arms trade, and the harmful effects of military bases. My own research has investigated the creation of the secretive U.S. base on Britain’s Indian Ocean island Diego Garcia, the expulsion of the island’s indigenous people during development of the base, and the significance of the base for U.S. foreign policy.

As a result of this work, I recently attended a two-day meeting of anthropologists, historians, sociologists, and political scientists organized by the newly founded Eisenhower Research Project for the Critical Study of Armed Forces and Militarization. Hosted by co-directors Catherine Lutz and Aaron Belkin and project manager Christina Rowley at Brown University’s Watson Institute for International Studies, participants discussed subjects as diverse as U.S. military spending (which now equals or exceeds that of all the other nations of the world combined), military checkpoints in Iraq, the increasing use of remote-controlled robots and other advanced technologies in war, the military’s role in the war on drugs, the militarization of the U.S. border, the armed services’ dependence on so-called military wives and military families, and the role of Hollywood and popular culture in glorifying war.

Most importantly, the interdisciplinary group of scholars dedicated itself not just to conducting research on military issues, but also to attempting to influence national conversations and public opinion about military and foreign policy. For too long in the past anthropologists and other social scientists have indeed isolated themselves in the ivory tower, ceding policy debates to international relations and security scholars, to think tanks generally invested (intellectually or literally) in war, to arms manufacturers’ lobbyists, to pundits, politicians, and the Pentagon.

Our nation is at a critical moment in determining the role the military is going to play in the world and the shape of our relations with other nations. President Obama has indicated his desire to chart a different course in the nation’s foreign policy from that of President Bush, to make diplomacy, cooperation, and engagement the hallmarks of U.S. international relations.

And yet, while slowly trying to extricate the nation from a deadly, illegal war in Iraq, we appear ready to repeat the same mistakes of that war, and Vietnam before it, in pursuing an increasingly violent war of occupation in Afghanistan — a nation where the British and Soviet empires failed before us in their attempts to impose foreign rule. Rather than learning from these past mistakes, from the lesson that there can be no military solution to the challenge posed by the Taliban and others resisting occupation, the escalation of U.S. troops and bombing in both Afghanistan and Pakistan is an increasingly bloody diversion from the political, economic, and diplomatic initiatives that must be at the heart of any solution to violent conflict.

Given the growing crisis in Afghanistan, which threatens to derail Obama’s agenda abroad and at home, the skills and original perspectives of anthropologists and other social scientists are desperately needed to build a new direction for U.S. military and foreign policy. This will mean conducting research of direct relevance to the U.S. military, to the State Department, and to the dynamics of U.S. global relations. This will mean shedding anthropologists’ traditional hesitancy about proposing prescriptive solutions to identified problems (the bread and butter of many international relations scholars). This will mean writing not primarily for academic audiences but instead for policymakers, politicians, and the wider public.

It will mean doing so not in the pages of (generally obscure) academic journals, but in the op-ed pages of newspapers, for blogs and major web outlets, and for the likes of Foreign Affairs and Foreign Policy. And it will mean building on efforts like the Eisenhower Research Project to create a new breed of policy think tanks — think tanks staffed by a diverse group of social scientists, driven by empirical research, and frequently working in collaboration with military leaders and others in the national security bureaucracy to create new policy approaches.

The Pentagon’s efforts to recruit anthropologists for the wars in Iraq and Afghanistan represent the failure of U.S. foreign policy rather than innovation. They are a return to the sad beginnings of anthropology — the “handmaiden of empires” — when the discipline was born as a tool to assist in the rule and control of colonized peoples in Africa, Asia, and North America. The recruitment of anthropologists represents the misguided belief that victory in Iraq and Afghanistan can be achieved through better tactics — if only we could fight smarter, know more about their cultures, and embed anthropologists with the troops, then we would “win”! — rather than realizing that the real lesson of these wars is that wars of invasion and occupation should not be waged at all.

The nation must use this moment to embrace a permanent and fundamental change in our military and foreign policy. We must finally reject a foreign policy of invasion and occupation and embrace a new kind of foreign policy based around non-aggression, diplomacy, international cooperation, and the protection of human needs and human lives as the best way to ensure the security of the country and the world. With members of the military and an engaged citizenry as our partners and allies, anthropologists and other social scientists have a critical role to play in this process.

David Vine

David Vine is assistant professor of anthropology at American University. He is the author of the recently released Island of Shame: The Secret History of the U.S. Military Base on Diego Garcia (Princeton University Press) and co-author of The Counter-Counterinsurgency Manual, or Notes on Demilitarizing American Society (Prickly Paradigm Press, 2009).

Mutual Aid Society

Human beings are the product of a few million years of evolution. Awareness of this is part of what it means to be modern. But most of the time this recognition remains general and vague. We get along just fine without thinking about the scale of the processes involved. We act as if a thousand years is a long time; it can be a strain to imagine the world of a few decades ago. It is hard to reckon just how thin a slice of human time recorded history represents. That we ever developed the capacity to record things at all is strange and improbable. As recently as ten or twelve thousand years ago, our ancestors devoted most of their waking hours to finding enough calories to stay alive.

“To date,” writes Michael Tomasello, co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, “no animal species other than humans has been observed to have cultural behaviors that accumulate modifications and so ratchet up in complexity over time.” At some point that complexity begins to spike -- an exponential surge that comes to seem almost normal. But how is it possible?

In Why We Cooperate, just published by Boston Review Books, Tomasello gives a succinct account of his work with a research team conducting comparative studies of the behavior of human infants and our closest primate relations, especially chimpanzees.

Their findings suggest that we are distinguished, as a species, by capacities for empathy, generosity, cooperation, and a sense of fair play. Some of these tendencies are found among the great apes, but not to anything like the degree to which they manifest themselves in children from very early in their development. These distinctive capacities form the bedrock of our capacity to accumulate, over time, not just wealth but complex behavior.

The new book -- based on the Tanner Lectures delivered by Tomasello at Stanford University in early 2008 -- is a lay reader's introduction to work described in The Cultural Origins of Human Cognition (Harvard University Press, 1999) and Origins of Human Communication (MIT Press, 2008). Peers who comment on his work in the “forum” section of Why We Cooperate sometimes question the degree to which these abilities are hard-wired into us -- rather than being acquired through, or at least stimulated by, nurture and communication. But they concur that Tomasello and his team have opened up fruitful lines of inquiry into the source and nature of human development.

There is evidence, Tomasello writes, “that from around their first birthdays -- when they first begin to walk and talk and become truly cultural beings -- human children are already cooperative and helpful in many, though obviously not all, situations.” Faced with an adult they have never met before, for example, an infant between the ages of 14 and 18 months will help with “everything from fetching out-of-reach objects to opening cabinet drawers when the adult’s hands are full.”

This behavior is not the work of little rational-choice theorists in diapers. Children who were consistently rewarded for their assistance were found to be less helpful in subsequent experiments. Chimpanzees, too, were found to possess some inclination toward altruism. The major distinction on this point is that small children are both better able and more willing to share information in order to be helpful -- for example, by pointing out the location of a stapler whose whereabouts in the laboratory the child knows. While apes are capable of some very limited exchanges with humans, their messages tend to be self-interested (helping convey where a tool is that will be useful in getting them food) and they do not teach each other to communicate.

As they get older, writes Tomasello, the “relatively indiscriminate cooperativeness” of human children “becomes mediated by such influences as their judgments of likely reciprocity and their concern for how others in the group judge them....” This is not a matter of self-interest alone -- of doing unto others just as generously as they do unto you. The knack for moral bookkeeping does develop, of course. But first we acquire a sense that there are general rules for how things ought to be done, and that everyone ought to abide by them.

Three-year-old children were shown a game that could be played by one person. “When a puppet later entered and announced that it, too, would play the game, but then did so in a different way,” reports Tomasello, “most of the children objected, sometimes vociferously. The children’s language when they objected demonstrated clearly that they were not just expressing their personal displeasure at a deviation. They made generic, normative declarations like, ‘It doesn’t work like that,’ ‘One can’t do that,’ and so forth.”

This is intriguing because the children’s perspective is disinterested. “It is one thing to follow a norm ... to avoid the negative consequences of not following it,” says Tomasello, “and it is quite another to legislate the norm when not involved oneself.”

In the case of the puppet experiment, I suppose “enforce” is a more appropriate word than “legislate.” But either way, it suggests the early development of a capacity to grasp the principle of a common and impersonal norm.

This both reflects and reinforces our capacity for cooperative action -- which may be the very thing that distinguished our hominid ancestors from other primates. Groups of small children “engage in all kinds of verbal and nonverbal communication for forming joint goals and attention and for coordinating their various roles in the activity,” says Tomasello, while his colleagues find nothing comparable to this range and complexity of cooperation among the great apes.

"Indeed,” he writes, “I believe that the ecological context within which these skills and motivations developed was a sort of cooperative foraging. Humans were put under some kind of selective pressure to collaborate in their gathering of food -- they became obligatory collaborators -- in a way that their closest primate relatives were not.... We could also speculate that since hunter-gatherer societies tend to be egalitarian, with bullies often ostracized or killed, humans underwent a kind of self-domestication process in which very aggressive and acquisitive individuals were weeded out by the group.”

Thanks to millennia of progress, we have reached a plateau of development where it is commonly accepted that existence is a war of all against all, and Donald Trump is taken to embody the traits that drove human evolution itself. (Otherwise he might look like the missing link with a hairpiece.)

None of this makes for optimism about our next ten thousand years or so -- or decade, for that matter. Hard as it is to wrap one’s mind around the depths of time and transformation involved in reaching this stage of civilization, it can be still more difficult to imagine how we can continue. The scale of possible aggression now -- let alone the unintended consequences of raw acquisitiveness -- goes beyond anything our primate brains are quite ready to picture.

But for all that, the research by Tomasello and his associates is at least somewhat encouraging. It suggests that collaboration, sharing, and even generosity are not late developments in human existence -- merely secondary or superfluous capacities. They are essential. They came first. And they could yet assert themselves as a basis for reorganizing life itself.

Then, perhaps, our prehistory would come to an end -- and something like a civilization worthy of human beings would begin.

Scott McLemee
