Humanities

Commentary on American mass shootings

Only satire can look certain horrible realities in the eye, as The Onion did with its article from last year about a lone-wolf mass shooting of random strangers. Its headline cut to the quick: “‘No Way To Prevent This,’ Says Only Nation Where This Regularly Happens.”

It’s the real American exceptionalism. Rampage shootings do take place in other countries (the 1996 Dunblane school massacre in Scotland, for example), but rarely. They remain distinct events in the public memory, rather than blurring together. In the United States the trauma is repetitive and frequent; only the location and the number of victims seem to change.

With Charleston we have the additional grotesquerie of a presidential candidate calling Dylann Roof’s extremely deliberate act an “accident” while the director of the Federal Bureau of Investigation made a point of denying that it was terrorism. (The shooter was an avowed white supremacist who attacked an African-American church and took care to leave one survivor to tell the tale. By no amount of semantic weaselry can it be described as anything but “[an] act of violence done or threaten[ed] in order to try to influence a public body or citizenry,” to quote the director's own definition of terrorism.) But American rampage shootings do not always express an ideological agenda, or even a motive intelligible to anyone but the gunman. The meaninglessness of the violence, combined with its regularity, is numbing. So with time our scars grow callused, at least until the next spree rips them open again.

A few years ago Christopher Phelps, an intellectual historian who happens to be my closest friend, moved with his family to England, where he is now a senior lecturer (i.e., an associate professor) in American and Canadian studies at the University of Nottingham. At some point the British media began turning to him for commentary on life in these United States. “I tend to be asked on radio shows when there's a need for American expertise -- and implicitly an American accent, which adds an air of authenticity,” he wrote in an email when I asked him about it.

Among the topics he’s been asked about are “the re-election of Obama, the anniversary of JFK's death, and even what comprises the East Wing of the White House, since one only ever hears about the West Wing.” Of late, America’s everyday mayhem keeps coming up. In 2013 he discussed the Trayvon Martin case. Last August, it was the girl whose training in automatic weapons on an Arizona firing range ended when she lost control and sprayed her instructor with bullets. Phelps appeared on a nationally broadcast talk show hosted by Nick Ferrari, which seems like the perfect name for a bigger-than-life radio personality.

Ferrari wasted no time: “What is it with Americans and guns?” he asked. A fair question, though exceedingly blunt.

“I should have anticipated that, I suppose,” Phelps says now, “but I froze like the proverbial deer in the headlights, stuttering away.” Since then, unfortunately, he has gained experience in answering variations of the question. “The producers need people to do it,” he explains, “the university media team work hard to set up the gigs, and you feel as an American you should step in a bit to help modulate the conversation, but it sweeps away my life for a day or two when I have other plans and some psychopath shoots up America.” (The BBC program for which he was interviewed following the Charleston shootings can be found here.)

“It is still depressing,” Phelps continues, “in fact draining, to be put in the position of explaining my people through this kind of event, but reflection has prompted some better ways of answering.”

A one-sentence question about the American pattern of free-range violence takes many more to address at all concretely. Phelps's assessment bears quoting at length:

“While I'm as drawn to generalities as anyone -- I've always thought there was something to H. Rap Brown's declaration that ‘violence is as American as cherry pie’ -- it’s important to realize that most American households do not possess guns, only a third do. So gun owners do not comprise all Americans but a particular demographic, one more white, male and conservative than the general population.

“The shooters in mass killings, likewise, tend to be white men. So we need to explain this sociologically. My shorthand is that white men have lost a supreme status of power and privilege, given a post-’60s culture claiming gender and racial equality as ideals, yet are still socialized in ways that encourage aggressiveness.

“Of course, that mix wouldn't be so dangerous if it weren't easy to amass an arsenal of submachine guns, in effect, to mow people down. Why do restrictions that polls say Americans consider reasonable always get blocked politically, if gun-owning households are a minority? For one thing, the gun manufacturing corporations subsidize a powerful lobby that doubles as a populist association of gun owners. That, combined with a fragmented federalist system of government, a strongly individualist culture and the centrality of a Constitution that seems to inscribe ‘the right to bear arms’ as a sacred right, makes reform very difficult in the United States compared to similarly positioned societies. This suggests the problem is less cultural than political.”                      

Following the massacre of 26 people, most of them children, at Sandy Hook Elementary School in Connecticut in 2012, National Rifle Association executive vice president Wayne LaPierre waited several days before issuing a statement. Whether he meant to let a decent interval pass or just needed time to work up the nerve, his response was to blame our culture of violence on… our culture of violence.

He condemned the American entertainment industry for its constant output of “an ever more toxic mix of reckless behavior and criminal cruelty” in the form of video games, slasher movies and so forth. The American child is exposed to “16,000 murders and 200,000 acts of violence by the time he or she reaches the ripe old age of 18” -- encouraging, if not spontaneously generating, LaPierre said, a veritable army of criminals and insane people, just waiting for unarmed victims to cross their paths. “The only way to stop a monster from killing our kids,” he said, “is to be personally involved and invested in a plan of absolute protection.”

The speech was a marvel of chutzpah and incoherence. But to give him credit, LaPierre’s call for “a plan of absolute protection” had a sort of deluded brilliance to it -- revealing a strain of magical thinking worthy of… well, when you get right down to it, a violent video game. You may live in a society full of people presumably eager to act out their favorite scenes from Natural Born Killers and American Psycho, but having enough firepower will give you “absolute protection.”

On many points, Firmin DeBrabander’s book Do Guns Make Us Free? Democracy and the Armed Society (Yale University Press) converges with the analysis quoted earlier from my discussion with Christopher Phelps. But DeBrabander, an associate professor of philosophy at Maryland Institute College of Art, places special emphasis on the corrupting effect of treating the Second Amendment as the basis for “absolute protection” of civil liberties.

The vision of democracy as something that grows out of the barrel of a gun (or better yet, a stockpile of guns, backed up with a ready supply of armor-piercing bullets) involves an incredibly impoverished understanding of freedom. And it is fed by a paranoid susceptibility to “unmanageable levels of fear,” DeBrabander writes, and “irrationalities that ripple through society.”

He turns to the ancient Stoic philosophers for a more robust and mature understanding of freedom. It is, he writes, “a state of mental resolve, not armed resolve. Coexisting with pervasive threats, Seneca would say, is the human condition. The person who lives with no proximate dangers is the exception. And it’s no sign of freedom to live always at the ready, worried and trigger-happy, against potential threats; this is the opposite of freedom.” It is, on the contrary, “a form of servitude,” and can only encourage tyranny by demagogues.

“Freedom,” DeBrabander goes on to say, “resides in the ability to live and thrive in spite of the dangers that attend our necessarily tenuous social and political existence -- dangers that are less fearsome and debilitating to the extent that we understand and acknowledge them.” It is only one of many good points the author makes. (See also his recent essay “Campus Carry vs. Faculty Rights” for Inside Higher Ed.) And the certainty that another mass shooting will take place somewhere in the United States before much longer means we need all the stoicism we can get.

Ohio's largest community college receives unprecedented gift for humanities program

In an era when the humanities are overlooked or derided by politicians, one Ohio community college landed a $10 million grant to boost the liberal arts.

New book proposes teaching-intensive tenure-track model to address 'real' crisis in the humanities

New book proposes teaching-intensive tenure track to address what it calls the "real" crisis in the humanities.

A competing manifesto on the value of academic conferences (essay)

Last week, the New York Times's “Opinionator” published an essay in which Christy Wampole decried the present state of humanities scholarship by holding up the worst forms of conference behavior to ridicule.

Let’s be honest: all academics have groaned at plenary papers that go over the time limit or at senior colleagues who assume their listeners will fully absorb their arguments even when delivered in a monotone with no attention to rhetorical context. These examples of inconsiderate academics are certainly not the norm, however, just as misbehavior or nastiness is not the norm in any other professional arena.

Wampole herself admits to engaging in the narcissistic habit of answering emails during plenaries and having “listened for the first five minutes of the talk, just long enough to seize upon a word around which [she’ll] construct a pseudoquestion in the Q and A.” She includes herself among those who sometimes give a paper and then spend the rest of the conference at the pool bar.

To suggest as she does, however, that we should judge the quality or the future of the humanities by these unfortunate instances of a professional lack of grace is irresponsible. It is judging a profession by its lowest common denominator, and it obscures the good, important exchange of ideas and generation of knowledge that occurs at academic conferences year in and year out, throughout most academic careers.

It also feeds the worst stereotypes about academics that subsequently become fuel for political agendas across the country seeking to defund education at the great expense of America’s future.

Of late, public critiques of the humanities have taken the explicit form of assertions that the disciplines have no practical value or contemporary relevance in a technological world. Implicitly, such critiques also manifest in persistent funding cuts to arts programs, in calls for exclusively STEM-based initiatives to improve our educational system, in claims about the unemployability of humanities graduates, even in assertions by some defenders of the disciplines that humanities knowledge is primarily good for business, economics or public-policy makers -- which imply that such knowledge and experience has no value if it cannot be turned to moneymaking.

But in the last few years, there has also been what feels like an exponential increase in those willing to engage in national conversations that ask, and attempt to answer, tough questions about these issues. Academics and nonacademics alike have filled the pages of The New York Times, Inside Higher Ed, The Chronicle of Higher Education, The Atlantic and scores of other outlets with meditations on the costs of humanities study in higher education, on who is served and who is left out, and on the role of the humanities in shaping young minds or good citizens or brilliant scientists or desirable employees.

We are encouraged by this general willingness to engage in these tough intellectual conversations. At the same time, we are disheartened by the propensity of so many, both within higher education and outside of it, to rely upon the dismissive premise that academics largely exist in a secluded world in which they care only about their own infinitesimal research interests, which are esoteric at best, incomprehensible and a waste of taxpayers’ money at worst.

This is not to suggest we should not all, as professionals, always strive to make our practices better, to keep pace with the times, to question our own assumptions and habits, to identify honestly what is not working, and to change it where we can. However, it is to suggest that perhaps a better model for doing so is one that is based on the notion that academics -- as teachers, researchers, mentors and institutional colleagues -- go into their chosen profession with the desire to advance knowledge through collaborative means.

Contrary to the misrepresentation of academic conferences as attended only by dreary caricatures of the out-of-touch professor rambling on about irrelevant ideas, most conferences we attend are places where we try out ideas among our colleagues, launch collaborations, consider the pedagogical and public import of our findings, mentor graduate students, and participate in the transformations of our fields in ways that make us better teachers and better researchers.

Many of us value conferences for both private and professional reasons, as David M. Perry points out in his May 6 Chronicle of Higher Education response to Wampole’s essay, and as Devoney Looser has recently enumerated in her Chronicle guide to conference etiquette. We, like both of them, encourage thinking about conferences as an important means of entry into our disciplinary communities.

Conferences help to provide what many faculty cannot find at their home institutions: a community of minds focused on a particular issue. For faculty members outside the Ivy League or the best-funded public universities, inviting speakers to campus who can give lectures and seminars on the latest research ideas or programmatic innovations is not a given, nor is access to a world-class research library. This is especially true given many states’ perilous hollowing out of financial support for public colleges and universities.

As faculty numbers continue to shrink, academics often find themselves a party of one in their departments, working as the sole representative of a particular field, without immediate access to colleagues in their fields of expertise. Done well, an academic conference offers a chance for collegial dialogue of the sort that can lead to tangible progress. When faculty members attend conferences, students and their institutions also directly reap the benefits.

Conferences can be particularly important for scholars of color and others who find themselves disenfranchised by administrations and by institutionalized injustices on their own campuses. Although we recognize that unfortunately many conferences have a long way to go to truly support marginalized academic communities, we are encouraged by those we have seen working explicitly to foster this kind of inclusivity.

For many faculty members struggling with the isolation of being seen as a “representative” member of an underrepresented group, conference networking can be a crucial path for figuring out how to navigate their own institutions, for dealing with the microaggressions of students, administrators and other faculty, and for coping with the additional and unique responsibilities they often face alone of mentoring minority student populations or administering programs. Conferences also have the potential to be sites for the birth of activism, where communities both formal and informal unite to make changes in how things are done, how people are treated and how certain ideas are valued.

Conferences are, in other words, even more important for those not privileged by mainstream academic cultures than they are for the elites. A researcher at Princeton has regular access to communities of scholarship that most conference attendees cannot find at their home institutions. Perhaps most depressingly, such intellectual communities are often nonexistent for the contingent faculty who are rarely fully integrated members of their own departments and who, despite being engaged in rigorous research, cannot attend conferences even when they want to because their institutions do not support the professional development of these integral members of their communities.

Wampole submits that “conferences feel necessary, but their purpose is unclear.” While the exact form that conference collaborations take might usefully be retooled, their purpose in supporting innovations in research, teaching, administration or activism could not be more clear. The process of making a productive contribution to research depends upon knowing what people already know, and this is significantly aided by the feedback of other scholars working on similar or related questions. Even as we acknowledge the legitimate problem of the environmental impact of that much travel, we don’t think anything can fully substitute for the intellectual experience of hearing a good plenary talk followed by a vigorous debate that is the catalyst for deeper conversations throughout the conference. Published scholarship is essential, but it takes time to develop, and face-to-face conversations and the accountability conferences provide are a great way to incubate ideas that are just being formed.

Could conferences be better? Of course they could, but they are organized and run by groups of committed faculty members or the staff they have hired to help them, who do their best despite inevitable budget constraints and competing time demands. Instead of focusing on the problem of boredom, how about addressing truly meaningful problems, like the economic barriers to participation for graduate students and less financially privileged researchers, or lessening the impact of mass travel on the environment, or the lack of child care resources, or the way such conferences are misrepresented in the anti-intellectual popular media?

Here is the bottom line: conferences are created by the faculty they serve. They are not merely events where we put ourselves on display or where we criticize from an outside position -- they are collaborative ventures. Faculty researchers do not just attend their conferences; they own them. And so, we offer the following countermanifesto.

A Conference Manifesto for the Rest of Us:

(1) We will consider the quality of the conferences we attend as our own responsibility. If we are unhappy with the structure, we will contact the organizing committee or form a coalition to initiate changes to the obstacles that limit the conference’s success. (We know of one such coalition currently forming in response to a lack of female presenters at a major conference, and this is not an isolated example.)

(2) We will strive to be precise and productive. We will offer meaningful rather than petty critiques, strive not to generalize from extreme examples and, as much as possible, focus on useful alternatives rather than finger pointing.

(3) If we are not in a position of power, or we feel too disaffected to contribute to positive solutions at a structural level, then we will be the change we seek in our individual interactions. If a scholar presents a paper in which the larger purpose is not clear, we will ask him about that purpose during the Q and A. If it is clear that a speaker is having trouble articulating an argument, we will help her see what it is. We will attend as many events as we can, offer real feedback and participate in real discussions. Put more simply, we will continue to be generous.

(4) We will acknowledge academic generosity where we find it, namely:

  • in the organizers who laboriously put together meeting programs, speakers and events to foster collaborative dialogue and the exchange of ideas;
  • in the keynote speakers, senior colleagues and established scholars who routinely engage more junior members of the profession in meaningful conversations;
  • in the conference-goers who ask thoughtful questions;
  • in the professors who mentor students;
  • and, institutionally beyond the world of conferences, in the faculty who work to improve conditions on their campuses, in the anonymous reviewers who provide constructive feedback on essays and in the adjuncts who spend endless unremunerated hours facilitating learning.

(5) We will be humble. We will recognize that although humanists are excellent at being critical, we are fortunate to have these communities to help us improve our research.

(6) We will attempt always to get over ourselves. Our presentations may be great, but they aren’t perfect.

(7) And finally, we will be aware. We will continue to think carefully about how we use the resources invested in us as scholars. It appears to us that the humanities are at least beginning to be recognized as having both intrinsic and extrinsic values, and it is up to us to communicate those values to people who doubt both, rather than to reinforce stereotypes through exclusionary rhetoric or condescension. We posit that there is real value in the thoughtful public intellectual, and we will work to be scholars who are willing to ask hard questions about our own work, to engage in thorny debates about priorities, to radically reimagine what higher education might look like in the 21st century and to challenge the parameters or privileges of our own positions. We will make sure that we can clearly show why our work matters, because no matter how frustrating conferences can be, they are places where humanities scholarship does some of its most important work.

Cora Fox is an associate professor of English and associate director of the Institute for Humanities Research at Arizona State University. Andrea Kaston Tange is a professor of English at Eastern Michigan University and editor of the Journal of Narrative Theory. Rebecca A. Walsh is an assistant professor of English at North Carolina State University and co-chair of The H. D. International Society.

Scholar discusses his book on the creation of the research university and disciplines

Author of new book on the creation of the research university discusses the role of disciplines and information overload -- from the 18th century to the rise of MOOCs.

Essay criticizes studies that claim to show Shakespeare is ignored by English departments

Were it so… that some little profit might be reaped (which God knows is very little) out of some of our playbooks, the benefit thereof will nothing near countervail the harm that the scandal will bring unto the library, when it shall be given out that we stuff it full of baggage [i.e., trashy] books.

-- Sir Thomas Bodley, founder of the University of Oxford’s Bodleian Library, explaining why he did not wish to keep English plays in his library (1612).

On William Shakespeare’s birthday this year, the American Council of Trustees and Alumni (ACTA) issued a report, “The Unkindest Cut: Shakespeare in Exile in 2015,” which warned that “less than 8 percent of the nation’s top universities require English majors to take even a single course that focuses on Shakespeare.” Warnings about the decline of a traditional literary canon are familiar from conservative academic organizations such as ACTA and the National Association of Scholars. What increasingly strikes me, however, is how frozen in amber these warnings are.

In a nation obsessed with career-specific and STEM education, there is scant support for the humanities in general. Where are the conservative voices advocating for the place of English and the humanities in the university curriculum? One would think this advocacy natural for such academics and their allies. After all, when Matthew Arnold celebrated the “best that has been thought and known,” he was proposing cultural study not only as an antidote to political radicalism but also to a life reduced, by the people he called philistines, to industrial production and the consumption of goods.

We have our modern philistines. Where are our modern conservative voices to call them out? Instead, on the shrinking support for the liberal arts in American education -- the most significant issue facing the humanities -- organizations such as ACTA and NAS mistake a parochial struggle over particular authors and curricula for the full-throated defense of the humanities.

Worse, these organizations suggest that if one does not study Shakespeare or a small set of other writers in the traditional literary canon (moreover, in only certain ways), then literature and culture are not worth studying -- hardly a way to advocate for literary studies.

The requirements at my own institution suggest how misleading the ACTA position is, and how thin a commitment to the humanities it represents. With no Shakespeare requirement in the George Mason University English department, it is true that some of our majors won’t study Shakespeare. However, because our majors must take a course in pre-1800 literature -- nearly all the departments ACTA examined have a similar requirement -- that means they’ll study Chaucer, or medieval intellectual history, or Wyatt, Sidney, Donne, Jonson, Milton, etc. (The study of Spenser, however, appears to me somewhat in decline; ACTA, if you want to take up the cause of The Faerie Queene, let me know.)

How can writers as great as these be off ACTA’s map? Is it because ACTA doesn’t really value them? Its Bardolatry is idolatry -- the worship of the playwright as wooden sign rather than living being, a Shakespeare to scold with, but no devotion to the rich literary and cultural worlds of which Shakespeare was a part. Hence, too, the report maintains that a course such as Renaissance Sexualities is no substitute for what it calls the “seminal study of Shakespeare” -- though certainly such a course might feature the Renaissance sonnet tradition, including Shakespeare’s important contribution to it, not to mention characters from Shakespeare’s plays such as Romeo and Juliet or Rosalind and Ganymede.

ACTA also warns that rather than Shakespeare, English departments are “often encouraging instead trendy courses on popular culture.” This warning similarly indicates the narrowness of ACTA’s commitment to literary study. As anyone who’s ever taken a Shakespeare course should know, not only were Shakespeare’s plays popular culture in his own day (English plays were scandalous trash, thought Thomas Bodley), but also the very richness of Shakespeare’s literary achievement comes from his own embrace of multiple forms of culture. His sources are not just high-end Latin authors but also translations of pulpy Italian “novels,” English popular writers, folktales, histories and travelogues, among others. The plays remain vibrant today because Shakespeare allows all these sources to live and talk to one another.

Indeed, the literary scholars William Kerrigan and Gordon Braden point out that in this quality Shakespeare was typical of his age, for the vibrancy of the Renaissance derives in part from its hybridity. The classical was a point of departure, but neither Shakespeare nor Renaissance culture was slavishly neoclassical. Modern English departments, in their embrace of multiple literary cultures, in their serious study of our human expression, evince the same spirit. 

Conservatives have suggested that the hybridity of the modern English major is responsible for declining interest in the major. That claim cannot be proved. Anecdotes and intuitions are insufficient to do so. Data on trends in the number of majors over time can only show correlation, not causation.

And in terms of correlation, here are four more likely drivers of the decline in the percentage of students majoring in English: students are worried about finding jobs and are being told (wrongly, according to the actual statistics) that the English major is not a path to one; students now have many new majors to choose from, many no longer in the liberal arts; English has traditionally had more female than male majors, and women now pursue majors, such as in business or STEM fields, from which they used to be discouraged (a good change); political leaders have abandoned the liberal arts in favor of STEM and career-specific education and are advising students to do the same (even President Obama jumped on this bandwagon, though he later apologized).

Regarding this last cause, the voices of organizations such as ACTA and NAS could particularly help, since many of these politicians are conservatives, and leaders of these academic organizations have ties to conservative political circles. In doing so, conservatives could help reclaim a legacy. In 1982, William Bennett, as chair of the National Endowment for the Humanities, urged colleges to support the humanities against “more career-oriented things.” By 1995, Bennett had become disgusted with what he saw as an overly progressive agenda in the humanities. Picking up his marbles and going home, Bennett urged Congress to defund the NEH. More recently, Bennett agreed with North Carolina Governor Pat McCrory that the goal of publicly funded education should be to get students jobs. “How many Ph.D.s in philosophy do I need to subsidize?” Bennett asked.

Shakespeare was generous in his reading and thinking. We can be, too. Literary scholars may disagree on many things -- on the values to be derived from a particular literary work, on the ways it ought to be framed, on which literary works are most worthy of classroom reading. But such disagreements are just part of the study of the humanities in a democratic society. When we support the humanities, we support an important public space to have these disagreements. We also support Shakespeare -- who really isn’t going away from the English curriculum -- and the study of literature more generally.

The ACTA study, as far as I can tell, was mainly met with silence. That’s because the study is a rehash of an earlier one from 2007, itself a rehash of the culture wars of the 1980s and ’90s. No one cared, because most people have moved on from the culture wars, and for many of our political leaders, culture itself doesn’t much matter anymore. Culture wars have become a war on culture. In that battle, all lovers of literature should be on the same side. Advocating for the humanities, even as we argue about them, is walking and chewing gum. We should be able to do both at the same time. I appeal to conservative academic organizations that we need to. The one-sided emphasis on majors that lead directly to careers and the blanket advocacy of STEM fields are far greater threats to the humanities than sustainability studies. And without the humanities, there is no institutionalized study of Toni Morrison. Or pulp fiction. Or Sidney. Or Shakespeare.

Robert Matz is professor of English, with a focus on English Renaissance literature, at George Mason University. He currently serves as senior associate dean for George Mason’s College of Humanities and Social Sciences.

Essay on a long academic job search

A year after Patrick Iber's story of rejection captured so much attention, he offers an update.

Report offers a mixed picture of state of the humanities in 2015

Annual report on the disciplines acknowledges cuts and challenges, but also sees signs of hope and growth.

Anna Deavere Smith delivers NEH's Jefferson Lecture

Anna Deavere Smith, actress and playwright, argues in Jefferson Lecture that humanities are key to a better understanding of what it is to be American.

Review of Richard H. Davis, "The Bhagavad Gita: A Biography"

Lost among my books, probably in a box somewhere, is a paperback copy of Bhagavad Gita As It Is, offered to me at a reasonable price by a smiling Hare Krishna devotee working the crowd in Union Square. The word “smiling” is probably redundant. What the group advertises is bliss, and it would be a pretty shoddy product if it broke down under the pressure of New Yorkers’ indifference.

I bought it -- the book, anyway -- but hadn’t noticed it going AWOL until reading Richard H. Davis’s The 'Bhagavad Gita': A Biography, a volume in Princeton University Press’s rewarding Lives of Great Religious Books series. Davis, a professor of religion at Bard College, mentions that A. C. Bhaktivedanta, “a vigorous 70-year-old Bengali,” arrived in the United States in 1965 and in short order was teaching and chanting among the protohippies in Greenwich Village. Swami Prabhupada, as he came to be known, published his own heavily annotated edition of the Gita in 1968 -- the one you can still get from his robed and shaved-headed acolytes now, 50 years after he began proselytizing.

The swami went on to his reward in 1977. The International Society for Krishna Consciousness he founded can now claim, semiplausibly, to have put out more than 20 million copies of Bhagavad Gita As It Is in some 56 languages. It is a sign of Davis’s accomplishment with his “biography” that he leaves the reader aware of how small a blip those missionary efforts are in the context of the Gita’s history -- let alone on the scripture’s own cosmic scale.

As sacred texts go, the Bhagavad Gita (“song of the Lord”) is notable for both its brevity and the relatively straightforward relationship between doctrine and narrative. It has a plot. The setting is ancient India, shortly before a war that will leave more than a million dead. Arjuna, a warrior by birth, surveys the two armies poised for battle and, turning to his charioteer, Krishna, expresses overwhelming despair at the pointlessness of the fratricidal warfare about to begin.

Krishna first counsels a kind of stoic attitude toward the performance of duty: the lot of the warrior is to fight, but without attachment, to fulfill destiny without desire or fear as to its outcome. It is attachment, the corruption of action by the passions, that keeps someone bound to the cycle of rebirth.

Then Krishna reveals that he is not just a god moonlighting as chariot driver but the Supreme Being ne plus ultra, something beyond all human imagination or understanding: “Arjuna sees Krishna’s arms and eyes, bellies and mouths, stretching out in all directions. He sees all the gods contained within Krishna’s vast body.” The vision can only be called mind melting as Krishna speaks the words that Robert Oppenheimer recalled while witnessing the first atomic explosion:

If the radiance of a thousand suns
Were to burst at once into the sky,
That would be like the Splendor
Of the Mighty One…

I am become Death
The shatterer of worlds.

Returning to human form, Krishna makes what is in some ways the most powerful revelation of all. Love and devotion are Krishna’s due, and Arjuna is prepared to give them. But the relationship is not one-way. Krishna expresses his love for Arjuna and promises to be the warrior’s ultimate refuge: “I will liberate you from all sins. Do not grieve.”

With that, Arjuna’s doubts and hesitation are put to rest, and the battle is joined.

The dialogue appears as a philosophical interlude in The Mahabharata, an epic poem of prodigious scale. It is unclear which came first -- the dialogue may have been composed as part of the larger work and then extracted, or it could be a freestanding text that some ancient editor spliced in. “Some observers,” Davis notes, “have pointed to the unlikelihood, or the ‘dramatic absurdity,’ as one noted Indologist put it, of great masses of zealous warriors sitting idly by for ninety minutes while a soldier and his charioteer chat in the no-man’s land.”

As an aesthetic objection, that seems fair enough. The situation doesn’t work as a realistic segment in a chronicle of war. (I can’t say, having never read The Mahabharata, or even met anyone who has.) But its “dramatic absurdity” nonetheless works in expressing the mood of terrible existential pain, the moment of facing life or death and feeling overwhelmed by the reality right in front of you. That quality makes the Gita a powerful work even for readers incapable of regarding surrender to Krishna as what William James called “a live option.”

For medieval Indian poets, artists and sages, the conversation between Arjuna and Krishna resonated with ideas and debates of long standing; they read it as a work concentrating and clarifying doctrines expressed rather more obliquely in the Vedas, a much older set of scriptures. The Bhagavad Gita’s depiction of Krishna also put pressure on the devotees of other gods to produce revelatory works of their own. “These gitas,” Davis writes, “always involve discourses conveyed from deities to listeners that constitute authoritative instruction on the fundamental nature of the world along with guidance for effective human conduct leading to worldly benefits and ultimately liberation.”

Infomercials of the gods! Still, it was Krishna’s gita that became the Gita -- a text widely, if dubiously, regarded as “the Hindu bible.” Its ascension was no sure thing. In two absorbing chapters, Davis traces a series of stages leading from the first English translation in 1785 (a byproduct of British imperial interests) to widespread fascination among the literati (Thoreau took it to Walden Pond, Whitman died with it under his pillow) to a kind of rebirth as an element of Indian national identity, in part through Gandhi’s reading of Edwin Arnold’s The Song Celestial, which put the Gita into English, and iambic pentameter to boot.

Davis notes that only a very small share of early iconography of Krishna shows him in scenes from the Bhagavad Gita. More commonly depicted were legends of his mischievous childhood or his role as combative young prince. Treating the Gita as the Hindu equivalent of the Judeo-Christian scriptures probably revealed more about British Protestant sensibilities than it did about Indian religion.

But it proved to be a productive sort of confusion: with so many questions about the Bible they knew troubling the minds of Westerners, the new scripture from the East proved timely. Davis says just a little about the broad similarity between Krishna and Christ (each understood as a human incarnation of the divine, with a message of love) but clearly it was on the minds of some enthusiasts even before gurus started making trips to Europe and America.

There’s so much else to say about The 'Bhagavad Gita': A Biography -- but my karma depends upon meeting a deadline, so not today. Princeton’s Lives of Great Religious Books series continues to offer interesting titles (up soon: The Book of Revelation) and is the rare instance of a series with a concept that really works.
