The Mood Is the Message

Certain research topics seem destined to inspire the question, “Seriously, you study that?” So it is with the field of Twitter scholarship. Which -- just to get this out of the way -- is not actually published in 140 characters or less. (The average “tweet” is the equivalent of two fairly terse sentences. It is like haiku, only more self-involved.)

The Library of Congress announced in April that it was acquiring the complete digital archives of the “microblogging” service, beginning with the very first tweet, from ancient times. At present, the Twitter archive consists of 5 terabytes of data. If all of the printed holdings of the LC were digitized, they would come to 10 to 20 terabytes (this figure does not include manuscripts, photographs, films, or audio recordings).

Some 50 million new messages are sent on Twitter each day, although one recent discussion at the LC suggested that the rate is much higher -- at least when the site is not shutting down from sheer traffic volume, which seems to be happening a lot lately. A new video on YouTube shows a few seconds from the "garden hose" of incoming Twitter content.

When word of this acquisition was posted to the Library of Congress news blog two months ago, it elicited comments from people who could not believe that anything so casual and hyper-ephemeral as the average tweet was worth preserving for posterity -- let alone analyzing. Thanks to the Twitter archive, historians will know that someone ate a sandwich. Why would they care?

Other citizens became agitated at the thought that “private” communications posted to Twitter were being stored and made available to a vast public. Which really does seem rather unclear on the concept. I’m as prone to dire mutterings about the panopticon as anybody -- but come on, folks. The era of digital media reinforces the basic principle that privacy is at least in part a matter of impulse control. Keeping something to yourself is not compatible with posting it to a public forum. Evidently this is not as obvious as it should be. Things you send directly to friends on Twitter won't be part of the Library's holdings, but if you celebrated a hook-up by announcing it to all and sundry, it now belongs to the ages.

A working group of librarians is figuring out how to “process” this material (to adapt the lingo we used when I worked as an archival technician in the Library's manuscript division) before making the collection available to researchers. But it’s not as if scholars have been waiting around until the collection is ready. Public response to the notion of “Twitter studies” might be incredulous, but the existing literature gives you some idea of what can be done with this giant pulsing mass of random discursive particles.

A reading of the scholarship suggests that individual tweets, as such, are not the focus of very much attention. I suppose the presidential papers of Barack Obama will one day include an annotated edition of postings to his Twitter feed. But that is the exception and not the rule.

Instead, the research, so far, tends to fall into two broad categories. One body focuses on the properties of Twitter as a medium. (Or, what amounts to a variation on the same thing, as one part of an emerging new-media ecosystem.) The other approach involves analyzing gigantic masses of Twitter data to find evidence concerning public opinion or mood.

Before giving a thumbnail account of some of this work – which, as the bibliography I’ve consulted suggests, seems intrinsically interdisciplinary – it may be worth pointing out something mildly paradoxical: the very qualities that make Twitter seem unworthy of study are precisely what render it potentially quite interesting. The spontaneity and impulsiveness of expression it encourages, and the fact that millions of people use it to communicate in ways that often blur the distinction between public and private space, mean that Twitter has generated an almost real-time documentary record of ordinary existence over the past four years.

There may be some value to developing tools for understanding ordinary existence. It is, after all, where we spend most of our time.

Twitter shares properties found in numerous other new-media formats. The term “information stream” is sometimes used to characterize digital communication, of whatever sort. Inside Higher Ed “flows” at the rate of a certain number of articles per day during the workweek. An online scholarly journal, by contrast, will probably trickle. A television network’s website -- or the more manic sort of Twitter feed -- will tend to gush. But the “streaming” principle is the same in any case, and you never step into the same river twice.

A recent paper by Mor Naaman and others from the School of Communication and Information at Rutgers University uses a significant variation on this concept, the “social awareness stream,” to label Twitter and Facebook, among other formats. Social awareness streams, according to Naaman et al., “are typified by three factors distinguishing them from other communication: a) the public (or personal-public) nature of the communication and conversation; b) the brevity of posted content; and, c) a highly connected social space, where most of the information consumption is enabled and driven by articulated online contact networks.”

Understanding those “articulated online contact networks” involves, for one thing, mapping them. And such mapping efforts have been underway since well before Twitter came on the scene. What makes the Twitter “stream” particularly interesting is that – unlike Facebook and other social-network services -- the design of the service permits both reciprocal connections (person A “follows” person B, and vice versa) and one-sided (A follows B, but that’s it). This makes for both strong and weak communicative bonds within networks -- but also among them. And various conventions have emerged to allow Twitter users to signal one another or to urge attention to a particular topic or comment. Besides “retweeting” someone’s message, you can address a particular person (using the @ symbol, like so: @JohnDoe) or index a message by topic (noted with the hashtag, thusly: #topicdujour).

All of this is, of course, familiar enough to anyone who uses Twitter. But it has important implications for just what kind of communication system Twitter fosters. To quote the title of an impressive paper by Haewoon Kwak and three other researchers from the department of computer science at the Korea Advanced Institute of Science and Technology: “What is Twitter, a Social Network or a News Media?” (No sense protesting that “media” is not a singular noun. Best to grind one’s teeth quietly.)

Analyzing almost 42 million user profiles and 106 million tweets, Kwak and colleagues find that Twitter occupies a strange niche that combines elements of both mass media and homophilous social groups. (Homophily is defined as the tendency of people to sustain more contact with those they judge to be similar to themselves than with those they perceive to be dissimilar.) "Twitter shows a low level of reciprocity," they write. "77.9 percent of user pairs with any link between them are connected one-way, and only 22.1 percent have reciprocal relationships between them.... Previous studies have reported much higher reciprocity on other social networking services: 68 percent on Flickr and 84 percent on Yahoo."
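
(For the curious, here is a minimal Python sketch -- an illustration only, not the procedure Kwak's team actually used -- of how reciprocity can be computed over a directed "follows" graph. The user names and edges are invented.)

```python
# Toy example: what share of connected user pairs follow each other both ways?
def reciprocity(edges):
    """Fraction of connected user pairs whose link runs in both directions."""
    edge_set = {e for e in edges if e[0] != e[1]}   # drop self-follows
    pairs = {frozenset(e) for e in edge_set}        # count each connected pair once
    mutual = 0
    for pair in pairs:
        a, b = tuple(pair)
        if (a, b) in edge_set and (b, a) in edge_set:
            mutual += 1
    return mutual / len(pairs) if pairs else 0.0

follows = [("alice", "bob"), ("bob", "alice"),      # a reciprocal friendship
           ("carol", "cnn"), ("dave", "cnn")]       # one-way, media-style links
print(f"reciprocal share: {reciprocity(follows):.1%}")  # 33.3% here; about 22% in the study
```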

In part, this reflects the presence on Twitter of already established mass-media outlets -- not to mention already-famous people who have millions of “followers” without reciprocating. But the researchers find that informal but efficient “retweet trees” also function “as communication channels of information diffusion.” Interest in a given Twitter post can rapidly spread across otherwise disconnected social networks. Kwak’s team found that any retweeted item would “reach an average of 1,000 users no matter what the number of followers is of the original tweet. Once retweeted, a tweet gets retweeted [again] almost instantly on the second, third, and fourth hops away from the source, signifying fast diffusion of information after the first retweet.”

Eventually someone will synthesize these and other analyses of Twitter’s functioning -- along with studies of other institutional and mass-media networks -- and give us some way to understand this post-McLuhanesque cultural system. In the meantime, research is being done on how to use the constant landslide of Twitter messages to gauge public attitudes and mood.

As Brendan O’Connor and his co-authors from Carnegie Mellon University note in a paper published last month, the usual method of conducting a public-opinion poll by telephone can cost tens of thousands of dollars. (Besides, lots of us hang up immediately on the suspicion that it will turn into a telemarketing call.)

Using one billion Twitter messages from 2008 and ’09 as a database, O’Connor and colleagues ran searches for keywords related to politics and the economy, then generated a “sentiment score” based on lists of 1,600 “positive” and 1,200 “negative” words. They then compared these “text sentiment” findings to the results of more traditional public opinion polls concerning consumer confidence, the election of 2008, and the new president’s approval ratings. They found sufficiently strong correlation to be encouraging -- and noted that by the summer of 2009, when many more people were on Twitter than had been the case in 2008, the text-sentiment results proved a good predictor of consumer confidence levels.
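
(To make the method concrete, here is a small Python sketch of lexicon-based sentiment scoring in the spirit of what O’Connor and colleagues describe. The word lists and sample tweets are tiny invented stand-ins, not the actual 1,600-word and 1,200-word lexicons the researchers used.)

```python
# Simplified "text sentiment" scoring: count positive vs. negative words
# in messages that mention a given topic keyword. Lists and tweets are invented.

POSITIVE = {"good", "great", "hope", "confident", "win"}
NEGATIVE = {"bad", "worse", "fear", "lose", "broke"}

def sentiment_ratio(tweets, topic):
    """Ratio of positive to negative word counts in tweets mentioning a topic."""
    pos = neg = 0
    for text in tweets:
        words = text.lower().split()
        if topic not in words:
            continue
        pos += sum(word in POSITIVE for word in words)
        neg += sum(word in NEGATIVE for word in words)
    return pos / neg if neg else float("inf")

sample = ["Feeling confident about the economy today",
          "The economy looks bad and I fear it will get worse"]
print(sentiment_ratio(sample, "economy"))   # > 1 leans positive, < 1 leans negative
```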

A different methodology was used in “Modeling Public Mood and Emotion: Twitter Sentiment and Socio-Economic Phenomena” by Johan Bollen of Indiana University and two other authors. They collected all public tweets from August 1 to December 20, 2008, and harvested from them data about the content that could be plugged into “a well-established psychometric instrument, the Profile of Mood States,” which “measures six individual dimensions of mood, namely Tension, Depression, Anger, Vigor, Fatigue, and Confusion.” This sounds like something from one of Woody Allen’s better movies.

The data crunching yielded “a six dimensional mood vector” covering the months in question. Which, as luck would have it, coincided with both the financial meltdown and the presidential election of 2008. The resulting graphs are intriguing.

Following the election, the negative moods (Tension, Depression, etc.) fell off. There was “a significant spike in Vigor.” Examination of samples of Twitter traffic showed “a preponderance of tweets expressing high levels of energy and positive sentiments.”

But by December 2008, as the Dow Jones Industrial Average fell below 9,000 points, the charts show a conspicuous rise in Anger -- and an even stronger one for Depression. The researchers write that this may have been an early signal of “what appears to be a populist movement in opposition to the new Obama administration.”

“Tweets may be regarded,” write Bollen and colleagues, “as microscopic instantiations of mood.” And they speculate that the microblogging system may do more than reflect shifts of public temper: “The social network of Twitter may highly affect the dynamics of public sentiment…[O]ur results are suggestive of escalating bursts of mood activity, suggesting that sentiment spreads across network ties.”

As good a reason as any to put this archive of the everyday into the time capsule. And while my perspective on this may be a little off-center, I think it is fair that the Twitter record should be stored at the Library of Congress, which also houses the papers of the American presidents up through Theodore Roosevelt.

Almost 20 years ago, I started working there, just around the corner from the bound volumes containing, among other things, the diaries of George Washington. The experience of taking a quick look at them was something like a rite of passage for people working in the manuscript division. And to judge by later conversations among colleagues, the experience was usually slightly bewildering.

You would open the volume and gaze at the very page where his hand had guided the quill. You would start to read, expecting deep thoughts, or historical-seeming ones, at any rate. And this, more or less, is what you found on every page:

"Rained today. Three goats died. Need to buy a new plow.”

He had another 85 characters to spare.

P.S. Follow me on Twitter here, and keep up with news on scholarly publishing here.

Author: Scott McLemee
Author's email: scott.mclemee@insidehighered.com

After the Postsecular

Call it a revival, of sorts. In recent years, anyone interested in contemporary European philosophy has noticed a tendency variously called the religious or theological "turn" (adapting a formulation previously used to describe the "linguistic turn" of the 1960s and '70s). Thinkers have revisited scriptural texts, for example, or traced the logic of seemingly secular concepts, such as political sovereignty, back to their moorings in theology. The list of figures involved would include Emmanuel Levinas, Jacques Derrida, Gianni Vattimo, Alain Badiou, Giorgio Agamben, Slavoj Žižek, and Jürgen Habermas -- to give a list no longer or more heterogeneous than that.

A sampling of recent work done in the wake of this turn can be found in After the Postsecular and the Postmodern: New Essays in Continental Philosophy of Religion, a collection just issued by Cambridge Scholars Publishing. One of the editors, Anthony Paul Smith, is a Ph.D. candidate at the University of Nottingham and also a research fellow at the Institute for Nature and Culture at DePaul University. The other, Daniel Whistler, is a tutor at the University of Oxford, where he just submitted a dissertation on F.W.J. Schelling's theology of language. I interviewed them about their book by e-mail. A transcript of the discussion follows.

Q: Let’s start with one word in your title -- "postsecular." What do you mean by this? People used to spend an awful lot of energy trying to determine just when modernity ended and postmodernity began. Does “postsecularity” imply any periodization?

APS: In the book we talk about the postsecular event, an obvious nod to the philosophy of Alain Badiou. For a long time in Europe and through its colonial activities our frame of discourse, the way we understood the relationship of politics and religion, was determined by the notion that there is a split between public politics and private religion. This frame of reference broke down. We can locate that break, for the sake of simplicity, in the anti-colonial struggles of the latter half of the 20th century. The most famous example is, of course, the initial thrust of the Iranian Revolution.

It took some time before the implications of this were thought through, and it is difficult to pin down when “postsecularity” came to prominence in the academy, but in the 1990s a number of Christian theologians like John Milbank and Stanley Hauerwas, along with non-Christian thinkers like Talal Asad, began to question the typical assumption of philosophy of religion: that religious traditions and religious discourses need to be mediated through a neutral secular discourse in order to make sense. Their critique was simple: the secular is not neutral. Philosophy is intrinsically biased towards the secular. If you follow people like Asad and Tomoko Masuzawa, this means it is biased toward a Christian conception of the secular, and this hinders it from appreciating the thought structures at work in particular religions.

One of the reasons the title of the book reads “after the postsecular” is that we felt philosophy of religion had yet to take the postsecular event seriously enough; it was ignoring the intellectual importance of this political event and still clinging to old paradigms for philosophizing about religion, when they had in fact been put into question by the above critique. So, the question is: What does philosophy of religion do now, after the postsecular critique?

DJW: There are two other reasons we speak of this volume being situated after the postsecular. First, in our “Introduction” we distinguish between a genuine postsecular critique of the kind Anthony mentions and a problematic theological appropriation of this critique. The former results in a pluralization of discourses about religion, because the secular is no longer the overarching master-narrative, but one more particular tradition. The latter, however, has tried to replace the secular master-narrative with a Christian one, and so has perversely impeded this process of pluralization.

Yet it is precisely this theological move (exemplified by Radical Orthodoxy) which is more often than not associated with the postsecular. Thus, one of the aims of the volume is to move beyond (hence, “after”) this theological appropriation of the postsecular.

Second, we also conjecture in the Introduction that postsecularity has ended up throwing the baby out with the bathwater – that is, everything from the secular tradition, even what is still valuable. So, in Part One of the volume, especially, the contributors return to the modern, secular tradition to test what is of value in it and what can be reappropriated for contemporary philosophy of religion. In this sense, "after the postsecular" means a mediated return to the secular.

Q: You mentioned Radical Orthodoxy, of which the leader is John Milbank. His rereading of the history of European philosophy and social theory tries to claim a central place for Christian theology as "queen of the sciences." As an agnostic, I tend to think of this as sort of the intellectual equivalent of the Society for Creative Anachronism. But clearly it's been an agenda-setting program in some sectors of theology and philosophy of religion. In counterposing your notion of the postsecular to Radical Orthodoxy, are you implying that the latter is exhausted? Or does that mean that Radical Orthodoxy is still a force to be reckoned with?

APS: On the one hand Radical Orthodoxy, as a particular movement or tendency, is probably exhausted in terms of the creativity and energy that attracted a lot of younger scholars who were working mostly in Christian theology but also in Continental philosophy of religion.

In a way, those of us in this field know what Radical Orthodoxy is now -- whereas before its anachronism seemed to be opening genuinely interesting lines of intellectual inquiry, perhaps encouraging interesting changes in the structure of institutional religious life. Now its major figures have aligned themselves with the thought of the current Pope in his attempt at “Re-Christianizing Europe,” with its nefarious narrative of a Christian Europe needing to be defended against Islam and secularism. They are also aligned with the policies of the present-day UK Tory Party via Phillip Blond and his trendy ResPublica think-tank.

So, on the other hand, while its creative power is probably on the wane, it is still something that must be reckoned with -- precisely because of this newfound institutional power, and because we know that its research program ends in old answers to new questions. We have to move beyond mere criticism, though, to offering a better positive understanding of religion, philosophy, and politics, and this volume begins to do that. This means going far beyond addressing Radical Orthodoxy as such, though, and addressing the reactionary and obfuscatory form of thought that lies beneath Radical Orthodoxy and persists in other thinkers who don’t identify with this particular movement.

DJW: Yes, it is something broader that troubles continental philosophy of religion now – not merely Radical Orthodoxy as such, but what we try to articulate in our Introduction as the more general tendency to theologize philosophy of religion. Many philosophers of religion – even when they see themselves as opponents of Radical Orthodoxy – ultimately treat their discipline as an extension of theology. It is quite normal to attend a keynote lecture at a Continental philosophy of religion conference and end up listening to a theology lecture! This is the reason that questions concerning the specificity of philosophy of religion (what sets it structurally apart from theology) dominate After the Postsecular and the Postmodern. Such questions are not meant solely as attacks on Radical Orthodoxy, but aim to interrogate the whole zeitgeist in which Radical Orthodoxy participates.

Q: I'm struck by how your book reflects a revival of interest in certain thinkers -- Schelling, Bergson, Rosenzweig. Or rather, perhaps, their transformation from the focus of more or less historical interest to inspiration for contemporary speculation. How much of this is a matter of following in the footsteps of Deleuze or Žižek?

DJW: Deleuze and Žižek are exemplary figures for many of the contributors to this volume. We philosophize in their shadow – and, you’re right, in particular it is their perverse readings of Bergson, Schelling etc which have taught us how to relate to the history of philosophy in new, heterodox ways.

“Experiment” is one of the key words in After the Postsecular and the Postmodern: all of us who contributed wanted to see what new potential could be opened up within philosophy of religion by mutating its traditions and canons through the lens of contemporary speculation. Having said that, I think both terms of your distinction (“inspiration for contemporary speculation” and “historical interest”) are important at the present moment.

Ignorance of the history of philosophy of religion is the academic norm, and our wager is that through straightforward history of philosophy one can excavate resources that have been neglected, so as to begin to see the discipline afresh. It is a matter of revitalizing our sense of what philosophy of religion can do. Therefore, while mutating the history of philosophy is crucial, so too is understanding what that history is. So little has been written about Bergson’s or Rosenzweig’s contributions in this regard that a relatively strait-laced understanding of them is one of the volume’s most pressing tasks.

APS: In France, at the time Deleuze was studying and writing his first books, the study of philosophy was dominated by the "three H's” (Hegel, Husserl, and Heidegger). He followed a different path in his own work, writing important studies on Hume, Bergson, and Nietzsche (amongst others). With the rise in Deleuze’s popularity these choices of figures have taken on the character of a canon, but at the time they were considered quite heretical and bold.

While the historical canon for mainstream Anglophone philosophy of religion tends to focus on Locke, Hume, and Kant, we hope our volume helps to establish an alternative canon that draws on more speculative thinkers from the modern tradition, like Spinoza, Schelling, and Bergson. We think this will not only help us to address the persistent questions of philosophy of religion but also allow us to reframe those very questions.

Q: The names of a few contributors are familiar to me from reading An und für sich and other blogs. Would you say something about how the sort of "floating seminar space" of online conversation shapes the emergence of a project like this one?

APS: Many people have noted the democratic nature of blogging, which can disrupt the usual hierarchies in the academic world. While that can lead to intensely antagonistic encounters -- especially in the early days when we were all still navigating this new social space -- it can also lead to incredible intellectual friendships. I started blogging when I was 19 in the hopes of being part of an intellectual community that I didn’t have at university. This lack of a community was partly because I was a commuter student traveling four hours round trip per day, which didn’t leave a lot of time to participate face-to-face, and partly because my own interests in religion were not shared by most of the other students in my philosophy department.

The group blogs I have been a part of, first The Weblog and then An und für sich, attracted people in similar situations -- people who existed in a liminal space between philosophy, theory, theology, and religious studies and wanted to discuss these issues, but for whatever reason couldn’t do so in their immediate communities.

I think it is safe to say that without the blogging community the volume wouldn’t have existed. It was because of the blog that Daniel first contacted me about participating in the postgraduate conference in philosophy of religion that he had set up in Oxford and it was this conference that ultimately led to the volume. We have tried to transfer the democratic spirit of blogging to the volume, so while we do have contributions from established academics in the volume, we also have included a number of graduate students, intellectuals outside the academy, and those still searching for a tenured position (if there are any!).

Even though we don’t have a “big name” like Žižek or Vattimo in the volume, we have still been able to attract interest simply on the strength of the ideas in the book, which are talked about on AUFS and other blogs. The volume has even made its way onto a syllabus already! John Caputo, formerly professor of philosophy at Villanova and now professor of religion and humanities at Syracuse, has his students reading the Editors’ Introduction for his graduate course called "The Future of Continental Philosophy of Religion," which we are really excited about.

Q: Sometimes the relationship of academic theological discourse to any creed or confession can be difficult to make out. With the philosophy of religion, obviously, such distance seems to be built right in. What are the stakes of your book – if any – for "people of faith," as the expression goes? That is, do you see this work as having consequences for what goes on at a church, synagogue, mosque, or whatever?

DJW: I tend to deploy a rather crude, form/content model on this issue: the material with which "people of faith," theologians, and philosophers of religion all deal is the same – "religion" in the broadest sense of the word. It is the operations of thought to which this material is subjected that differentiates them. What distinguishes philosophy of religion from theology or everyday religious practice is the specific kind of labor to which “religion” is here subjected. The question then becomes: Does "religion" after such transformations bear any resemblance to or (more importantly) have any relevance to the “religion” with which “people of faith” engage? And the answer is still very much open to dispute.

To take some examples: George Pattison (one of the contributors to the volume) is currently involved in a project on the phenomenology of religious life and it seems plausible that some form of this project could indeed be relevant to everyday religious practice – articulating its often implicit assumptions. On the other hand, I would be horrified if someone found a kernel of everyday relevance in my contribution on Schelling (in which I argue that names such as “Christ” or “Krishna” are literally the products of geological eruptions).

Personally (and here I am speaking very much for myself), I think there’s an element of smugness to the anti-“ivory tower” rhetoric that has emerged in the academy in the last century: the assertion that academics have something interesting or useful to say to the world imparts, in my mind, false value to what we say. In other words, I feel content to revel in the uselessness of my work.

APS: I love this answer! The militancy behind it stands against the pathetic “Theologian-Pope impulse” of so many theologians or the “Philosopher-King impulse” of so many philosophers who think the salvation of the world lies in our thought.

However, I want to nuance it somewhat, as I do think some of what lies behind what we do as academics, the reasons we take up this work, can participate in political struggles or help to deal with the very serious problems we face without our thought being directly “useful” in some crude practice of meeting targets or productivity goals. Spinoza wouldn’t have been much use as the ruler of the Netherlands, I’m sure, but when his ideas were taken up by others, and thereby mutated, they did have a real effect, and much of it positive.

The same goes for most of our great philosophers. But what Dan called the "uselessness" of our work in some sense mirrors the uselessness of religion in general. This character that religion has, identified by philosophers like Bataille, Nietzsche, and contemporaries like Goodchild, is in many ways offensive to the shape of contemporary life, where everything has its proper price, where we have to be thrifty and austere. Religion seems like a magnificent waste of time and money, unless of course it can be put to use convincing people to go to war to kill or be good little boys and girls and not harm their potential market value as workers with too much unclean living.

The same is true of this kind of academic work we do. It is useless within the parameters of contemporary society, but when contemporary society produces things like the poor and the middle class paying for massive bank bailouts and ecological disasters in the Gulf of Mexico and off the coast of Nigeria, then maybe uselessly thinking about things outside those parameters isn’t such a bad way to spend one’s life.

Q: As I've been reading your book, Republican leader Newt Gingrich and others have been arguing that the imposition of Sharia law in the United States is an urgent danger that must be fought. From one perspective, this looks like pure cynicism; the notion that it’s a real issue in American political life is laughable. But what do you make of it? How does it fit in any narrative of the postsecular condition, or any analysis of the strains and fault lines of secularity?

APS: Right, there is about as much danger of Sharia law being imposed as there is of French becoming the national language! This is an example of what we call in our introduction the “obscure postsecular” (again drawing on Badiou). Out of one side of their mouths these politicians tell us that we must defend our modern, secular values from the medieval barbarism of radical Islam, and out of the other side they condemn secularists for not understanding the “power of religion.”

The power of this obscure postsecular -- the reason it gets taken seriously -- is that it latches on to a kernel of truth. Frankly, many in the public sphere don’t understand the power of religion! Hell, when it comes to Islam, many of them don’t even understand the basics, let alone that within Islam there is a cacophony of different spiritual practices and, as in most religions, an internal conflict between a law-bound Islam and an Islam of liberty. This is argued very clearly by a number of French scholars of Islam, like Henry Corbin and Christian Jambet, though it doesn’t appear to be a lesson the ruling class has learned, judging by the recent idiotic, racist and completely unsecular headscarf ban in France.

So, this lack of knowledge is behind both Gingrich’s call to resist Sharia law and the ruling, which Gingrich referenced, from the New Jersey judge that a Muslim man could forcibly rape his wife because it was a religious custom; I know of a number of Islamic feminists who I’m sure would like to speak with Judge Edith Payne! With both Gingrich and Payne we have an obscuring of the postsecular: they both recognize that something has changed, but they call on some transcendent identity of Islam or America that obscures any real confrontation with that change. Notice that neither one of them recognizes that there are elements within Islam -- mainstream Islam! -- that reject honor killings, abuse of women, the murder of civilians, and the like.

The situation becomes even more obscure in the UK, where I currently live. In the U.S. all our money declares “In God We Trust”; in the UK all money bears the image of the sovereign, Queen Elizabeth II. Surely this, a divine right monarchy, is an example of the relic of medievalism that Gingrich mentions! Yet, on the other side of the bill, depending on the denomination, you will find Charles Darwin or Adam Smith -- the very figures who ushered in the forms of thought that our old narratives tell us swept away medieval superstition.

Now, to my mind this means that all our conventional narratives of secularization are inherently flawed. The classic liberal narrative of a neutral secular has been undone by the postsecular event. The liberal secular was a weapon used in the expansion of European imperialism, which tried to deny those in the colonial world resources from their varied religious traditions.

At the same time the anti-liberal narrative that secularity is to be rejected because of this complicity is also false. It has a similar political function, by creating and exacerbating divisions within a particular class but along imaginary or unimportant differences, playing into a myopic Clash of Civilizations theory that actually engenders the reality of that clash. The volume offers resources towards constructing a very different theory of the secular, of a postsecular secular, what we call a “generic secular” that goes some way towards superseding these flawed, conventional narratives.

Practically that means both a straightforward understanding of particular religions as they present themselves in their complexity, suppressing as much as possible the imperialist tendencies of the liberal secular, and deploying the same kind of bold internal, immanent critique of these particular religions that we find in the modern thinkers covered in the volume. The answer to these political problems may partially be found by experimenting with ideas from Islam and Christianity from the position of the generic secular.

Author: Scott McLemee
Author's email: scott.mclemee@insidehighered.com

Arguing with Tony Judt

While scanning new magazines or newspapers, there are certain bylines you tend to notice. The list of them varies from person to person. But the habits of attention involved tend to be the same.

Usually the author is someone whose work you find informative, or stimulating, or otherwise agreeable (or some combination of these things). You tend to read the article immediately -- or postpone gratification until you’ve perused everything you must. Of course, things are not always so pleasant. The author may be your bête noire -- the very sight of his or her name provoking keen irritation. Which, to be sure, can involve its own pleasures.

Much of this speed-scan/instant notification is -- in my experience anyway -- involuntary, like a Pavlovian reflex. It would be possible to draw up a comprehensive checklist of authors whose bylines trigger my attention. But that would be after the fact. The "list" is unwritten and usually in flux. The whole process seems idiosyncratic and ad hoc. The brain knows what it wants, but isn't necessarily that rational or deliberate about it.

The historian Tony Judt, who died over the weekend, got entered into my registry not quite 20 years ago, when he started writing for The New York Review of Books and other publications. Some of his work was stimulating and some of it was annoying. His books on the European Left proved to be both. Judt was dismissive of questions and figures I thought were important, or else ignored them entirely. Reading Judt on Marxism involved a certain amount of intracranial yelling. As C.L.R. James once said about T.S. Eliot, he was someone I read in order to remind myself of what I do not think.

With Judt’s more recent writings on political topics (on the Middle East, the "strange death of liberal America,” and the prospects for a revitalized social democracy, for example), I noticed that other people were doing the complaining, in letters-to-the-editor columns and otherwise. This was gratifying, for Judt's conclusions were often similar to ones I'd come to by different means; and it was also instructive to see how he argued back against our shared opponents, especially since he did it more calmly than might be my wont.

Now it is the columnist’s privilege to express these things in utterly subjective terms. But just to be clear, no personal contact was involved. I never met Judt, nor made any effort to do so. The voice I was arguing with (or concurring with, as the case might be) was always the voice on the page.

This bears spelling out -- not because it was unusual, but because it is a completely normal relationship between author and reader, in many respects, at least for those of us whose sense of that relationship predates the Internet. It can be very intense, but it also possesses an element of distance, even of strict impersonality. Judt was someone with certain ideas who had published certain books and articles. His individuality stopped and started there.

All of which changed late last year, when I saw the video.

If you’ve seen it, you probably know what I mean. If not, here it is.

So little sense of Judt as an individual did I have that even his accent came as a surprise -- to say nothing of the impact of seeing him in a wheelchair, with tubes running into his nose (“a quadriplegic with facial Tupperware,” as he put it), all of it necessary given the muscular degeneration caused by ALS, also called Lou Gehrig’s Disease.

Judt’s lecture -- sponsored last fall by the Remarque Institute, which he founded at New York University in 1995 -- has since been expanded into a book, Ill Fares the Land, published this spring by Penguin. In different circumstances my attention would go strictly to his political concerns. But that has become difficult, maybe impossible. His condition required Judt to compose the book in his head, then dictate it. Knowing that a text was written in prison always creates a sort of double consciousness in the reader. All the more so when the prison is the author’s body.

The grace and humor projected in that video naturally filtered into the tone of the voice that began to come from the page, especially as Judt began to write the series of essays, several of them amounting to a memoir, over the final months of his life. The first of them was a description -- calm and candid, but at times panic-inducing to read -- of what the days and nights were like under his changed circumstances. By that point, the relationship between author and public was no longer the same. Each essay might very well be his last. And seeing his name on the cover of The New York Review of Books now registered, not as part of my habitual, conditioned readerly expectations, but as a challenge that might as well be called “existential.”

Judt belonged to a generation that used that word a lot, maybe too much, but it feels like the one needed here. Faced not just with mortality but with conditions in which putting words on paper was getting ever more difficult, the decision to remain committed to the public role of writer and thinker was not obvious. Nobody would have blamed him for quitting. A choice was involved. He kept going.

My habit of scanning the names of contributors turned more deliberate, in Judt’s case. His name on the cover of The New York Review of Books was encouraging, because it meant he was staying the course. But also troubling, since it posed hard questions: In his circumstances, would you be able to keep on working? For how long? In the name of what values? Are you sure? You wouldn't give up, collapse like a black hole into despair? Again, you're sure about that?

Not that he was asking these questions, or even hinting at them. They amount to one reader’s confession of the thoughts raised by Judt’s example. At the same time, they are implicit in our common condition. How much of what we do is out of conviction, and how much out of the momentum of routine?

At this point, the habit of watching for Judt’s byline will soon give way. I'll still reread him every so often, of course -- in particular, his Reappraisals: Reflections on the Forgotten Twentieth Century, published a couple of years ago by Penguin. It may be the best place to start for anyone who has yet to make his acquaintance. His essays are also excellent models of writing by a scholar addressing a public both academic and non-.

Not to say that his arguments are always persuasive. Some of them are actually pretty irritating, in my opinion, but that's the way it should be.

Author: Scott McLemee
Author's email: scott.mclemee@insidehighered.com

Liberal Arts II: The Economy Requires Them

Many of us committed to the liberal arts have been defensive for as long as we can remember.

We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”

We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: they create well-rounded people, are good for democracy, and develop the life of the mind. All these are certainly true, but somehow each misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial outcome from the large investment that is now American higher education. That doesn’t make them anti-intellectual; it makes them realists. Outside of home ownership, a college degree might be the largest single purchase for many Americans.

There is a disconnect: parents and students worry about economic outcomes while too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment for us to feel defensive. It is time, however, that we in the liberal arts understand this new environment, and rather than merely react to it, we need to proactively engage it. To many Americans the liberal arts are a luxury they feel they need to give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.

The liberal arts always situate graduates on the road to success. More Fortune 500 CEOs have had liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know the road to research science most often comes through a liberal arts experience. Now more than ever, as employment patterns seem to be changing, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.

We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.

Sitting where we do, it is easy to dismiss this number because we assume it comes from day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing sector of this shift is the white-collar world our students aspire to. This number has been rising steadily and is projected to keep climbing. We are living in a world where 9-to-5 jobs are declining, careers with one company over a lifetime are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is much more unstable and fluid than the one of a mere generation ago.

We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job, and even career to career during their lifetimes. What we are seeing now, however, is different: more and more Americans are hustling from gig to gig, too. These workers, many of them our former students, may never know economic security, but they may know success. For many new-economy workers, success is measured by more than just money; freedom, flexibility and creativity count too.

If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.

In 1956, William Whyte, the noted sociologist, wrote The Organization Man to name the developing shift in work for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep set of narrow vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.

Today, no such career ladder exists. And narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working developing before our eyes. Today, breadth, cultural knowledge and sensitivity, flexibility, the ability to continually learn, grow and reinvent, technical skills, as well as drive and passion, define the road to success. And liberal arts institutions should take note, because this is exactly what we do best.

For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.

It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But, we can make a better case about how a liberal arts education leads to both a meaningful life and a successful career.

In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes -- our alumni -- and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.

This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that is situating the discussion more squarely within the liberal arts orbit, and in the process blurring disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science and ethnography, as a way to understand the new economy shift. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research because increasingly we will need to teach it as the best way to ensure our students understand the moment.

We also need to be less territorial, and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, whether by providing space, wifi, or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.

Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.

Author: Richard A. Greenwald
Author's email: newsroom@insidehighered.com

Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.

Liberal Arts I: They Keep Chugging Along

When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)

One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, religion, and enter only those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.

So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?

The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- increased from 115,400 conferred in 1970-71 to 335,250 conferred in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while numbers are small, there has been exponential growth in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving” degrees. Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and dropped again in the last five years.

What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicator Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.

Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since then. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but then recovered nicely in the years since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicator Project leaves off and show that in 2008 and 2009 the numbers of students who graduated with bachelor's degrees in English, foreign languages and literatures, history, and philosophy and religion remained at the same level.

What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.

Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.

Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced this past spring 17 graduating seniors, roughly twice what the number had been three decades ago. And the strength is not just in elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.

If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- others more modest -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent.)

Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The mood was eloquent, upbeat, innovative; no doom and gloom, even though many participants were from institutions where painful budget cuts had recently been made.

A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education, rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.

It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most are doing quite well, increasing enrollments and becoming more selective.

And what about state universities and land grant institutions, where most students go to college? Were they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”

Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.

One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment draws in part from the vital role they play in providing students with a robust liberal education: a kind of education that develops their knowledge in a range of disciplinary fields and, importantly, their cognitive skills and personal competencies. The liberal arts continue -- and likely always will -- to give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.

But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, a point with which, the statistics and anecdotes show, students agree.

W. Robert Connor and Cheryl Ching

W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.

If Henry Could See Us Now

"Who knows but if men constructed their dwellings with their own hands, and provided food for themselves and families simply and honestly enough, the poetic faculty would be universally developed, as birds universally sing when they are so engaged?" So writes Henry David Thoreau in the first chapter of Walden, in the middle of a lengthy disquisition about the meaning of shelter in mid-19th century America. Using white pine from the shores of Walden Pond and lumber salvaged from an old shack, Thoreau stimulated his own poetic faculties by constructing his 10- by 15-foot dwelling at the outset of his famous sojourn.

With Thoreau’s exhortation and example firmly in mind and the blessing of the college administration, the department of environmental studies and sciences undertook the reconstruction of Thoreau’s cabin as our contribution to Ithaca College’s First Year Reading Initiative for 2010. The president had selected Walden as the text that would be sent to all incoming first-year students. Few books could serve as so stimulating a provocation in our hyper-mediated age, when it is harder than ever "to front the essential facts of life," when more people than ever seem to be living lives of quiet desperation. Reconstructing Thoreau’s cabin, therefore, not only resonated well with my department’s values, but would offer students an opportunity to, in Thoreau’s own vision of higher education, "not play life, or study it merely, while the community supports them at this expensive game, but earnestly live it from beginning to end." (Emphasis original.)

Over the course of the summer everyone we contacted about helping with the project was enthusiastic. The local timber framers who had the tools and expertise to lead the build, the salvager who would provide us with the wood, and the local re-use center where we would get the windows and which would help us with the de-nailing — all leaped at the chance to participate, in many cases offering their services free or at a steep discount. Students, faculty, alumni, and community members who learned about the project all expressed a desire, even a craving, to become involved, to be able to build with their own hands. Their answer to Thoreau’s question, "Shall we ever resign the pleasure of construction to the carpenter?" was loud and clear.

And so sketches were made. A crew of students and faculty spent a day and a half pulling hemlock boards and timbers from a collapsed 120-year-old barn. The campus site for the build was selected. We sent the hand-drawn sketches to an architect friend to be rendered as computer-designed drawings.

And that was the moment when the magic of creative possibility conjured by Thoreau dissipated in the reality of 21st-century America. We can't say we weren’t warned by Henry himself, who had observed even in the 1840s that human institutions often serve those who created them in unwelcome ways. Our well-meaning friend innocently inquired, "Are you sure you won’t need a building permit for this project?"

An educational project temporarily occupying a space for a year, a 150-square-foot cabin? Surely not.

But, alas, once even our innocent inquiries were made, the Town of Ithaca bureaucrats scampered into their iron cages and set about their regulatory duties — duties, it should be said, the people have charged them with. Unable to see how irrelevant modern building codes were for this project, the director of code enforcement immediately declared that our plans, as drawn, were a menace to public health and safety. The whole affair turned from frustration to farce when he insisted that the cabin would need ... a sprinkler system.

At least as frustrating was the inability of the college’s own bureaucracy either to defend the principle that this project was not even subject to review (there were precedents for such an argument) or to advocate for an expedited process. Not without reason, the college administration was fearful of alienating the local government over a project that was a low priority compared to the massive building projects under way and anticipated. No matter how powerful the experience of reconstructing the cabin might be for a few hundred students, no matter that such a project conforms more closely to the vision of higher education I believe in (and that Thoreau seems to have shared) than the new 130,000-square-foot athletics and events center, no one was willing to challenge the town’s misapplication of the rules, at least not in time to make a difference.

And so the salvaged wood sits in a storage facility awaiting its transformation, awaiting its opportunity to transform. If the town issues a permit, the winter does not stretch too far into April (as it sometimes does in these parts), and it is possible to remobilize the reconstruction team in the spring, we may yet find a replica of Thoreau’s cabin standing on our campus. If it does get built it will be as much an emblem of how accurate Thoreau’s characterizations of our society were (and are) as a triumph of experiential learning.

Even if the project becomes another one of those good ideas that run afoul of the sclerotic bureaucracies that too often hamper creativity, however, those of us most closely involved have learned, as Thoreau had, that we often forge our own chains. Our experience has confirmed the essential truth that we are almost all conformists, bound by rules and conventions we seldom question and even more rarely challenge. We need rules; who would want to live in a modern society without the rule of law? But we need to consciously consider and reconsider both the rightness of a given rule and the proper application of it. Throughout Walden Thoreau — sometimes gently, sometimes stridently — admonishes us to defy convention and seek our own path, his way of considering and reconsidering the boundaries we set for ourselves. "How deep the ruts of tradition and conformity!" he lamented. He found the expression of original thought and belief "a phenomenon so rare that I would any day walk ten miles to observe it."

In his equally famous essay “Resistance to Civil Government” (now commonly called “Civil Disobedience”), Thoreau writes, “The mass of men serve the state... not as men mainly, but as machines... . In most cases there is no free exercise whatever of the judgment or of the moral sense; but they have put themselves on a level with wood and earth and stones; and wooden men can perhaps be manufactured that will serve the purpose as well.” Thoreau condemned servility in the face of state immorality on a grand scale — in his time, this immorality was slavery and the war of aggression against Mexico that was a product of the debate over slavery.

Yet he clearly also believed that our submissiveness in the face of injustice — or, in our case in the face of intractable bureaucracy — begins with the habits of mind we cultivate in our day-to-day activities. In a provocative passage from Walden on clothes Thoreau writes, “I am sure that there is greater anxiety, commonly, to have fashionable, or at least clean and unpatched clothes, than to have a sound conscience.” Conscience may, at times, require clean clothes (I doubt if Thurgood Marshall would have gotten very far in his legal career without them), and Thoreau himself counseled that a person should “maintain himself in whatever attitude he find himself through obedience to the laws of his being, which will never be one of opposition to a just government, if he should chance to meet such.” But when social acceptance becomes the guiding principle of one’s life, when we blindly follow the spoken and unspoken rules of our culture, the world becomes a blander, less just place.

I am in no way trying to raise our impeded attempt to reconstruct Thoreau’s cabin on a privileged college campus to the level of injustice embodied by apartheid or Jim Crow, to name but two of the oppressive systems defied by people operating under Thoreau’s influence (though I like to imagine Thoreau being summoned to the town office for code violations). But I do wonder what it says about our society when we adhere so assiduously to rules and permits for things like a humble cabin while multinational corporations operate with virtual impunity. Whether it is the oil catastrophe in the Gulf of Mexico, or the poisoning of millions of gallons of water through natural gas extraction by hydraulic fracturing in my part of the country, or factory farms in Iowa where hens lay eggs over two-year-old fecal piles, the absence of meaningful rules and regulations has profoundly compromised human and ecosystem health on a staggering scale. And what of the carte blanche given to investment banks, where the absence of oversight brought the entire global financial system to its knees? But the cabin must have its sprinkler system or public safety will be jeopardized!

We too often regulate the small things inflexibly while ignoring the behaviors and habits of thought that pose genuine threats to our very survival, and to that of the organisms with which we share this planet. Or, worse, we allow corporations to buy themselves exemptions from oversight, either through the now-legalized bribery of massive campaign contributions or in less visible ways (for just one example, see the behavior of the Minerals Management Service under the Bush administration, behavior that directly contributed to the Deepwater Horizon disaster). The result is what can seem like the worst of all possible worlds: common folk feel oppressed by regulations that seem omnipresent and inflexible, while the wealthy and powerful can often get away with murder.

Despite his reputation as a curmudgeon, Thoreau finishes Walden on an optimistic note, most famously telling us "that if one advances confidently in the direction of his dreams ... he will meet with a success unexpected in common hours." We tell our students some variant of this sentiment from the moment they arrive on campus until the last echo of the commencement speech. Our confidence may have faltered over the past few weeks as we advanced toward our modest little dream of reconstructing Henry’s cabin on campus. There remain innumerable bureaucratic hurdles to surmount before we can build the version of the cabin we envision — sans sprinkler system. Perhaps students will yet wield chisels, froes, handsaws, augers, and hammers, and in so doing develop their poetic faculties as they contemplate the meaning of the rough-hewn, handmade cabin they have built on a modern college campus.

Michael Smith

Michael Smith teaches history and environmental studies at Ithaca College.

Shrub Studies

Next week, Crown Publishers will issue President George W. Bush’s memoir Decision Points, covering what the former president calls “eight of the most consequential years in American history,” which seems like a fair description. They were plenty consequential. To judge from the promotional video, Bush will plumb the depths of his insight that it is the role of a president to be “the decider.” Again, it’s hard to argue with his point -- though you have to wonder if he shouldn’t let his accumulated wisdom ripen and mellow for a while before serving it.

Princeton University Press has already beaten him into print with The Presidency of George W. Bush: A First Historical Assessment, edited by Julian E. Zelizer, a professor of history and public affairs at Princeton. The other 10 contributors are professors of history, international relations, law, and political science, and they cover the expected bases -- the “War on Terror,” the invasion of Iraq, social and economic policy, religion and race. It is a scholarly book, which means that it is bound to make everybody mad. People on the left get angry at remembering the Bush years, while those on the right grow indignant that anyone still wants to talk about them. So the notion that they were consequential is perhaps not totally uncontroversial after all.

The contributors make three points about the Bush administration’s place in the history of American conservatism that it may be timely to sum up, just now.

In the introduction, Zelizer writes that Bush’s administration “marked the culmination of the second stage in the history of modern conservatism.” The earlier period, running from the 1940s through the ‘70s, had been a time of building an effective movement out of ideological factions (fundamentalists, libertarians, and neoconservatives, among others) “none of which sat very comfortably alongside any other.” Following Reagan's victory in the 1980 election, “conservatives switched from being an oppositional force in national politics to struggling with the challenges of governance that came from holding power.”

This summer, Zelizer published a valuable review-essay on the recent historiography of the American right. It can be recommended to anyone who wants more depth than the following (admittedly schematic) remarks will manage.

(1) In the chapter called “How Conservatives Learned to Stop Worrying and Love Presidential Power,” Zelizer points to a tendency among earlier generations of American conservatives to be suspicious of the executive branch. He traces this back to polemics against FDR during the 1930s, when conservatives painted the New Deal as akin to Hitlerian dictatorship or Stalinist five-year planning. And he quotes the early neoconservative intellectual James Burnham saying, in 1959, that “the primacy of the legislature in the intent of the Constitution is plain on the face of that document.” A strong executive meant growing central power, while members of Congress had an incentive to protect local authority.

This sensibility changed in the course of the cold war, writes Zelizer, and particularly under the leadership of Nixon and Reagan. Distrust of executive power gave way to increasing conservative reliance on it. Concentrating executive authority in the hands of the president (rather than spreading it out among various agencies) would promote efficiency and coordinate decision-making -- so the argument went. But just as importantly, it would mean that a conservative president could curb the regulatory powers of the state.

The claims for executive authority intensified under the War on Terror -- yielding what Zelizer calls the Bush administration’s “defiant, if not downright hostile [attitude] about any kind of congressional restrictions whatsoever." But this was not just something that “the decider” decided. It reflected a decades-long reorientation in conservative ideology. "The Right cannot legitimately divorce itself from strong presidential power,” writes Zelizer. “[A]n expanding historical literature … is attempting to revise our knowledge about conservatism by demonstrating how conservatives have had a more complex and less adversarial relationship with the modern state than we previously assumed.”

(2) There was a time when manufacturing "stood atop the commanding heights of the U.S. political economy,” writes Nelson Lichtenstein -- a professor of history at the University of California at Santa Barbara -- in his chapter “Ideology and Interest on the Social Policy Home Front.” He identifies this epoch as running from 1860 until 1980. The Bush presidency belongs to the era of "retail supremacy," in which the employment trend is low-wage and high-turnover. As of 2008, there were five times as many jobs in the service sector as in “the ‘goods-producing’ industries that once constituted the core of the U.S. economy” such as agriculture, construction, and manufacture.

Free-market principles were a basic part of conservative ideology in both eras. But the beneficiaries have changed. Once upon a time, advocates of laissez-faire would sometimes find themselves accused of being mouthpieces of the National Association of Manufacturers, averse to trade regulation and price controls. But by the Bush years, that was a thing of the past. “As employers of many low-wage workers,” writes Lichtenstein, “most retailers favored the lightest possible regulatory hand, especially when it came to welfare-state mandates such as those covering employee health insurance, retirement pay, and health and safety issues.”

The gaps created by stagnating wages and shrinking benefits were plugged – for a while anyway – by "cheap imports, easy credit, an overpriced dollar, and an array of new financial products that widened the range of assets (mainly houses) that both homeowners and bankers could borrow against."

An older style of right-wing thought lauded the free market as a merciless field of combat -- a way to test the entrepreneur’s self-control and the manufacturer's commitment to increasing productivity. But the form of conservatism taking its place has freed itself from notions of delayed gratification or expanding domestic output. Wal-Mart capitalism in the Bush years claimed only to deliver the goods cheaply, no matter where they might come from.

(3) In the 1970s, conservatives liked to say that their ranks were filling up with “liberals who had been mugged by reality.” That phrase suggested that reality is one tough dude -- totally indifferent to anybody’s mere opinion.

It was quite another matter when a leading Bush administration official (unnamed but often assumed to be Karl Rove) told a reporter for The New York Times Magazine in 2004: “We’re an empire now, and when we act, we create our own reality.” Nor was this merely the judgment of a solitary solipsist. In his chapter "Creating Their Own Reality,” David Greenberg -- an associate professor of history, journalism, and media studies at Rutgers University -- maintains “the Right under Bush found itself promoting a view of knowledge in which any political claim, no matter how objectively verifiable or falsifiable, was treated as simply one of two competing descriptions of reality, with power and ideology, not science or disciplined inquiry, as the arbiters.” (Or deciders, if you will.)

There was no reality, only interpretations of reality -- and the existence of weapons of mass destruction was a function of who controlled the narrative. Little surprise that there were jokes about the rise of conservative postmodernism during the ‘00s. If Fox denied that climate change was taking place, who had the right to insist otherwise? Not some elitist, anyway.

Greenberg traces the right's “forays into epistemological relativism” back to the influence of networks of right-leaning think tanks and journalists. He quotes a contributor to The Weekly Standard from 2003, on how the right had created “a cottage industry” for spin: “Criticize other people for not being objective. Be as subjective as you want. It’s a great little racket.” And going a step beyond what Greenberg describes, we see another development along the same reality-averse lines: the growing importance of conservative political figures whose authority within the movement comes primarily, or even exclusively, from their status as mass-media celebrities.

The former president did not create any of these tendencies. He simply took them over as legacies from what has been, for 30 years now, the strongest and most disciplined force in American politics.

Several passages in The Presidency of George W. Bush were obviously written in the flush of assumptions that the election of 2008 was a major turning point in the country's history -- the point at which the conservative movement had not just lost any chance at constructing a "permanent Republican majority" but condemned itself to wander in the electoral wilderness for a long season. Well, nobody should expect historians to be prophets, or political scientists to be bookies.

Scott McLemee (scott.mclemee@insidehighered.com)

Putting the 'Humanities' in 'Digital Humanities'

Reflecting on the recent Humanities and Technology conference (THAT Camp) in San Francisco, what strikes me most is that digital humanities events consistently tip more toward the logic-structured digital side of things. That is, they are less balanced out by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now. What is the missing contribution from the humanities?

I think this digital dominance revolves around two problems.

The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.

The showy and often very visual products built on such data and the ease with which information contained within them is intuitively understood appear, at first blush, to be a triumph of quantitative thinking. The pretty, animated graphs or fluid screen forms belie the fact that boring spreadsheets and databases contain the details. Humanities scholars, too, often recoil from the presumably shallow grasp of a subject that data visualization invites.

For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.

Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.

Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.

Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.

In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands must be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which itself is a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a world-wide conversation on just what is the digital humanities. As the folks at WordPress say, code is poetry.

I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC. Back then, I was willing to patiently decipher code only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. For avid readers of literature, the characters of a story come alive, laying bare our own feelings or moral inclinations in the process.

Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in a text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities can be the ability to invest a generation of "users" in the techniques, and the practiced, meticulous attention to detail, required to become a scholar.

Phillip Barron

Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.

In Praise of the Americans

Over the weekend -- while busily procrastinating here in the main library of the University of Texas at Austin -- I stumbled across something suitable for the Intellectual Affairs column running just before Thanksgiving. Inside Higher Ed has a growing international readership. Still, I hope it will not be too provincial to call attention to a long-forgotten essay called “In Praise of the Americans” by Stephen Leacock, a Canadian political scientist and economist who was also one of the best-known humorists of his day. The essay appeared during the Great Depression, as the final word in an anthology intended to explain the United States to European readers.

The volume in question, America as Americans See It, was published by the Literary Guild in 1932. It contains more than 40 essays by various eminent and near-eminent figures of that era, plus dozens of photographs and cartoons. The editor, Fred J. Ringel, says in the introduction that he intended to prepare a study of the national culture after he arrived in the United States. (From where, he doesn’t indicate, and this seems to be his one major publication.) But he gave up and decided to edit an anthology instead. Among the better-remembered contributors are W.E.B. Du Bois and Upton Sinclair. There is also an essay by one Clare Boothe Brokaw, an editor at Vanity Fair, on the rituals and pretenses of high society. This author would become somewhat better known when she changed surnames after marrying Henry Luce, and her observations would be recycled into more memorable form in a play called The Women.

Although Ringel doesn’t mention it, many readers of the time would have recalled a similar volume called Civilization in the United States, published in 1922. I'd guess that America as Americans See It appearing on its tenth anniversary was not a total coincidence. The editor of the previous collection was Harold Stearns, who had also published a volume of his own writings called America and the Young Intellectuals (1921) that looks, with hindsight, like a sort of opening salvo in the culture wars. In Civilization in the United States, he joined forces with H.L. Mencken, Lewis Mumford, Van Wyck Brooks, and others to produce a landmark work of social criticism.

On the day it was published, Stearns boarded a ship to Paris – where, in celebration of having escaped Prohibition, he promptly got drunk, staying that way pretty much continuously for the next decade. Legend has it that when Stearns ended up sleeping in the gutter, one expatriate pointed him out to another and said, “There’s Civilization in the United States.” (In The Sun Also Rises, Hemingway bases a character on Stearns at his booziest.)

All of which forms part of the backdrop for America as Americans See It, and in particular for Stephen Leacock’s essay, which the editor retitled “A Neighbor Looks at America.” A short biographical headnote in the anthology notes that Leacock’s syndicated articles were reaching “as many as nine million readers in the United States and Canada alone,” while some of his work had worldwide circulation. His first volume, Elements of Political Science (1906), was for a long time the standard undergraduate textbook and was translated into 19 languages, including Chinese and Urdu. But he was best known for his humorous writings, which also fed the great demand for him as a lecturer. These extracurricular activities earned him about five times as much as his salary as head of the department of political economy at McGill University in Montreal.

When his first collection of anecdotes and satires appeared in 1910, it became an international best-seller and earned him comparisons to Mark Twain, who had just died. “Theirs were common gifts for broad burlesque, grotesque exaggeration, juxtaposition of irrelevant ideas, and casual shifting from one comic pose to another,” points out Leacock's biographer David M. Legate. I’d say there is also some resemblance to Robert Benchley and Ring Lardner.

Leacock wrote an enormous amount -- there were one or two collections of his essays every year until his death in 1944. Much of it doesn't hold up very well, after all this time. When you need a footnote to get a joke, it’s not really a joke any more; it is a fossil. But his observations on the civilization just south of Canada are another matter. Apart from a couple of topical references, they still apply after almost eighty years.

“Americans are a queer people,” he writes. “They can’t rest…. They rush up and down across their continent as tourists; they move about in great herds to conventions, they invade the wilderness, they flood the mountains, they keep the hotels full. But they can’t rest. The scenery rushes past them. They learn it, but they don’t see it. Battles and monuments are announced to them in a rubber neck bus… So they go on rushing until the Undertaker gathers them to the last convention.”

The same state of distracted haste prevails in the educational system and in publishing. Americans “have more schools,” Leacock writes, “and better schools, and spend more money on schools and colleges than all of Europe. They print more books in one year than the French print in ten. But they can’t read. They cover their country with 100,000 tons of Sunday newspapers every week. But they don’t read them. They’re too busy. They use them for fires and to make more paper with.” Today, of course, we publish everything digitally, then ignore it.

If they ever bothered to read anything, Americans would probably be unhappy. But we don’t. So (as the quintessential American phrase now goes) it’s all good: “All the world criticizes them and they don’t give a damn….Moralists cry over them, criminologists dissect them, writers shoot epigrams at them, prophets foretell the end of them, and they never move. Seventeen brilliant books analyze them every month; they don’t read them .… But that’s all right. The Americans don’t give a damn; don’t need to; never did need to. That is their salvation.”

That is the last word of his essay -- but also of America as Americans See It, which, like other such volumes, Leacock treats as a symptom of American overproduction, destined to meet American indifference. His notion of total indifference as a basis for salvation is, no doubt about it, ironic. But you can do worse than to run a political campaign on that basis.

Either way, the man clearly had our number. The more things change, the more they stay the same. So happy Thanksgiving.

Scott McLemee (scott.mclemee@insidehighered.com)

That's Offensive!

Late last month, following a protest by House G.O.P. leader John Boehner and the Catholic League president William Donohue over its imagery of ants swarming over a crucifix, the National Portrait Gallery removed a video called “A Fire in My Belly” by the late David Wojnarowicz from an exhibition. (See this report in IHE.) Over the past week, the Museum of Contemporary Art in Los Angeles painted over a mural it had commissioned from an artist named Blu; the mural showed rows of coffins draped in dollar bills. MOCA explained that the work was “inappropriate” given its proximity to a VA hospital and a memorial to Japanese-American soldiers, but has invited the muralist to come back and try again.

All of this in the wake of last spring's furor over the cartoon series South Park’s satirical depiction of Muhammad (or rather, its flirtation with that depiction). I didn’t pay all that much attention to the controversy as it was occurring, since I was still getting angry e-mail messages from Hindus who objected to a scholarly book for its impiety towards their gods. It felt like I had absorbed enough indignation to last a good long time. But there’s always plenty more where that came from. People feel aggrieved even during the holiday season. Actually, just calling it “the holiday season” is bound to upset somebody.

So it may not make sense to use the word “timely” to describe Stefan Collini’s new book That’s Offensive! Criticism, Identity, Respect (published by Seagull and distributed by the University of Chicago Press). The topic seems perennial.

A professor of intellectual history and English literature at the University of Cambridge, Collini is also the author of Absent Minds: Intellectuals in Britain (2006) and Common Reading: Critics, Historians, Publics (2008), both from Oxford University Press. His latest volume is part of a new series, “Manifestos for the 21st Century,” published in association with the internationally renowned journal Index on Censorship. As with his other recent work, it takes as its starting point the question of how criticism functions within a society.

The word “criticism” has a double meaning. There is the ordinary-usage sense of it to mean “fault-finding,” which implies an offended response almost by definition. Less obviously tending to provoke anger and defensiveness is criticism as, in Collini’s words, “the general public activity of bringing some matter under reasoned or dispassionate scrutiny.” Someone may find it absurd or perverse that there are critics who think Milton made Satan the real hero of Paradise Lost, but I doubt this interpretation has made anyone really unhappy, at least within recent memory.

Alas, this distinction is not really so hard and fast, since even the most dispassionate criticism often involves “a broader analysis of the value or legitimacy of particular claims or practices.” And it is sometimes easier to distinguish this from fault-finding in theory than in practice. “Such analysis,” writes Collini, “will frequently be conducted in terms other than those which the proponents of a claim or the devotees of a practice are happy to accept as self-descriptions, and this divergence of descriptive languages then becomes a source of offense in itself.”

Not to accept a self-description implies that it is somehow inadequate, even self-delusional. This rarely goes over well. An artist showing coffins draped with dollar bills, rather than flags, is making a polemical point -- in ways that a scholar analyzing the psychosexual dimension of religious narratives probably isn’t. But offense will be taken either way.

Such conflicts are intense enough when the exchange is taking place within a given society. When questions about “the value or legitimacy of particular claims or practices” are posed across cultural divides, the possibilities for outrage multiply -- and the problem arises of whether critique amounts to an act of aggression.

Let me simply recommend Collini’s book, rather than try to synopsize his argument on that score. But it seems like a good antidote to both clash-of-civilizationists and identity-politicians.

“Criticism may be less valued or less freely practiced in some societies than in others,” he writes, “but it is not intrinsically or exclusively associated with one kind of society, in the way that, say, hamburgers or cricket are. And anyway, different ‘cultures’ are not tightly sealed, radically discontinuous entities: they are porous, overlapping, changing ways of life lived by people with capacities and inclinations that are remarkably similar to those we are familiar with. While there are various ways to show ‘respect’ for people some of whose beliefs differ from our own, exempting those beliefs from criticism is not one of them.”

As a corollary, this implies cultivating a willingness to listen to critiques of our own deeply embedded self-descriptions. No easy thing -- for "so natural to mankind," in the words of John Stuart Mill, "is intolerance to what it really cares about." Amen to that.

Scott McLemee (scott.mclemee@insidehighered.com)
