Last year, the bicentennial of Thomas Paine’s death came and went without much ceremony. It’s always some anniversary or other; perhaps we were just all commemorated out. Besides, even if there had been some big effort to mark the memory of the American Revolution’s greatest pamphleteer and most radical ideologue, media attention would have focused on a fairly sensational topic: the strange afterlife of his physical remains.
Ten years after Paine was buried in New Rochelle, N.Y., his body was dug up by a political enemy and transported to England. (This seems like carrying argumentativeness to an extreme.) The subsequent fate of Paine's body is a complicated matter. Its owner, if that is the word one wants, died in the 1830s. Bits and pieces of Paine circulated on what seems like a fairly ghoulish black market; one story had it that some of the bones were made into buttons. Eventually a number of the remains were gathered up by the Paine National Historical Association, which returned them to New Rochelle for reburial in 1905.
There was more to it than bringing a creepy situation to a close. Even before his death, Paine had been largely forgotten in the United States. His religious skepticism and early support for the principles of the French Revolution made him the most inconvenient of Founding Fathers. But his memory remained alive in the underground of freethinkers and radicals, and giving him a proper burial was a mission for some of them. It was a step towards recovering the truth about the origins of the United States, which were never so pious as some people would have you believe.
Paine’s busy afterlife first came to my attention a few years ago – just about the time, as coincidence had it, that I was reading about what happened to Voltaire’s non-literary remains. They were dug up in 1791, at the high tide of the French Revolution, so that they could be buried alongside the body of Jean-Jacques Rousseau.
In spite of his subtitle, Kammen, a professor emeritus of American history and culture at Cornell University, does not focus exclusively on the United States. But his central concern is how tightly the phenomenon of reburial is linked to the shaping of historical memory in the wake of the American Revolution.
The tradition of marking the graves of common people emerged fairly recently in history; it only became widespread in the 19th century. The act of re-commemorating an individual’s life by moving his remains is quintessentially modern -- a kind of secular resurrection of the deceased person's social importance. “Relocation and reburial,” writes Kammen, “or ‘translation’ of a body, to use the traditional, Latin-derived word, are invariably all about the resurgence of the reputation of and hence respect for someone whose lamp and visage had dimmed in some way.”
The concerns of the living mattered at least as much as the intentions of the dead. “Survivors (or members of the next generation),” Kammen writes, “frequently insisted – sometimes despite evidence to the contrary – that the deceased had really wanted to be buried in a spot other than the one initially chosen, often for reasons of convenience at the time. Arguments over where the most appropriate site might be frequently persisted for decades, and in certain cases even longer.”
The most extreme case may be Daniel Boone. The explorer and politician was born in Pennsylvania in 1734, lived for many years in Kentucky, and died in 1820 in Missouri, where he was buried. Within a couple of decades, the Kentucky legislature decided its favorite son should be brought home. Boone himself might not have liked that idea. Given certain unhappy experiences in Kentucky involving real estate, he never set foot in the state again after 1799.
And in any case, Missouri officials wanted him to stay where he had been planted.
Things got emotional, and then they got weird. With the approval of members of Boone’s family, his remains were excavated and transported to Kentucky to be reburied with great public celebration. But first the skull was, in the words of one witness, “handled by the persons present and its peculiarities commented upon.” The new resting place was good for Kentucky's tourism. It also boosted the sale of lots at the cemetery where the body was interred. But meanwhile, people in Missouri grew increasingly annoyed.
As one editorialist put it in 1888, the great man’s grave was “desecrated to gratify a spasm of Kentucky pride.” A rumor circulated that Boone’s relatives had deliberately guided Kentucky’s patriots to the wrong grave, meaning that the real body was still in Missouri. The argument raged on for decades, and it still does. Search “Daniel Boone gravesite” in 2010 and you still have your choice of locations.
The image of people standing around commenting on the shape of Daniel Boone’s skull may sound comic or disgusting, but it was not such an unusual thing to do, back then. Perhaps the most disconcerting aspect of Kammen’s book is its reminder of how much has changed about our attitudes towards close contact between the living and the dead.
One 19th-century man of the cloth pointed out, Kammen says, “that the desire to scrutinize bodily decay – he called it a ‘morbid desire’ – was especially prevalent among women; some even wanted to descend into tombs, lift the coffin lid, and ‘gaze upon the mouldering bones’ of their parent or child.” No doubt there was some perfectly understandable reason why they felt this urge. Even so, I just don’t want to think about it.
And then there is this description of an exhumation in 1875, reported in a Baltimore newspaper:
“The laborers employed to perform the task, upon digging to the depth of about five feet, discovered the coffin in a good state of preservation, after having lain in place nearly 26 years. The lid was removed, and the remains curiously examined by the few present. There, before their gaze, was extended the skeleton, almost in perfect condition, and lying with the long bony hands reposing one upon the other, as they had been arranged in death. The skull bore marks of greater decay, the teeth from the upper jaw having become dislodged, but those in the lower were all in place, and some little hair was still clinging near the forehead....”
Here the guest of honor was Edgar Allan Poe. At least his peaceful condition shows he was not buried alive.
Kammen’s anthology of “translations” is, in effect, an account of what historians are always doing – digging into the past, moving the remains to a new location, engraving a new memorial. For that matter, it brings to mind the old joke about dissertation writing as a process of transporting bones from one coffin to another.
But it is also a reminder of the final context of all human activity, scholarly and otherwise. In the words of Thomas Paine, writing in 1777: “However men may differ in their ideas of grandeur or government here, the grave is nevertheless a perfect republic.” In the long run, we all end up naturalized.
The new titles that arrive from publishers each week usually come with promotional material that, apart from remembering to recycle, I carefully ignore. But over the past week -- thanks to an eagle-eyed colleague -- I have been making up for this practiced neglect by lingering over one publicist's letter in particular.
It is remarkable. It may be the most striking and provocative bit of prose concerning a scholarly book to have circulated in some while. The passage in question runs to one paragraph appearing about two-thirds of the way down the page of a note accompanying the page proofs for 1877: America’s Year of Living Violently by Michael A. Bellesiles, to be published by the New Press in August. Here it is:
“A major new work of popular history, 1877 is also notable as the comeback book for a celebrated U.S. historian. Michael Bellesiles is perhaps most famous as the target of an infamous ‘swiftboating’ campaign by the National Rifle Association, following the publication of his Bancroft Prize-winning book Arming America (Knopf, 2000) -- ‘the best kind of non-fiction,’ according to the Chicago Tribune -- which made daring claims about gun ownership in early America. In what became the history profession’s most talked-about and notorious case of the past generation, Arming America was eventually discredited after an unprecedented and controversial review called into question its sources, charges which Bellesiles and his many prominent supporters have always rejected.”
These sentences have absorbed and rewarded my attention for days on end. They are a masterpiece of evasion. The paragraph is, in its way, quite impressive. Every word of it is misleading, including “and” and “the.”
Bellesiles has a certain claim to fame, certainly, but not as “the target of an infamous ‘swiftboating’ campaign.” He is, and will be forever remembered as, a historian whose colleagues found him to have violated his profession's standards of scholarly integrity. Arming America won the Bancroft Prize -- the highest honor for a book on American history. But far more salient is the fact that the Bancroft committee took the unprecedented step of withdrawing the prize.
It is true that he drew the ire of the National Rifle Association, and I have no inclination to give that organization's well-funded demagogy the benefit of any doubt. But gun nuts did not force Bellesiles to do sloppy research or to falsify sources. That his scholarship was grossly incompetent on many points is not a "controversial" notion. Nor is it open to dispute whether or not he falsified sources. That has been exhaustively documented by his peers. To pretend otherwise is itself demagogic.
If a major commercial press wants to help a disgraced figure make his comeback, that is one thing, but rewriting history is another. The New Press has published many excellent books by important authors. It is out of respect for that record that I want to invite it to make a public apology for violating the trust its readers have in it.
The saga of Michael Bellesiles (pronounced "buh-LEELS" or "buh-LAYELS," depending on who you ask) was at its height in 2001 and came to a resolution (or so one thought) the following year, when Bellesiles resigned from his position as professor of history at Emory University. As the case was unfolding, I followed it rather closely, but until seeing the New Press statement last week had managed to forget it almost entirely.
This was not just a matter of midlife memory loss. The affair was embarrassing and disgraceful, and it left Bellesiles in a position where he had little left that anyone would recognize as dignity. If you regard Charlton Heston as a role model for political activism, maybe the whole thing seems like a glorious chapter in recent history. For anyone else, to forget the whole thing was a mercy.
Matters began with an article Bellesiles published in The Journal of American History in 1996. He claimed that his research among probate records suggested a very low rate of individual gun ownership in colonial America -- and indeed well into the 19th century. What Bellesiles called a “gun culture” only really developed in the wake of the Civil War, he argued, when mass production of firearms made them more affordable.
Expanding on his thesis in Arming America, the author presented a new way of looking at the early days of the country. Firearms had been scarce and expensive, and were not found in most households. Hunting mostly involved using traps, rather than shooting. What guns were commonly available were usually old and in bad shape. The men who took up arms for their country during the American Revolution mostly got them from depots. And those citizen-soldiers didn't shoot very well, for not many of them were accustomed to handling guns -- since, again, guns were expensive and scarce.
Bellesiles cited many and diverse sources for all of these claims, but the most impressive aspect of his work -- the part he mentioned in interviews, and the part that professional historians and journalistic reviewers alike always stressed -- was the statistical evidence from his examination of probate records.
Now, people who care about no other part of the Constitution so much as the Second Amendment were incensed by Bellesiles's counternarrative of early history, which is hardly surprising. Besides conducting themselves in the usual polemical manner WHICH OFTEN INVOLVES WRITING LIKE THIS, they started to examine his notes and sources very, very closely. That is not surprising, either. Who else would have the incentive?
But the gun nuts were not the only people who had problems with Bellesiles’s work. Arming America received many favorable reviews in major journals of opinion, but fellow historians had been expressing reservations about the probate data ever since that article had appeared in the JAH a few years earlier. For one thing, there were questions about how Bellesiles had gathered his information, and where; and about whether he was counting things correctly. He treated wills as if they were a completely reliable inventory of someone's property, even though experts on probate records know better, and even though he cited some of those scholars in his own notes.
The statistical claims in particular were a problem. Scholars would later try -- and fail -- to duplicate the results Bellesiles reported from his number-crunching. At first, it was possible to shrug this off as evidence that he was clumsy with the calculator. But things were not that simple. The figures on Bellesiles’s statistical tables were the tip of the iceberg.
People following up his notes kept finding problems: inaccurate quotations, mischaracterized sources, failure to include evidence that ran contrary to his thesis, and so on. At first, it was easy to dismiss the complaints because they had a screed-like quality. But qualified scholars who looked into the matter came away shaking their heads. A symposium on Arming America appeared in the William and Mary Quarterly in early 2002, followed not much later by James Lindgren’s review-essay in The Yale Law Journal.
At the request of Emory University, three prominent historians, assisted by graduate students, examined the evidence about Bellesiles’s work. In particular, they looked at his claims concerning what probate and militia records showed about gun ownership in early America -- and, in what proved even more of a problem, at how he accounted for the discrepancies between what he claimed and what the archival records actually showed. The resulting “Report of the Investigative Committee in the Matter of Professor Michael Bellesiles,” released in October 2002, was devastating.
“We have interviewed Professor Bellesiles,” the committee reported, “and found him both cooperative and respectful of this process. Yet the best that can be said of his work with the probate and militia records is that he is guilty of unprofessional and misleading work. Every aspect of his work in the probate records is deeply flawed.... Subsequent to the allegations of research misconduct, his responses have been prolix, confusing, evasive, and occasionally contradictory. We are surprised and troubled that Bellesiles has not availed himself of the opportunities he has had since the notice of this investigation to examine, identify, and share his remaining research materials.”
While acknowledging that "unfamiliarity with quantitative methods or plain incompetence" possibly accounted for some of the deficiencies in Bellesiles's statistical data, the committee found that he was also in violation of the standards of scholarly integrity as defined by the American Historical Association, which (to quote its report) "includes ‘an awareness of one’s own bias and a readiness to follow sound methods and analysis wherever they may lead,’ ‘disclosure of all significant qualifications of one’s arguments,’ careful documentation of findings and the responsibility to ‘thereafter be prepared to make available to others their sources, evidence, and data,’ and the injunction that ‘historians must not misrepresent evidence or the sources of evidence.’ ”
Bellesiles was culpable on all points. “In fact,” the report noted, “Professor Bellesiles told the committee that because of criticism from other scholars, he himself had begun to doubt the quality of his probate research well before he published it in the Journal of American History.”
So much for the myth of a scholar whose greatest crime was making “daring claims” that left him vulnerable to "swiftboating." Michael Bellesiles's greatest enemy was never the NRA. It was Michael Bellesiles.
Just after reading the promotional letter accompanying Bellesiles's new book, I contacted the New Press to find out more about this campaign to rehabilitate him. The publicist offered to provide me with a copy of the chapter on Arming America from Jon Wiener’s book Historians in Trouble: Plagiarism, Fraud, and Politics in the Ivory Tower, published by the New Press in 2005.
As it happened, I had already seen the chapter, and have ended up going over it a couple of times over the past week while reading other material on l'affaire Bellesiles. Wiener portrays his subject as the victim of a witch hunt -- suggesting that his errors were few in number, limited in significance for his argument, and finally of a rather unremarkable sort. They were the result of being sloppy about record-keeping and venturing too far out of his depth in the cliometrics department. To be human is to make mistakes. Besides, everybody forgets about those parts of Arming America where there weren’t any problems.
This seems generous to a fault. Anyone trying to form an assessment of the affair needs to read the Emory report -- keeping in mind that the committee ignored numerous problems with claims and evidence. Indeed, a fairly useful pedagogical tool for students in history, law, and journalism would be a casebook on Arming America, including documentation from Bellesiles's various attempts to explain himself and the evidence that made his efforts more difficult.
In any case, I finally got in touch with Marc Favreau, his editor at the New Press, to ask whether any sort of due diligence had been practiced with Bellesiles's new book, considering the author's reputation. He responded that he was "well-versed" in the scholarly disputes over Arming America and referred me to Wiener's book. "What we are concerned with, then and now," he told me, "is the extent to which the fury around Michael’s thesis was stoked by a virulent pro-gun movement."
Now, this is hardly satisfactory. A thing may be true even if Charlton Heston said it. But in any case, Favreau insisted that Bellesiles's new book 1877 will be uncontroversial as to both argument and methodology. "In our initial conversations with him we were impressed by his knowledge of and passion for his subject matter," he said. "Although trade publishers rarely, if ever, solicit peer reviews (unlike university presses), we've nevertheless been very pleased to receive wonderful advance quotes from a number of prominent scholars and historians."
It also seemed appropriate to get in touch with Bellesiles himself. He is currently an adjunct in history at Central Connecticut State University. I wrote to him to ask what he hoped the book would accomplish, given that his return to public life must necessarily include questions about his credibility. Had he taken any particular steps that would inspire confidence in someone who was acquainted with his colleagues' findings about Arming America?
"I rest my credibility on the basic standards of scholarship," he responded by e-mail, "and have done what every reputable historian does, and exactly what I did in Arming America: I cite my sources."
At this point while reading his note, I found my eyes turning away from the screen in embarrassment. Eight years ago, when reputable historians found Bellesiles's work to lack scholarly integrity, none of them claimed he had failed to cite sources. Anyone can cite sources. The pro-gun arguments of John Lott are decked out with the apparatus of scholarship, but that doesn't mean his statistical claims aren't dubious.
In any case, Bellesiles has made himself "familiar with modern technology, computer databases, and all aspects of our digital world," he told me. "All my notes to 1877 are digitized and thoroughly backed up in a number of different formats."
All things considered, this is only just so reassuring.
In October, the U.S. Department of Labor announced a fine of more than $87.4 million on BP North America Inc. for "failure to correct [the] potential hazards faced by employees” that had been uncovered by the Occupational Safety and Health Administration. This set an all-time record for penalties set by OSHA on any company -- dwarfing the previous one, from 2005, of a mere $21 million, imposed after an explosion at a BP refinery killed 15 people and injured 170 others.
Since last fall, BP has gone on to bigger things. A tone of moral indignation has been heard lately (on Capitol Hill, for instance) regarding those OSHA violations. But why the outrage? It’s just business. As long as risk to the company's workers can be translated into a calculable expense, decisions will be made on a rational basis. With an eye on the bottom line, the company can decide whether or not to install adequate equipment to protect either workers or the environment.
Or not to protect them, as the case may be. Profit is profit, and the ocean has no lawyer. Let's not pretend otherwise.
Of course, events might have unfolded very differently if the people working on the offshore rig had decided to shut production down when the company pushed them (once again) to cut corners and ignore danger signs. Every time I see a picture from the Gulf of Mexico, I wonder about that. But when politicians or people in the mass media discuss the situation, work stoppage by BP's employees is one possibility that never comes up.
The very idea seems almost unthinkable. It is easier to get mad at how flagrantly BP ignored safety violations than to imagine labor acting outside the established framework of government regulation and corporate decision making. Maybe BP can afford this failure of the imagination -- but I doubt the planet can, at least not forever.
So it’s a good time to have a new edition of Irving Bernstein’s two studies The Lean Years (1960) and The Turbulent Years (1969). Originally published by Houghton Mifflin, they have just been reissued in paperback by Haymarket Books and offer, between them, a classic survey of how American workers fared during the 1920s and ‘30s. SPOILER ALERT: They tended to do best when they had the confidence and the willingness to challenge their employers -- and not just over wages. Bernstein, who at the time of his death in 2001 was an emeritus professor of political science at the University of California at Los Angeles, makes clear that control over working conditions was usually also at stake.
What set Bernstein's work apart from the usual run of scholarship on American labor history at mid-century was his strong interest in the life and activity of non-unionized people -- including those working in agriculture, or leaving it behind for new kinds of employment, in the case of African-Americans leaving the South. And Bernstein wrote with grace. He had a knack for the thumbnail biography of ordinary people: There are numerous miniature portraits embedded in the epic. He was sensitive to the changes in mood among workers as they faced the boom of the 1920s (which passed most of them by) and the agony of the Depression (which hit them hardest). In many cases, they blamed themselves for their misery. The possibility of joining forces with others to change anything took a while to sink in.
The new paperback editions come with introductions by Frances Fox Piven, a professor of sociology and political science at the City University of New York Graduate Center, who draws out Bernstein's argument on this point: "The train of developments that connects changes in social conditions to a changed consciousness is not simple. People ... harbor somewhere in their memories the building blocks of different and contradictory interpretations of what it is that is happening to them, of who should be blamed, and what can be done about it. Even the hangdog and ashamed unemployed worker who swings his lunch box and strides down the street so the neighbors will think he is going to a job can also have other ideas that only have to be evoked, and when they are, make it possible for him on another day to rally with others and rise up in anger at his condition."
Quoting that passage gives me pause -- for Piven, a former president of the American Sociological Association, has in recent months been the focus of intricate theories about how Barack Obama was using ACORN to impose martial law on gated communities. Or perhaps ACORN was using Barack Obama to that end. I must admit some difficulty in reading the pertinent diagram. But in short, she has been involved in some quite nefarious activity, such as encouraging poor people to vote.
No doubt this will make Piven's endorsement of Irving Bernstein's two books seem particularly worrying. Only someone in the Tea Party (a well-funded movement organized by professional lobbyists) is supposed to "rally with others and rise up in anger at his condition" -- not an unemployed person who wants work and decent health care. Furthermore, protesters ought to direct their rage strictly at the government, and never at private enterprise.
I suppose the late Irving Bernstein will end up as a box in the big flow chart of cyclothymic, pseudopopulist political discourse. It seems like a matter of time. But if you read his books, something eventually becomes clear. He thought the New Deal had saved capitalism and made it more fair. He was not fond of the Communists, who expected the Depression would work to their advantage. Before writing his labor histories, Bernstein specialized in collective bargaining. (Aside from publishing books on the subject, he served in arbitration disputes.) The Turbulent Years is dedicated to Clark Kerr -- the president of the University of California system and a major target of the radical student movement in the 1960s.
In short, when Bernstein wrote with sympathy about the strikes and street fighting of the 1930s, it was not out of an instinctive combativeness but from a sense that people do these things because they have been left no choice by "an unbalanced society" (to borrow an expression he used to describe the United States on the eve of the crash of 1929). If his book sounds almost revolutionary now, that is a sign that the ordinary frame of reference for political judgment has skewed so far to the right that reality is standing sideways.
I contacted Frances Fox Piven to ask her opinion of this assessment.
"Bernstein definitely thought of himself as a centrist, but a reformer," she told me. "He was quite contemptuous, for example, of ideologues on the Left in the 1930s. But he was never contemptuous of workers themselves, and his respect and empathy for workers forced him to pay attention, even respectful attention, to the strikes and sit-downs and demonstrations they undertook during the 1930s. One of the consequences of the rise of a turbulent and aggressive labor movement was to open up normal politics, to move the political culture to the Left. The Civil Rights movement had a similar consequence thirty years later. It is chastening to observe that in the absence of mass movements from the bottom (and the Tea Party is not a movement from the bottom) that our politics reverts to a kind of default position in which business interest groups have outsized influence."
If a sufficiently "turbulent and aggressive" spirit had prevailed among the people working for BP just a couple of months ago, there might not now be one hundred thousand barrels of crude oil (by the company's own estimate) surging into the ocean every day -- with no end in sight.
Certain research topics seem destined to inspire the question, “Seriously, you study that?” So it is with the field of Twitter scholarship. Which -- just to get this out of the way -- is not actually published in 140 characters or less. (The average “tweet” is the equivalent of two fairly terse sentences. It is like haiku, only more self-involved.)
The Library of Congress announced in April that it was acquiring the complete digital archives of the “microblogging” service, beginning with the very first tweet, from ancient times. At present, the Twitter archive consists of 5 terabytes of data. If all of the printed holdings of the LC were digitized, they would come to 10 to 20 terabytes (this figure does not include manuscripts, photographs, films, or audio recordings).
Some 50 million new messages are sent on Twitter each day, although one recent discussion at the LC suggested that the rate is much higher -- at least when the site is not shutting down from sheer traffic volume, which seems to be happening a lot lately. A new video on YouTube shows a few seconds from the "garden hose" of incoming Twitter content.
When word of this acquisition was posted to the Library of Congress news blog two months ago, it elicited comment by people who could not believe that anything so casual and hyper-ephemeral as the average tweet was worth preserving for posterity – let alone analyzing. Thanks to the Twitter archive, historians will know that someone ate a sandwich. Why would they care?
Other citizens became agitated at the thought that “private” communications posted to Twitter were being stored and made available to a vast public. Which really does seem rather unclear on the concept. I’m as prone to dire mutterings about the panopticon as anybody -- but come on, folks. The era of digital media reinforces the basic principle that privacy is at least in part a matter of impulse control. Keeping something to yourself is not compatible with posting it to a public forum. Evidently this is not as obvious as it should be. Things you send directly to friends on Twitter won't be part of the Library's holdings, but if you celebrated a hook-up by announcing it to all and sundry, it now belongs to the ages.
A working group of librarians is figuring out how to “process” this material (to adapt the lingo we used when I worked as an archival technician in the Library's manuscript division) before making the collection available to researchers. But it’s not as if scholars have been waiting around until the collection is ready. Public response to the notion of “Twitter studies” might be incredulous, but the existing literature gives you some idea of what can be done with this giant pulsing mass of random discursive particles.
A reading of the scholarship suggests that individual tweets, as such, are not the focus of very much attention. I suppose the presidential papers of Barack Obama will one day include an annotated edition of postings to his Twitter feed. But that is the exception and not the rule.
Instead, the research, so far, tends to fall into two broad categories. One body focuses on the properties of Twitter as a medium. (Or, what amounts to a variation on the same thing, as one part of an emerging new-media ecosystem.) The other approach involves analyzing gigantic masses of Twitter data to find evidence concerning public opinion or mood.
Before giving a thumbnail account of some of this work – which, as the bibliography I’ve consulted suggests, seems intrinsically interdisciplinary – it may be worth pointing out something mildly paradoxical: the very qualities that make Twitter seem unworthy of study are precisely what render it potentially quite interesting. The spontaneity and impulsiveness of expression it encourages, and the fact that millions of people use it to communicate in ways that often blur the distinction between public and private space, mean that Twitter has generated an almost real-time documentary record of ordinary existence over the past four years.
There may be some value to developing tools for understanding ordinary existence. It is, after all, where we spend most of our time.
Twitter shares properties found in numerous other new-media formats. The term “information stream” is sometimes used to characterize digital communication, of whatever sort. Inside Higher Ed “flows” at the rate of a certain number of articles per day during the workweek. An online scholarly journal, by contrast, will probably trickle. A television network’s website -- or the more manic sort of Twitter feed -- will tend to gush. But the “streaming” principle is the same in any case, and you never step into the same river twice.
A recent paper by Mor Naaman and others from the School of Communication and Information at Rutgers University uses a significant variation on this concept, the “social awareness stream,” to label Twitter and Facebook, among other formats. Social awareness streams, according to Naaman et al., “are typified by three factors distinguishing them from other communication: a) the public (or personal-public) nature of the communication and conversation; b) the brevity of posted content; and, c) a highly connected social space, where most of the information consumption is enabled and driven by articulated online contact networks.”
Understanding those “articulated online contact networks” involves, for one thing, mapping them. And such mapping efforts have been underway since well before Twitter came on the scene. What makes the Twitter “stream” particularly interesting is that – unlike Facebook and other social-network services -- the design of the service permits both reciprocal connections (person A “follows” person B, and vice versa) and one-sided (A follows B, but that’s it). This makes for both strong and weak communicative bonds within networks -- but also among them. And various conventions have emerged to allow Twitter users to signal one another or to urge attention to a particular topic or comment. Besides “retweeting” someone’s message, you can address a particular person (using the @ symbol, like so: @JohnDoe) or index a message by topic (noted with the hashtag, thusly: #topicdujour).
All of this is, of course, familiar enough to anyone who uses Twitter. But it has important implications for just what kind of communication system Twitter fosters. To quote the title of an impressive paper by Haewoon Kwak and three other researchers from the department of computer science at the Korea Advanced Institute of Science and Technology: “What is Twitter, a Social Network or a News Media?” (No sense protesting that “media” is not a singular noun. Best to grind one’s teeth quietly.)
Analyzing almost 42 million user profiles and 106 million tweets, Kwak and colleagues find that Twitter occupies a strange niche that combines elements of both mass media and homophilous social groups. (Homophily is defined as the tendency of people to sustain more contact with those they judge to be similar to themselves than with those whom they perceive to be dissimilar.) "Twitter shows a low level of reciprocity," they write. "77.9 percent of user pairs with any link between them are connected one-way, and only 22.1 percent have reciprocal relationships between them.... Previous studies have reported much higher reciprocity on other social networking services: 68 percent on Flickr and 84 percent on Yahoo."
In part, this reflects the presence on Twitter of already established mass-media outlets – not to mention already-famous people who have millions of “followers” without reciprocating. But the researchers find that a system of informal but efficient “retweet trees” also functions “as communication channels of information diffusion.” Interest in a given Twitter post can rapidly spread across otherwise disconnected social networks. Kwak’s team found that any retweeted item would “reach an average of 1,000 users no matter what the number of followers is of the original tweet. Once retweeted, a tweet gets retweeted [again] almost instantly on the second, third, and fourth hops away from the source, signifying fast diffusion of information after the first retweet.”
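The reciprocity figures quoted above come down to a simple count: of all user pairs linked by at least one follow, what share follow each other both ways? Here is a minimal sketch of that calculation, using an invented toy edge list rather than the Kwak team's actual data or code:

```python
# Sketch of the reciprocity measurement described above -- not the
# Kwak et al. code. The follow list below is a made-up toy example.

def reciprocity(edges):
    """Fraction of linked user pairs whose follow relation is mutual."""
    edge_set = {e for e in edges if e[0] != e[1]}   # ignore self-follows
    pairs = {tuple(sorted(e)) for e in edge_set}    # unordered linked pairs
    mutual = sum(1 for a, b in pairs
                 if (a, b) in edge_set and (b, a) in edge_set)
    return mutual / len(pairs)

follows = [("A", "B"), ("B", "A"),  # a reciprocal pair
           ("A", "C"),              # one-way
           ("D", "A")]              # one-way
print(reciprocity(follows))  # 1 of 3 linked pairs is mutual
```

Run over 42 million profiles rather than four, a count of this kind is what yields the 22.1 percent reciprocal share the researchers report.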
Eventually someone will synthesize these and other analyses of Twitter’s functioning -- along with studies of other institutional and mass-media networks -- and give us some way to understand this post-McLuhanesque cultural system. In the meantime, research is being done on how to use the constant landslide of Twitter messages to gauge public attitudes and mood.
As Brendan O’Connor and his co-authors from Carnegie Mellon University note in a paper published last month, the usual method of conducting a public-opinion poll by telephone can cost tens of thousands of dollars. (Besides, lots of us hang up immediately on the suspicion that it will turn into a telemarketing call.)
Using one billion Twitter messages from 2008 and ’09 as a database, O’Connor and colleagues ran searches for keywords related to politics and the economy, then generated a “sentiment score” based on lists of 1,600 “positive” and 1,200 “negative” words. They then compared these “text sentiment” findings to the results of more traditional public opinion polls concerning consumer confidence, the election of 2008, and the new president’s approval ratings. They found sufficiently strong correlation to be encouraging -- and noted that by the summer of 2009, when many more people were on Twitter than had been the case in 2008, the text-sentiment results proved a good predictor of consumer confidence levels.
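The word-list method lends itself to a compact illustration. Below is a toy sketch of a “text sentiment” score of the kind described above -- tallying positive and negative lexicon hits in messages that mention a topic keyword. The word lists and tweets here are invented stand-ins for the study's much larger lexicons and its billion-message corpus:

```python
# Toy sketch of the word-list "sentiment score" approach described
# above -- not the CMU code. Lexicons and messages are invented.

POSITIVE = {"good", "confident", "win", "hope"}
NEGATIVE = {"bad", "fear", "lose", "worry"}

def sentiment_score(messages, keyword):
    """Ratio of positive to negative lexicon hits in on-topic messages."""
    pos = neg = 0
    for msg in messages:
        words = msg.lower().split()
        if keyword in words:                     # keep only on-topic messages
            pos += sum(w in POSITIVE for w in words)
            neg += sum(w in NEGATIVE for w in words)
    return pos / neg if neg else float("inf")

tweets = ["Feeling good about the economy today",
          "So much fear in the economy right now",
          "I hope the economy turns around"]
print(sentiment_score(tweets, "economy"))  # 2 positive hits / 1 negative = 2.0
```

Tracked day by day, the rise and fall of a ratio like this is what gets compared against the traditional polling numbers.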
A different methodology was used in “Modeling Public Mood and Emotion: Twitter Sentiment and Socio-Economic Phenomena” by Johan Bollen of Indiana University and two other authors. They collected all public tweets from August 1 to December 20, 2008, and harvested from them content data that could be plugged into “a well-established psychometric instrument, the Profile of Mood States,” which “measures six individual dimensions of mood, namely Tension, Depression, Anger, Vigor, Fatigue, and Confusion.” This sounds like something from one of Woody Allen’s better movies.
The data crunching yielded “a six dimensional mood vector” covering the months in question. Which, as luck would have it, coincided with both the financial meltdown and the presidential election of 2008. The resulting graphs are intriguing.
Following the election, the negative moods (Tension, Depression, etc.) fell off. There was “a significant spike in Vigor.” Examination of samples of Twitter traffic showed “a preponderance of tweets expressing high levels of energy and positive sentiments.”
But by December 2008, as the Dow Jones Industrial Average fell below 9,000 points, the charts show a conspicuous rise in Anger -- and an even stronger one for Depression. The researchers write that this may have been an early signal of “what appears to be a populist movement in opposition to the new Obama administration.”
“Tweets may be regarded,” write Bollen and colleagues, “as microscopic instantiations of mood.” And they speculate that the microblogging system may do more than reflect shifts of public temper: “The social network of Twitter may highly affect the dynamics of public sentiment…[O]ur results are suggestive of escalating bursts of mood activity, suggesting that sentiment spreads across network ties.”
As good a reason as any to put this archive of the everyday into the time capsule. And while my perspective on this may be a little off-center, I think it is fair that the Twitter record should be stored at the Library of Congress, which also houses the papers of the American presidents up through Theodore Roosevelt.
Almost 20 years ago, I started to work there just around the corner from the bound volumes containing, among other things, the diaries of George Washington. The experience of taking a quick look at them was something like a rite of passage for people working in the manuscript division. And to judge by later conversations among colleagues, the experience was usually slightly bewildering.
You would open the volume and gaze at the very page where his hand had guided the quill. You would start to read, expecting deep thoughts, or historical-seeming ones, at any rate. And this, more or less, is what you found on every page:
“Rained today. Three goats died. Need to buy a new plow.”
He had another 85 characters to spare.
P.S. Follow me on Twitter here, and keep up with news on scholarly publishing here.
Call it a revival, of sorts. In recent years, anyone interested in contemporary European philosophy has noticed a tendency variously called the religious or theological "turn" (adapting a formulation previously used to describe the "linguistic turn" of the 1960s and '70s). Thinkers have revisited scriptural texts, for example, or traced the logic of seemingly secular concepts, such as political sovereignty, back to their moorings in theology. The list of figures involved would include Emmanuel Levinas, Jacques Derrida, Gianni Vattimo, Alain Badiou, Giorgio Agamben, Slavoj Žižek, and Jürgen Habermas -- to give a list no longer or more heterogeneous than that.
A sampling of recent work done in the wake of this turn can be found in After the Postsecular and the Postmodern: New Essays in Continental Philosophy of Religion, a collection just issued by Cambridge Scholars Publishing. One of the editors, Anthony Paul Smith, is a Ph.D. candidate at the University of Nottingham and also a research fellow at the Institute for Nature and Culture at DePaul University. The other, Daniel Whistler, is a tutor at the University of Oxford, where he just submitted a dissertation on F.W.J. Schelling's theology of language. I interviewed them about their book by e-mail. A transcript of the discussion follows.
Q: Let’s start with one word in your title -- "postsecular." What do you mean by this? People used to spend an awful lot of energy trying to determine just when modernity ended and postmodernity began. Does “postsecularity” imply any periodization?
APS: In the book we talk about the postsecular event, an obvious nod to the philosophy of Alain Badiou. For a long time in Europe and through its colonial activities our frame of discourse, the way we understood the relationship of politics and religion, was determined by the notion that there is a split between public politics and private religion. This frame of reference broke down. We can locate that break, for the sake of simplicity, in the anti-colonial struggles of the latter half of the 20th century. The most famous example is, of course, the initial thrust of the Iranian Revolution.
It took some time before the implications of this were thought through, and it is difficult to pin down when “postsecularity” came to prominence in the academy, but in the 1990s a number of Christian theologians like John Milbank and Stanley Hauerwas, along with non-Christian thinkers like Talal Asad, began to question the typical assumption of philosophy of religion: that religious traditions and religious discourses need to be mediated through a neutral secular discourse in order to make sense. Their critique was simple: the secular is not neutral. Philosophy is intrinsically biased towards the secular. If you follow people like Asad and Tomoko Masuzawa, this means it is biased toward a Christian conception of the secular, and this hinders it from appreciating the thought structures at work in particular religions.
One of the reasons the title of the book reads, “after the postsecular” is that we felt philosophy of religion had yet to take the postsecular event seriously enough; it was ignoring the intellectual importance of this political event and still clinging to old paradigms for philosophizing about religion, when they had in fact been put into question by the above critique. So, the question is: What does philosophy of religion do now, after the postsecular critique?
DJW: There are two other reasons we speak of this volume being situated after the postsecular. First, in our “Introduction” we distinguish between a genuine postsecular critique of the kind Anthony mentions and a problematic theological appropriation of this critique. The former results in a pluralization of discourses about religion, because the secular is no longer the overarching master-narrative, but one more particular tradition. The latter, however, has tried to replace the secular master-narrative with a Christian one, and so has perversely impeded this process of pluralization.
Yet it is precisely this theological move (exemplified by Radical Orthodoxy) which is more often than not associated with the postsecular. Thus, one of the aims of the volume is to move beyond (hence, “after”) this theological appropriation of the postsecular.
Second, we also conjecture in the Introduction that postsecularity has ended up throwing the baby out with the bathwater – that is, everything from the secular tradition, even what is still valuable. So, in Part One of the volume, especially, the contributors return to the modern, secular tradition to test what is of value in it and what can be reappropriated for contemporary philosophy of religion. In this sense, "after the postsecular" means a mediated return to the secular.
Q: You mentioned Radical Orthodoxy, of which the leader is John Milbank. His rereading of the history of European philosophy and social theory tries to claim a central place for Christian theology as "queen of the sciences." As an agnostic, I tend to think of this as sort of the intellectual equivalent of the Society for Creative Anachronism. But clearly it's been an agenda-setting program in some sectors of theology and philosophy of religion. In counterposing your notion of the postsecular to Radical Orthodoxy, are you implying that the latter is exhausted? Or does that mean that Radical Orthodoxy is still a force to be reckoned with?
APS: On the one hand Radical Orthodoxy, as a particular movement or tendency, is probably exhausted in terms of the creativity and energy that attracted a lot of younger scholars who were working mostly in Christian theology but also in Continental philosophy of religion.
In a way, those of us in this field know what Radical Orthodoxy is now -- whereas before its anachronism seemed to be opening genuinely interesting lines of intellectual inquiry, perhaps encouraging interesting changes in the structure of institutional religious life. Now its major figures have aligned themselves with the thought of the current Pope in his attempt at “Re-Christianizing Europe,” with its nefarious narrative of a Christian Europe needing to be defended against Islam and secularism. They are also aligned with the policies of the present-day UK Tory Party via Phillip Blond and his trendy ResPublica think-tank.
So, on the other hand, while its creative power is probably on the wane, it is still something that must be reckoned with -- precisely because of this newfound institutional power, and because we know that its research program ends in old answers to new questions. We have to move beyond mere criticism, though, to offering a better positive understanding of religion, philosophy, and politics, and this volume begins to do that. This means going far beyond addressing Radical Orthodoxy as such, though, to addressing the reactionary and obfuscatory form of thought that lies beneath Radical Orthodoxy and persists in other thinkers who don’t identify with this particular movement.
DJW: Yes, it is something broader that troubles continental philosophy of religion now – not merely Radical Orthodoxy as such, but what we try to articulate in our Introduction as the more general tendency to theologize philosophy of religion. Many philosophers of religion – even when they see themselves as opponents of Radical Orthodoxy – ultimately treat their discipline as an extension of theology. It is quite normal to attend a keynote lecture at a Continental philosophy of religion conference and end up listening to a theology lecture! This is the reason that questions concerning the specificity of philosophy of religion (what sets it structurally apart from theology) dominate After the Postsecular and the Postmodern. Such questions are not meant solely as attacks on Radical Orthodoxy, but aim to interrogate the whole zeitgeist in which Radical Orthodoxy participates.
Q: I'm struck by how your book reflects a revival of interest in certain thinkers -- Schelling, Bergson, Rosenzweig. Or rather, perhaps, their transformation from the focus of more or less historical interest to inspiration for contemporary speculation. How much of this is a matter of following in the footsteps of Deleuze or Žižek?
DJW: Deleuze and Žižek are exemplary figures for many of the contributors to this volume. We philosophize in their shadow – and, you’re right, in particular it is their perverse readings of Bergson, Schelling, etc., which have taught us how to relate to the history of philosophy in new, heterodox ways.
“Experiment” is one of the key words in After the Postsecular and the Postmodern: all of us who contributed wanted to see what new potential could be opened up within philosophy of religion by mutating its traditions and canons through the lens of contemporary speculation. Having said that, I think both terms of your distinction (“inspiration for contemporary speculation” and “historical interest”) are important at the present moment.
Ignorance of the history of philosophy of religion is the academic norm, and our wager is that through straightforward history of philosophy one can excavate resources that have been neglected, so as to begin to see the discipline afresh. It is a matter of revitalizing our sense of what philosophy of religion can do. Therefore, while mutating the history of philosophy is crucial, so too is understanding what that history is. So little has been written about Bergson or Rosenzweig’s contributions in this regard that a relatively straight-laced understanding of them is one of the volume’s most pressing tasks.
APS: In France, at the time Deleuze was studying and writing his first books, the study of philosophy was dominated by the “three H’s” (Hegel, Husserl, and Heidegger). He followed a different path in his own work, writing important studies on Hume, Bergson, and Nietzsche (amongst others). With the rise in Deleuze’s popularity, these choices of figures have taken on the character of a canon, but at the time they were considered quite heretical and bold.
While the historical canon for mainstream Anglophone philosophy of religion tends to focus on Locke, Hume, and Kant, we hope our volume helps to establish an alternative canon that draws on more speculative thinkers from the modern tradition, like Spinoza, Schelling, and Bergson. We think that not only will this help us to address the persistent questions of philosophy of religion but will allow us to reframe those very questions.
Q: The names of a few contributors are familiar to me from reading An und für sich and other blogs. Would you say something about how the sort of "floating seminar space" of online conversation shapes the emergence of a project like this one?
APS: Many people have noted the democratic nature of blogging, which can disrupt the usual hierarchies in the academic world. While that can lead to intensely antagonistic encounters -- especially in the early days when we were all still navigating this new social space -- it can also lead to incredible intellectual friendships. I started blogging when I was 19 in the hopes of being part of an intellectual community that I didn’t have at university. This lack of a community was partly because I was a commuter student traveling four hours round trip per day, which didn’t leave a lot of time to participate face-to-face, and partly because my own interests in religion were not shared by most of the other students in my philosophy department.
The group blogs I have been a part of, first The Weblog and then An und für sich, attracted people in similar situations -- people who existed in a liminal space between philosophy, theory, theology, and religious studies and wanted to discuss these issues, but for whatever reason couldn’t do so in their immediate communities.
I think it is safe to say that without the blogging community the volume wouldn’t have existed. It was because of the blog that Daniel first contacted me about participating in the postgraduate conference in philosophy of religion that he had set up in Oxford and it was this conference that ultimately led to the volume. We have tried to transfer the democratic spirit of blogging to the volume, so while we do have contributions from established academics in the volume, we also have included a number of graduate students, intellectuals outside the academy, and those still searching for a tenured position (if there are any!).
Even though we don’t have a “big name” like Žižek or Vattimo in the volume, we have still been able to attract interest simply on the strength of the ideas in the book, which are talked about on AUFS and other blogs. The volume has even made its way onto a syllabus already! John Caputo, formerly professor of philosophy at Villanova and now professor of religion and humanities at Syracuse, has his students reading the Editors’ Introduction for his graduate course called "The Future of Continental Philosophy of Religion," which we are really excited about.
Q: Sometimes the relationship of academic theological discourse to any creed or confession can be difficult to make out. With the philosophy of religion, obviously, such distance seems to be built right in. What are the stakes of your book – if any – for "people of faith," as the expression goes? That is, do you see this work as having consequences for what goes on at a church, synagogue, mosque, or whatever?
DJW: I tend to deploy a rather crude, form/content model on this issue: the material with which "people of faith," theologians, and philosophers of religion all deal is the same – "religion" in the broadest sense of the word. It is the operations of thought to which this material is subjected that differentiates them. What distinguishes philosophy of religion from theology or everyday religious practice is the specific kind of labor to which “religion” is here subjected. The question then becomes: Does "religion" after such transformations bear any resemblance to or (more importantly) have any relevance to the “religion” with which “people of faith” engage? And the answer is still very much open to dispute.
To take some examples: George Pattison (one of the contributors to the volume) is currently involved in a project on the phenomenology of religious life and it seems plausible that some form of this project could indeed be relevant to everyday religious practice – articulating its often implicit assumptions. On the other hand, I would be horrified if someone found a kernel of everyday relevance in my contribution on Schelling (in which I argue that names such as “Christ” or “Krishna” are literally the products of geological eruptions).
Personally (and here I am speaking very much for myself), I think there’s an element of smugness to the anti-“ivory tower” rhetoric that has emerged in the academy in the last century: the assertion that academics have something interesting or useful to say to the world imparts, in my mind, false value to what we say. In other words, I feel content to revel in the uselessness of my work.
APS: I love this answer! The militancy behind it stands against the pathetic “Theologian-Pope impulse” of so many theologians or the “Philosopher-King impulse” of so many philosophers that think the salvation for the world lies in our thought.
However, I want to nuance it somewhat, as I do think some of what lies behind what we do as academics, the reasons we take up this work, can participate in political struggles or help to deal with the very serious problems we face without our thought being directly “useful” in some crude practice of meeting targets or productivity goals. Spinoza wouldn’t have been much use as the ruler of the Netherlands, I’m sure, but when his ideas were taken up by others, and thereby mutated, they did have a real effect, and much of it positive.
The same goes for most of our great philosophers. But what Dan called the "uselessness" of our work in some sense mirrors the uselessness of religion in general. This character that religion has, identified by philosophers like Bataille, Nietzsche, and contemporaries like Goodchild, is in many ways offensive to the shape of contemporary life, where everything has its proper price, where we have to be thrifty and austere. Religion seems like a magnificent waste of time and money, unless of course it can be put to use convincing people to go to war to kill or be good little boys and girls and not harm their potential market value as workers with too much unclean living.
The same is true of this kind of academic work we do. It is useless within the parameters of contemporary society, but when contemporary society produces things like the poor and middle-class paying for massive bank bailouts and ecological disasters in the Gulf of Mexico and off the coast of Nigeria, then maybe uselessly thinking about things outside those parameters isn’t such a bad way to spend one’s life.
Q: As I've been reading your book, Republican leader Newt Gingrich and others have been arguing that the imposition of Sharia law in the United States is an urgent danger that must be fought. From one perspective, this looks like pure cynicism; the notion that it’s a real issue in American political life is laughable. But what do you make of it? How does it fit in any narrative of the postsecular condition, or any analysis of the strains and fault lines of secularity?
APS: Right, there is about as much danger of Sharia law being imposed as there is of French becoming the national language! This is an example of what we call in our introduction the “obscure postsecular” (again drawing on Badiou). Out of one side of their mouths these politicians tell us that we must defend our modern, secular values from the medieval barbarism of radical Islam, and out of the other side they are condemning secularists for not understanding the “power of religion.”
The power of this obscure postsecular, the reason it gets taken seriously, is that it latches on to a kernel of truth. Frankly, many in the public sphere don’t understand the power of religion! Hell, when it comes to Islam, many of them don’t even understand the basics, let alone that within Islam there is a cacophony of different spiritual practices and, as in most religions, an internal conflict between a law-bound Islam and an Islam of liberty. This is argued for very clearly by a number of French scholars of Islam, like Henry Corbin and Christian Jambet, though it doesn’t appear to be a lesson the ruling class has learned, going by the recent idiotic, racist, and completely unsecular headscarf ban in France.
So, this lack of knowledge is behind both Gingrich’s call to resist Sharia law and the ruling, which Gingrich referenced, from the New Jersey judge that a Muslim man could forcibly rape his wife because it was a religious custom; I know of a number of Islamic feminists who I’m sure would like to speak with Judge Edith Payne! With both Gingrich and Payne we have an obscuring of the postsecular: they both recognize that something has changed, but they call on some transcendent identity of Islam or America that obscures any real confrontation with that change. Notice that neither one of them recognizes that there are elements within Islam -- mainstream Islam! -- that reject honor killings, abuse of women, the murder of civilians, and the like.
The situation becomes even more obscure in the UK, where I currently live. While in the U.S. all our money declares “In God We Trust,” in the UK all money bears the image of the sovereign, Queen Elizabeth II. Surely this, a divine-right monarchy, is an example of the relic of medievalism that Gingrich mentions! Yet, on the other side of the bill, depending on the denomination, you will find Charles Darwin or Adam Smith -- the very figures who ushered in the forms of thought that our old narratives tell us swept away medieval superstition.
Now, to my mind this means that all our conventional narratives of secularization are inherently flawed. The classic liberal narrative of a neutral secular has been undone by the postsecular event. The liberal secular was a weapon used in the expansion of European imperialism, which tried to deny those in the colonial world resources from their varied religious traditions.
At the same time the anti-liberal narrative that secularity is to be rejected because of this complicity is also false. It has a similar political function, by creating and exacerbating divisions within a particular class but along imaginary or unimportant differences, playing into a myopic Clash of Civilizations theory that actually engenders the reality of that clash. The volume offers resources towards constructing a very different theory of the secular, of a postsecular secular, what we call a “generic secular” that goes some way towards superseding these flawed, conventional narratives.
Practically that means both a straightforward understanding of particular religions as they present themselves in their complexity, suppressing as much as possible the imperialist tendencies of the liberal secular, and deploying the same kind of bold internal, immanent critique of these particular religions that we find in the modern thinkers covered in the volume. The answer to these political problems may partially be found by experimenting with ideas from Islam and Christianity from the position of the generic secular.
There are certain bylines you tend to notice while scanning new magazines or newspapers. The list varies from person to person. But the habits of attention involved tend to be the same.
Usually the author is someone whose work you find informative, or stimulating, or otherwise agreeable (or some combination of these things). You tend to read the article immediately -- or postpone gratification until you’ve perused everything you must. Of course, things are not always so pleasant. The author may be your bête noire -- the very sight of his or her name provoking keen irritation. Which, to be sure, can involve its own pleasures.
Much of this speed-scan/instant notification is -- in my experience anyway -- involuntary, like a Pavlovian reflex. It would be possible to draw up a comprehensive checklist of authors whose bylines trigger my attention. But that would be after the fact. The "list" is unwritten and usually in flux. The whole process seems idiosyncratic and ad hoc. The brain knows what it wants, but isn't necessarily that rational or deliberate about it.
The historian Tony Judt, who died over the weekend, got entered into my registry not quite 20 years ago, when he started writing for The New York Review of Books and other publications. Some of his work was stimulating and some of it was annoying. His books on the European Left proved to be both. Judt was dismissive of questions and figures I thought were important, or else ignored them entirely. Reading Judt on Marxism involved a certain amount of intracranial yelling. As C.L.R. James once said about T.S. Eliot, he was someone I read in order to remind myself of what I do not think.
With Judt’s more recent writings on political topics (on the Middle East, the "strange death of liberal America,” and the prospects for a revitalized social democracy, for example), I noticed that other people were doing the complaining, in letters-to-the-editor columns and otherwise. This was gratifying, for Judt's conclusions were often similar to ones I'd come to by different means; and it was also instructive to see how he argued back against our shared opponents, especially since he did it more calmly than might be my wont.
Now it is the columnist’s privilege to express these things in utterly subjective terms. But just to be clear, no personal contact was involved. I never met Judt, nor made any effort to do so. The voice I was arguing with (or concurring with, as the case might be) was always the voice on the page.
This bears spelling out -- not because it was unusual, but because it is a completely normal relationship between author and reader, in many respects, at least for those of us whose sense of that relationship predates the Internet. It can be very intense, but it also possesses an element of distance, even of strict impersonality. Judt was someone with certain ideas who had published certain books and articles. His individuality began and ended there.
All of which changed late last year, when I saw the video.
If you’ve seen it, you probably know what I mean. If not, here it is.
So little sense of Judt as an individual did I have that even his accent came as a surprise -- to say nothing of the impact of seeing him in a wheelchair, with tubes running into his nose (“a quadriplegic with facial Tupperware,” as he put it), all of it necessary given the muscular degeneration caused by ALS, also called Lou Gehrig’s Disease.
Judt’s lecture -- sponsored last fall by the Remarque Institute, which he founded at New York University in 1995 -- has since been expanded into a book, Ill Fares the Land, published this spring by Penguin. In different circumstances my attention would go strictly to his political concerns. But that has become difficult, maybe impossible. His condition required Judt to compose the book in his head, then dictate it. Knowing that a text was written in prison always creates a sort of double consciousness in the reader. All the more so when the prison is the author’s body.
The grace and humor projected in that video naturally filtered into the tone of the voice that began to come from the page, especially as Judt began to write the series of essays, several of them amounting to a memoir, over the final months of his life. The first of them was a description -- calm and candid, but at times panic-inducing to read -- of what the days and nights were like under his changed circumstances. By that point, the relationship between author and public was no longer the same. Each essay might very well be his last. And seeing his name on the cover of The New York Review of Books now registered, not as part of my habitual, conditioned readerly expectations, but as a challenge that might as well be called “existential.”
Judt belonged to a generation that used that word a lot, maybe too much, but it feels like the one needed here. Faced not just with mortality but with conditions in which putting words on paper was getting ever more difficult, he made a decision that was anything but obvious: to remain committed to the public role of writer and thinker. Nobody would have blamed him for quitting. A choice was involved. He kept going.
My habit of scanning the names of contributors turned more deliberate, in Judt’s case. His name on the cover of The New York Review of Books was encouraging, because it meant he was staying the course. But also troubling, since it posed hard questions: In his circumstances, would you be able to keep on working? For how long? In the name of what values? Are you sure? You wouldn't give up, collapse like a black hole into despair? Again, you're sure about that?
Not that he was asking these questions, or even hinting at them. They amount to one reader’s confession of the thoughts raised by Judt’s example. At the same time, they are implicit in our common condition. How much of what we do is out of conviction, and how much out of the momentum of routine?
At this point, the habit of watching for Judt’s byline will soon give way. I'll still reread him every so often, of course -- in particular, his Reappraisals: Reflections on the Forgotten Twentieth Century, published a couple of years ago by Penguin. It may be the best place to start for anyone who has yet to make his acquaintance. His essays are also excellent models of writing by a scholar addressing a public both academic and non-.
Not to say that his arguments are always persuasive. Some of them are actually pretty irritating, in my opinion, but that's the way it should be.
Many of us committed to the liberal arts have been defensive for as long as we can remember.
We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”
We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: they create a well-rounded person, are good for democracy, and develop the life of the mind. All these are certainly true, but somehow each misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial outcome from the large investment that is now American higher education. That doesn’t make them anti-intellectual; it simply makes them realists. Outside of home ownership, a college degree might be the largest single purchase for many Americans.
There is a disconnect as parents and students worry about economic outcomes when too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment for us to feel defensive. It is time, however, that we in the liberal arts understand this new environment and, rather than merely react to it, proactively engage it. To many Americans the liberal arts are a luxury they feel they need to give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.
The liberal arts have always situated graduates on the road to success. More Fortune 500 CEOs have had liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know the road to research science most often comes through a liberal arts experience. Now more than ever, as employment patterns seem to be changing, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.
We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.
Sitting where we do, it is easy to dismiss this number because we assume it comes from day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing sector of this shift is in the white-collar world our students aspire to. This number has been steadily rising and is projected to continue its upward climb unabated. We are living in a world where 9-to-5 jobs are declining, careers with one company over a lifetime are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is much more unstable and fluid than the one of a mere generation ago.
We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job and even career to career during their lifetime. What we are seeing now, however, is different. More and more Americans are hustling from gig to gig, too. These workers, many our former students, may never know economic security, but they may know success. For many of the new-economy workers, success is measured by more than just money, as freedom, flexibility and creativity count too.
If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.
In 1956, William Whyte, the noted sociologist, wrote The Organization Man to name the developing shift in work for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep set of narrow vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.
Today, no such career ladder exists. And narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working developing before our eyes. Today, breadth, cultural knowledge and sensitivity, flexibility, the ability to continually learn, grow and reinvent, technical skills, as well as drive and passion, define the road to success. And liberal arts institutions should take note, because this is exactly what we do best.
For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.
It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But, we can make a better case about how a liberal arts education leads to both a meaningful life and a successful career.
In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes -- our alumni -- and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.
This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that is situating the discussion more squarely within the liberal arts orbit, and in the process blurring disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science and ethnography, as a way to understand the new economy shift. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research because increasingly we will need to teach it as the best way to ensure our students understand the moment.
We also need to be less territorial, and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, either by providing space, wifi, or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.
Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.
Richard A. Greenwald
Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.
When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)
One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, religion, and enter only those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.
So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?
The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- have increased from 115,400 conferred in 1970-71 to 335,250 conferred in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while the numbers are small, there has been exponential growth in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving” degrees. Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and dropped again in the last five years.
What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicator Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.
Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since then. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but then recovered nicely in the years since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicator Project leaves off and show that in 2008 and 2009, the number of students who graduated with bachelor's degrees in English, foreign language and literatures, history, and philosophy and religion remained at the same level.
What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.
Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.
Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced this past spring 17 graduating seniors, roughly twice what the number had been three decades ago. And the strength is not just in elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.
If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- while others less so -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent). The study of foreign languages, in other words, seems not to be disappearing; the mix is simply changing.
Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The mood was upbeat and innovative, the talks eloquent; no doom and gloom, even though many participants were from institutions where painful budget cuts had recently been made.
A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education, rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.
It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most were doing quite well, increasing enrollments and becoming more selective. And what about state universities and land grant institutions, where most students go to college? Were they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”
Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.
One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment draws in part from the vital role they play in providing students with a robust liberal education, that is, a kind of education that develops their knowledge in a range of disciplinary fields, and importantly, their cognitive skills and personal competencies. The liberal arts continue -- and likely will always -- give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.
But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, a point that students, the statistics and anecdotes show, agree with.
W. Robert Connor and Cheryl Ching
W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.
"Who knows but if men constructed their dwellings with their own hands, and provided food for themselves and families simply and honestly enough, the poetic faculty would be universally developed, as birds universally sing when they are so engaged?" So writes Henry David Thoreau in the first chapter of Walden, in the middle of a lengthy disquisition about the meaning of shelter in mid-19th century America. Using white pine from the shores of Walden Pond and lumber salvaged from an old shack, Thoreau stimulated his own poetic faculties by constructing his 10- by 15-foot dwelling at the outset of his famous sojourn.
With Thoreau’s exhortation and example firmly in mind and the blessing of the college administration, the department of environmental studies and sciences undertook the reconstruction of Thoreau’s cabin as our contribution to Ithaca College’s First Year Reading Initiative for 2010. The president had selected Walden as the text that would be sent to all incoming first-year students. Few books could serve as so stimulating a provocation in our hyper-mediated age, when it is harder than ever "to front the essential facts of life," when more people than ever seem to be living lives of quiet desperation. Reconstructing Thoreau’s cabin, therefore, not only resonated well with my department’s values, but would offer students an opportunity to, in Thoreau’s own vision of higher education, "not play life, or study it merely, while the community supports them at this expensive game, but earnestly live it from beginning to end." (Emphasis original.)
Over the course of the summer everyone we contacted about helping with the project was enthusiastic. The local timber framers who had the tools and expertise to lead the build, the salvager who would provide us with the wood, and the local re-use center where we would get the windows and which would help us with the de-nailing — all leaped at the chance to participate, in many cases offering their services free or at a steep discount. Students, faculty, alumni, and community members who learned about the project all expressed a desire, even a craving, to become involved, to be able to build with their own hands. Their answer to Thoreau’s question, "Shall we ever resign the pleasure of construction to the carpenter?" was loud and clear.
And so sketches were made. A crew of students and faculty spent a day and a half pulling hemlock boards and timbers from a collapsed 120-year-old barn. The campus site for the build was selected. We sent the hand-drawn sketches to an architect friend to be rendered as computer-designed drawings.
And that was the moment when the magic of creative possibility conjured by Thoreau dissipated in the reality of 21st-century America. We can't say we weren’t warned by Henry himself, who had observed even in the 1840s that human institutions often serve those who created them in unwelcome ways. Our well-meaning friend innocently inquired, "Are you sure you won’t need a building permit for this project?"
An educational project temporarily occupying a space for a year, a 150-square-foot cabin? Surely not.
But, alas, once even our innocent inquiries were made, the Town of Ithaca bureaucrats scampered into their iron cages and set about their regulatory duties — duties, it should be said, the people have charged them with. Unable to see how irrelevant modern building codes were for this project, the director of code enforcement immediately declared our plans as drawn were a menace to public health and safety. The entire thing was transformed from frustration to farce when he insisted that the cabin would need ... a sprinkler system.
At least as frustrating was the inability of the college’s own bureaucracy to either defend the principle that this project was not even subject to review (there were precedents for such an argument) or to advocate for an expedited process. Not without reason, the college administration was fearful of alienating the local government over a project that was a low priority compared to the massive building projects under way and anticipated. No matter how powerful the experience of reconstructing the cabin might be for a few hundred students, no matter that such a project conforms more closely to the vision of higher education I believe in (and Thoreau seems to have as well) than the new 130,000-square-foot athletics and events center, no one was willing to challenge the town’s misapplication of rules, at least not in time to make a difference.
And so the salvaged wood sits in a storage facility awaiting its transformation, awaiting its opportunity to transform. If the town issues a permit, the winter does not stretch too far into April (as it sometimes does in these parts), and it is possible to remobilize the reconstruction team in the spring, we may yet find a replica of Thoreau’s cabin standing on our campus. If it does get built it will be as much an emblem of how accurate Thoreau’s characterizations of our society were (and are) as a triumph of experiential learning.
Even if the project becomes another one of those good ideas that run afoul of the sclerotic bureaucracies that too often hamper creativity, however, those of us most closely involved have learned, as Thoreau had, that we often forge our own chains. Our experience has confirmed the essential truth that we are almost all conformists, bound by rules and conventions we seldom question and even more rarely challenge. We need rules; who would want to live in a modern society without the rule of law? But we need to consciously consider and reconsider both the rightness of a given rule and the proper application of it. Throughout Walden Thoreau — sometimes gently, sometimes stridently — admonishes us to defy convention and seek our own path, his way of considering and reconsidering the boundaries we set for ourselves. "How deep the ruts of tradition and conformity!" he lamented. He found the expression of original thought and belief "a phenomenon so rare that I would any day walk ten miles to observe it."
In his equally famous essay “Resistance to Civil Government” (now commonly called “Civil Disobedience”), Thoreau writes, “The mass of men serve the state... not as men mainly, but as machines... . In most cases there is no free exercise whatever of the judgment or of the moral sense; but they have put themselves on a level with wood and earth and stones; and wooden men can perhaps be manufactured that will serve the purpose as well.” Thoreau condemned servility in the face of state immorality on a grand scale — in his time, this immorality was slavery and the war of aggression against Mexico that was a product of the debate over slavery.
Yet he clearly also believed that our submissiveness in the face of injustice — or, in our case in the face of intractable bureaucracy — begins with the habits of mind we cultivate in our day-to-day activities. In a provocative passage from Walden on clothes Thoreau writes, “I am sure that there is greater anxiety, commonly, to have fashionable, or at least clean and unpatched clothes, than to have a sound conscience.” Conscience may, at times, require clean clothes (I doubt if Thurgood Marshall would have gotten very far in his legal career without them), and Thoreau himself counseled that a person should “maintain himself in whatever attitude he find himself through obedience to the laws of his being, which will never be one of opposition to a just government, if he should chance to meet such.” But when social acceptance becomes the guiding principle of one’s life, when we blindly follow the spoken and unspoken rules of our culture, the world becomes a blander, less just place.
I am in no way trying to raise our impeded attempt to reconstruct Thoreau’s cabin on a privileged college campus to the level of injustice embodied by apartheid or Jim Crow, to name but two of the oppressive systems defied by people operating under Thoreau’s influence (though I like to imagine Thoreau being summoned to the town office for code violations). But I do wonder what it says about our society when we adhere so assiduously to rules and permits for things like a humble cabin while at the same time multinational corporations operate with virtual impunity. Whether it is the oil catastrophe in the Gulf of Mexico, or the poisoning of millions of gallons of water through natural gas extraction by hydraulic fracturing in my part of the country, or factory farms in Iowa that have hens laying eggs over two-year-old fecal piles, the absence of meaningful rules and regulations has profoundly compromised human and ecosystem health on a staggering scale. And what of the carte blanche given investment banks, in which case the absence of oversight brought the entire global financial system to its knees? But the cabin must have its sprinkler system or public safety will be jeopardized!
We too often regulate the small things inflexibly, while ignoring the behaviors and habits of thought that pose genuine threats to our — and the organisms with which we share this planet’s — very survival. Or, worse, we allow corporations to buy themselves exemptions from oversight, either through the now-legalized bribery of massive campaign contributions or in less visible ways (for just one example, see the behavior of the Minerals Management Service under the Bush administration, behavior that directly contributed to the Deepwater Horizon disaster). The result is what can seem like the worst of all possible worlds: common folk feeling oppressed by regulations that seem omnipresent and inflexible while the wealthy and powerful can often get away with murder.
Despite his reputation as a curmudgeon, Thoreau finishes Walden on an optimistic note, most famously telling us “that if one advances confidently in the direction of his dreams ... he will meet with a success unexpected in common hours.” We tell our students some variant of this sentiment from the moment they arrive on campus until the last echo of the commencement speech. Our confidence may have faltered over the past few weeks as we advanced toward our modest little dream of reconstructing Henry’s cabin on campus. There remain innumerable bureaucratic hurdles to surmount before we can build the version of the cabin we envision — sans sprinkler system. Perhaps students will yet wield chisels, froes, handsaws, augers, and hammers, and in so doing develop their poetic faculties as they contemplate the meaning of the rough-hewn, handmade cabin they have built on a modern college campus.
Michael Smith teaches history and environmental studies at Ithaca College.
Next week, Crown Publishers will issue President George W. Bush’s memoir Decision Points, covering what the former president calls “eight of the most consequential years in American history,” which seems like a fair description. They were plenty consequential. To judge from the promotional video, Bush will plumb the depths of his insight that it is the role of a president to be “the decider.” Again, it’s hard to argue with his point -- though you have to wonder if he shouldn’t let his accumulated wisdom ripen and mellow for a while before serving it.
Princeton University Press has already beaten him into print with The Presidency of George W. Bush: A First Historical Assessment, edited by Julian E. Zelizer, who is a professor of history and public affairs at Princeton. The other 10 contributors are professors of history, international relations, law, and political science, and they cover the expected bases -- the “War on Terror,” the invasion of Iraq, social and economic policy, religion and race. It is a scholarly book, which means that it is bound to make everybody mad. People on the left get angry at remembering the Bush years, while those on the right grow indignant that anyone still wants to talk about them. So the notion that they were consequential is perhaps not totally uncontroversial after all.
The contributors make three points about the Bush administration’s place in the history of American conservatism that it may be timely to sum up, just now.
In the introduction, Zelizer writes that Bush’s administration “marked the culmination of the second stage in the history of modern conservatism.” The earlier period, running from the 1940s through the ‘70s, had been a time of building an effective movement out of ideological factions (fundamentalists, libertarians, and neoconservatives, among others) “none of which sat very comfortably alongside any other.” Following Reagan's victory in the 1980 election, “conservatives switched from being an oppositional force in national politics to struggling with the challenges of governance that came from holding power.”
This summer, Zelizer published a valuable review-essay on the recent historiography of the American right. It can be recommended to anyone who wants more depth than the following (admittedly schematic) remarks will manage.
(1) In the chapter called “How Conservatives Learned to Stop Worrying and Love Presidential Power,” Zelizer points to a tendency among earlier generations of American conservatives to be suspicious of the executive branch. He traces this back to polemics against FDR during the 1930s, when conservatives painted the New Deal as akin to Hitlerian dictatorship or Stalinist five-year planning. And he quotes the early neoconservative intellectual James Burnham saying, in 1959, that “the primacy of the legislature in the intent of the Constitution is plain on the face of that document.” A strong executive meant growing central power, while delegates to Congress had an incentive to protect local authority.
This sensibility changed in the course of the cold war, writes Zelizer, and particularly under the leadership of Nixon and Reagan. Distrust of executive power gave way to increasing conservative reliance on it. Concentrating executive authority in the hands of the president (rather than spreading it out among various agencies) would promote efficiency and coordinate decision-making -- so the argument went. But just as importantly, it would mean that a conservative president could curb the regulatory powers of the state.
The claims for executive authority intensified under the War on Terror -- yielding what Zelizer calls the Bush administration’s “defiant, if not downright hostile [attitude] about any kind of congressional restrictions whatsoever." But this was not just something that “the decider” decided. It reflected a decades-long reorientation in conservative ideology. "The Right cannot legitimately divorce itself from strong presidential power,” writes Zelizer. “[A]n expanding historical literature … is attempting to revise our knowledge about conservatism by demonstrating how conservatives have had a more complex and less adversarial relationship with the modern state than we previously assumed.”
(2) There was a time when manufacturing "stood atop the commanding heights of the U.S. political economy,” writes Nelson Lichtenstein -- a professor of history at the University of California at Santa Barbara -- in his chapter “Ideology and Interest on the Social Policy Home Front.” He identifies this epoch as running from 1860 until 1980. The Bush presidency belongs to the era of "retail supremacy," in which the employment trend is low-wage and high-turnover. As of 2008, there were five times as many jobs in the service sector as in “the ‘goods-producing’ industries that once constituted the core of the U.S. economy” such as agriculture, construction, and manufacture.
Free-market principles were a basic part of conservative ideology in both eras. But the beneficiaries have changed. Once upon a time, advocates of laissez-faire would sometimes find themselves accused of being mouthpieces of the National Association of Manufacturers, averse to trade regulation and price controls. But by the Bush years, that was a thing of the past. “As employers of many low-wage workers,” writes Lichtenstein, “most retailers favored the lightest possible regulatory hand, especially when it came to welfare-state mandates such as those covering employee health insurance, retirement pay, and health and safety issues.”
The gaps created by stagnating wages and shrinking benefits were plugged – for a while anyway – by "cheap imports, easy credit, an overpriced dollar, and an array of new financial products that widened the range of assets (mainly houses) that both homeowners and bankers could borrow against."
An older style of right-wing thought lauded the free market as a merciless field of combat -- a way to test the entrepreneur’s self-control and the manufacturer's commitment to increasing productivity. But the form of conservatism taking its place has freed itself from notions of delayed gratification or expanding domestic output. Wal-Mart capitalism in the Bush years claimed only to deliver the goods cheaply, no matter where they might come from.
(3) In the 1970s, conservatives liked to say that their ranks were filling up with “liberals who had been mugged by reality.” That phrase suggested that reality is one tough dude -- totally indifferent to anybody’s mere opinion.
It was quite another matter when a leading Bush administration official (unnamed but often assumed to be Karl Rove) told a reporter for The New York Times Magazine in 2004: “We’re an empire now, and when we act, we create our own reality.” Nor was this merely the judgment of a solitary solipsist. In his chapter "Creating Their Own Reality,” David Greenberg -- an associate professor of history, journalism, and media studies at Rutgers University -- maintains that “the Right under Bush found itself promoting a view of knowledge in which any political claim, no matter how objectively verifiable or falsifiable, was treated as simply one of two competing descriptions of reality, with power and ideology, not science or disciplined inquiry, as the arbiters.” (Or deciders, if you will.)
There was no reality, only interpretations of reality -- and the existence of weapons of mass destruction was a function of who controlled the narrative. Little surprise that there were jokes about the rise of conservative postmodernism during the ‘00s. If Fox denied that climate change was taking place, who had the right to insist otherwise? Not some elitist, anyway.
Greenberg traces the right's “forays into epistemological relativism” back to the influence of networks of right-leaning think tanks and journalists. He quotes a contributor to The Weekly Standard from 2003, on how the right had created “a cottage industry” for spin: “Criticize other people for not being objective. Be as subjective as you want. It’s a great little racket.” And going a step beyond what Greenberg describes, we see another development along the same reality-aversive lines: the growing importance of conservative political figures whose authority within the movement comes primarily, or even exclusively, from their status as mass-media celebrities.
The former president did not create any of these tendencies. He simply took them over as legacies from what has been, for 30 years now, the strongest and most disciplined force in American politics.
Several passages in The Presidency of George W. Bush were obviously written in the flush of assumptions that the election of 2008 was a major turning point in the country's history -- the point at which the conservative movement had not just lost any chance at constructing a "permanent Republican majority" but condemned itself to wander in the electoral wilderness for a long season. Well, nobody should expect historians to be prophets, or political scientists to be bookies.