Many of us committed to the liberal arts have been defensive for as long as we can remember.
We have all cringed when we have heard a version of the following joke: The graduate with a science degree asks, “Why does it work?”; the graduate with an engineering degree asks, “How does it work?”; the graduate with a liberal arts degree asks, “Do you want fries with that?”
We have responded to such mockery by proclaiming the value of the liberal arts in the abstract: it creates a well-rounded person, is good for democracy, and develops the life of the mind. All of these are certainly true, but somehow each misses the point that the joke drives home. Today’s college students and their families want to see a tangible financial outcome from the large investment that is now American higher education. That doesn’t make them anti-intellectual; it makes them realists. Outside of home ownership, a college degree might be the largest single purchase many Americans ever make.
There is a disconnect: parents and students worry about economic outcomes while too many of us talk about lofty ideals. More families are questioning both the sticker price of schools and the value of whole fields of study. It is natural in this environment for us to feel defensive. It is time, however, for those of us in the liberal arts to understand this new environment and, rather than merely react to it, to engage it proactively. To many Americans the liberal arts are a luxury they feel they must give up to make a living -- nice but impractical. We need to speak more concretely to the economic as well as the intellectual value of a liberal arts degree.
The liberal arts have always set graduates on the road to success. More Fortune 500 CEOs have held liberal arts B.A.s than professional degrees. The same is true of doctors and lawyers. And we know that the road to research science most often runs through a liberal arts experience. Now more than ever, as employment patterns seem to be changing, we need to engage the public on the value of a liberal arts degree in a more forceful and deliberate way.
We are witnessing an economic shift that may be every bit as profound as the shift from farm to factory. Today estimates are that over 25 percent of the American population is working as contingent labor -- freelancers, day laborers, consultants, micropreneurs.
Sitting where we do, it is easy to dismiss this number because we assume it describes day laborers and the working class, i.e., the non-college-educated. But just look at higher education's use of adjuncts and you see the trend. The fastest-growing segment of this contingent workforce is in the white-collar world our students aspire to. The number has been rising steadily and is projected to continue its climb unabated. We are living in a world where 9-to-5 jobs are declining, careers spent with one company over a lifetime are uncommon, and economic risk has shifted from large institutions to individuals. Our students will know a world that is far more unstable and fluid than the one of a mere generation ago.
We have known for many years that younger workers (i.e., recent college graduates) move from firm to firm, job to job and even career to career over their lifetimes. What we are seeing now, however, is different. For many Americans, working now means hustling from gig to gig. These workers, many of them our former students, may never know economic security, but they may know success. For many new-economy workers, success is measured by more than money: freedom, flexibility and creativity count, too.
If this is the new economy our students are going to inherit, we as college and university administrators, faculty and staff need to take stock of the programs we offer (curricular as well as extracurricular) to ensure that we serve our students' needs and set them on a successful course for the future. The skills they will need may be different from those of their predecessors. Colleges and universities with a true culture of assessment already are making the necessary strategic adjustments.
In 1956, William Whyte, the noted sociologist, wrote The Organization Man to name the shift in work then developing for that generation. Whyte recognized that white-collar workers traded independence for stability and security. What got them ahead in the then-new economy was the ability to fit in (socialization) and a deep but narrow set of vocational skills. Firms at the time developed career ladders, and successful junior executives who honed their skills and got along advanced up the food chain.
Today, no such career ladder exists, and narrow sets of skills may not be the ticket they once were. We are witnessing a new way of working develop before our eyes. Today the road to success is defined by breadth; cultural knowledge and sensitivity; flexibility; the ability to continually learn, grow and reinvent oneself; technical skills; and drive and passion. Liberal arts institutions should take note, because this is exactly what we do best.
For liberal arts educators, this economic shift creates a useful moment to step out of the shadows. We no longer need to be defensive because what we have to offer is now more visibly useful in the world. Many of the skills needed to survive and thrive in the new economy are exactly those a well-rounded liberal arts education has always provided: depth, breadth, knowledge in context and motion, and the search for deeper understanding.
It will not be easy to explain to future students and their parents that a liberal arts degree may not lead to a particular “job” per se, because jobs in the traditional sense are disappearing. But we can make a better case for how a liberal arts education leads to both a meaningful life and a successful career.
In this fluid world, arts and sciences graduates may have an advantage. They can seek out new opportunities and strike quickly. They are innovative and nimble. They think across platforms, understand society and culture, and see technology as a tool rather than an end in itself. In short, liberal arts graduates have the tools to make the best of the new economy. And, above all, we need to do a better job of identifying our successes -- our alumni -- and presenting them to the public. We need to ensure that the public knows a liberal arts degree is still, and always has been, a ticket to success.
This could be a moment for the rebirth of the liberal arts. For starters, we are witnessing exciting new research about the economy that situates the discussion more squarely within the liberal arts orbit, and in the process blurs disciplinary boundaries. These scholars are doing what the American studies scholar Andrew Ross has called “scholarly reporting,” a blend of investigative reporting, social science and ethnography, as a way to understand the new economy. Scholars such as the sociologists Dalton Conley and Sharon Zukin and the historian Bryant Simon offer new models of engaged scholarship that explain the cultural parameters of the new economy. We need to recognize and support this research, because increasingly we will need to teach it as the best way to ensure our students understand the moment.
We also need to be less territorial and recognize that the professional schools are not the enemy. They have a lot to offer our students. Strategic partnerships between professional schools and the arts and sciences enrich both and offer liberal arts students important professional opportunities long closed off to them. We also need to find ways to be good neighbors to the growing micropreneurial class, whether by providing space, wifi or interns. Some schools have created successful incubators, which can jump-start small businesses and give their students important ground-floor exposure to the emerging economy.
Today’s liberal arts graduates will need to function in an economy that is in some ways smaller. Most will work for small firms and many will simply work on their own. They will need to multitask as well as blend work and family. And, since there will be little budget or time for entry-level training, we need to ensure that all our students understand the basics of business even if they are in the arts. We also might consider preparing our graduates as if they were all going to become small business owners, because in a sense many of them are going to be micropreneurs.
Richard A. Greenwald
Richard A. Greenwald is dean of the Caspersen School of Graduate Studies, director of university partnerships, and professor of history at Drew University in Madison, N.J. His next book is entitled The Micropreneurial Age: The Permanent Freelancer and the New American (Work)Life.
When the economy goes down, one expects the liberal arts -- especially the humanities -- to wither, and laments about their death to go up. That’s no surprise since these fields have often defined themselves as unsullied by practical application. This notion provides little comfort to students -- and parents -- who are anxious about their post-college prospects; getting a good job -- in dire times, any job -- is of utmost importance. (According to CIRP’s 2009 Freshman Survey, 56.5 percent of students -- the highest since 1983 -- said that “graduates getting good jobs” was an important factor when choosing where to go to college.)
One expects students, then, to rush to courses and majors that promise plenty of entry-level jobs. Anticipating this, college administrators would cut back or eliminate programs that are not “employment friendly,” as well as those that generate little research revenue. Exit fields like classics, comparative literature, foreign languages and literatures, philosophy, and religion; enter those that are preprofessional in orientation. Colleges preserving a commitment to the liberal arts would see a decline in enrollment; in some cases, the institution itself would disappear.
So runs the widespread narrative of decline and fall. Everyone has an anecdote or two to support this story, but does it hold in general and can we learn something from a closer examination of the facts?
The National Center for Education Statistics reports that the number of bachelor's degrees in “employment friendly” fields has been on the rise since 1970. Undergraduate business degrees -- the go-to “employment friendly” major -- increased from 115,400 conferred in 1970-71 to 335,250 in 2007-08. In a parallel development, institutions graduated seven times more communications and journalism majors in 2007-08 than in 1970-71. And while the numbers are small, there has been exponential growth in degrees in “parks, recreation, leisure, and fitness studies,” “security and protective services,” and “transportation and materials moving.” Computer science, on the other hand, peaked in the mid-80s, dropped in the mid-90s, peaked again in the mid-2000s, and has dropped again in the last five years.
What has students’ turn to such degrees meant for the humanities and social sciences? A mapping of bachelor's degrees conferred in the humanities from 1966 to 2007 by the Humanities Indicator Project shows that the percentage of such majors was highest in the late 1960s (17-18 percent of all degrees conferred), low in the mid-1980s (6-7 percent), and more or less level since the early 1990s (8-9 percent). Trends, of course, vary from discipline to discipline.
Degrees awarded in English dropped from a high of 64,627 in 1970-71 to half that number in the early 1980s, before rising to 55,000 in the early 1990s and staying at that level since. The social sciences and history were hit with a similar decline in majors in the 1970s and 1980s, but they have recovered nicely since and now have more majors than they did in 1970. The numbers of foreign language, philosophy, religious studies, and area studies majors have been stable since 1970. IPEDS data pick up where the Humanities Indicator Project leaves off and show that in 2008 and 2009 the number of students who graduated with bachelor's degrees in English, foreign languages and literatures, history, and philosophy and religion remained at the same level.
What’s surprising about this bird’s-eye view of undergraduate education is not the increase in the number of majors in programs that should lead directly to a job after graduation, but that the number of degrees earned in the humanities and related fields has not been adversely affected by the financial troubles that have come and gone over the last two decades.
Of course, macro-level statistics reveal only part of the story. What do things look like at the ground level? How are departments faring? Course enrollments? Majors? Since the study of the Greek and Roman classics tends to be a bellwether for trends in the humanities and related fields (with departments that are small and often vulnerable), it seemed reasonable to ask Adam Blistein of the American Philological Association whether classics departments were being dropped at a significant number of places. “Not really” was his answer; while the classics major at Michigan State was cut, and a few other departments were in difficulty, there was no widespread damage to the field -- at least not yet.
Big declines in classics enrollments? Again, the answer seems to be, “Not really.” Many institutions report a steady gain in the number of majors over the past decade. Princeton’s classics department, for example, announced 17 graduating seniors this past spring, roughly twice the number of three decades ago. And the strength is not confined to elite institutions. Charles Pazdernik at Grand Valley State University in hard-hit Michigan reported that his department has 50+ majors on the books and strong enrollments in language courses.
If classics seems to be faring surprisingly well, what about the modern languages? There are dire reports about German and Russian, and the Romance languages seem increasingly to be programs in Spanish, with a little French and Italian tossed in. The Modern Language Association reported in fall 2006 -- well before the current downturn -- a 12.9 percent gain in language study since 2002. This translates into 180,557 more enrollments. Every language except Biblical Hebrew showed increases, some exponential -- Arabic (126.5 percent), Chinese (51 percent), and Korean (37.1 percent) -- while others less so -- French (2.2 percent), German (3.5 percent), and Russian (3.9 percent). (Back to the ancient world for a moment: Latin saw a 7.9 percent increase, and ancient Greek 12.1 percent). The study of foreign languages, in other words, seems not to be disappearing; the mix is simply changing.
Theoretical and ideological issues have troubled and fragmented literature departments in recent years, but a spring 2010 conference on literary studies at the National Humanities Center suggests that the field is enjoying a revitalization. The mood was upbeat and the discussion eloquent and innovative; there was no doom and gloom, even though many participants came from institutions where painful budget cuts had recently been made.
A similar mood was evident at the National Forum on the Future of Liberal Education, a gathering of some highly regarded assistant professors in the humanities and social sciences this past February. They were well aware that times were tough, the job market for Ph.D.s miserable, and tenure prospects uncertain. Yet their response was to get on with the work of strengthening liberal education rather than bemoan its decline and fall. Energy was high, and with it the conviction that the best way to move liberal education forward was to achieve demonstrable improvements in student learning.
It’s true that these young faculty members are from top-flight universities. What about smaller, less well-endowed institutions? Richard Ekman of the Council of Independent Colleges reports that while a few of the colleges in his consortium are indeed in trouble, most are doing quite well, increasing enrollments and becoming more selective. And what about state universities and land-grant institutions, where most students go to college? Are they scuttling the liberal arts and sciences because of fierce cutbacks? David Shulenburger of the Association of Public and Land-grant Universities says that while budget cuts have resulted in strategic “consolidation of programs and sometimes the elimination of low-enrollment majors,” he does not “know of any public universities weakening their liberal education requirements.”
Mark Twain once remarked that reports of his death were greatly exaggerated. The liberal arts disciplines, it seems, can say the same thing. The on-the-ground stories back up the statistics and reinforce the idea that the liberal arts are not dying, despite the soft job market and the recent recession. Majors are steady, enrollments are up in particular fields, and students -- and institutions -- aren’t turning their backs on disciplines that don’t have obvious utility for the workplace. The liberal arts seem to have a particular endurance and resilience, even when we expect them to decline and fall.
One could imagine any number of reasons why this is the case -- the inherent conservatism of colleges and universities is one -- but maybe something much more dynamic is at work. Perhaps the stamina of the liberal arts in today’s environment derives in part from the vital role they play in providing students with a robust liberal education, that is, a kind of education that develops their knowledge in a range of disciplinary fields and, importantly, their cognitive skills and personal competencies. The liberal arts continue -- and likely always will -- to give students an education that delves into the intricate language of Shakespeare or Woolf, or the complex historical details of the Peloponnesian War or the French Revolution. That is a given.
But what the liberal arts also provide is a rich site for students to think critically, to write analytically and expressively, to consider questions of moral and ethical importance (as well as those of meaning and value), and to construct a framework for understanding the infinite complexities and uncertainties of human life. This is, as many have argued before, a powerful form of education, a point with which, the statistics and anecdotes show, students agree.
W. Robert Connor and Cheryl Ching
W. Robert Connor is the former president of the Teagle Foundation, to which he is now a senior adviser. Cheryl Ching is a program officer at Teagle.
"Who knows but if men constructed their dwellings with their own hands, and provided food for themselves and families simply and honestly enough, the poetic faculty would be universally developed, as birds universally sing when they are so engaged?" So writes Henry David Thoreau in the first chapter of Walden, in the middle of a lengthy disquisition about the meaning of shelter in mid-19th century America. Using white pine from the shores of Walden Pond and lumber salvaged from an old shack, Thoreau stimulated his own poetic faculties by constructing his 10- by 15-foot dwelling at the outset of his famous sojourn.
With Thoreau’s exhortation and example firmly in mind and the blessing of the college administration, the department of environmental studies and sciences undertook the reconstruction of Thoreau’s cabin as our contribution to Ithaca College’s First Year Reading Initiative for 2010. The president had selected Walden as the text that would be sent to all incoming first-year students. Few books could serve as so stimulating a provocation in our hyper-mediated age, when it is harder than ever "to front the essential facts of life," when more people than ever seem to be living lives of quiet desperation. Reconstructing Thoreau’s cabin, therefore, not only resonated well with my department’s values, but would offer students an opportunity to, in Thoreau’s own vision of higher education, "not play life, or study it merely, while the community supports them at this expensive game, but earnestly live it from beginning to end." (Emphasis original.)
Over the course of the summer everyone we contacted about helping with the project was enthusiastic. The local timber framers who had the tools and expertise to lead the build, the salvager who would provide us with the wood, and the local re-use center where we would get the windows and which would help us with the de-nailing — all leaped at the chance to participate, in many cases offering their services free or at a steep discount. Students, faculty, alumni, and community members who learned about the project all expressed a desire, even a craving, to become involved, to be able to build with their own hands. Their answer to Thoreau’s question, "Shall we ever resign the pleasure of construction to the carpenter?" was loud and clear.
And so sketches were made. A crew of students and faculty spent a day and a half pulling hemlock boards and timbers from a collapsed 120-year-old barn. The campus site for the build was selected. We sent the hand-drawn sketches to an architect friend to be rendered as computer-designed drawings.
And that was the moment when the magic of creative possibility conjured by Thoreau dissipated in the reality of 21st-century America. We can't say we weren’t warned by Henry himself, who had observed even in the 1840s that human institutions often serve those who created them in unwelcome ways. Our well-meaning friend innocently inquired, "Are you sure you won’t need a building permit for this project?"
An educational project temporarily occupying a space for a year, a 150-square-foot cabin? Surely not.
But, alas, once even our innocent inquiries were made, the Town of Ithaca bureaucrats scampered into their iron cages and set about their regulatory duties — duties, it should be said, the people have charged them with. Unable to see how irrelevant modern building codes were for this project, the director of code enforcement immediately declared our plans as drawn were a menace to public health and safety. The entire thing was transformed from frustration to farce when he insisted that the cabin would need ... a sprinkler system.
At least as frustrating was the inability of the college’s own bureaucracy either to defend the principle that this project was not even subject to review (there were precedents for such an argument) or to advocate for an expedited process. Not without reason, the college administration was fearful of alienating the local government over a project that was a low priority compared to the massive building projects under way and anticipated. No matter how powerful the experience of reconstructing the cabin might be for a few hundred students, no matter that such a project conforms more closely to the vision of higher education I believe in (and that Thoreau seems to have shared) than the new 130,000-square-foot athletics and events center, no one was willing to challenge the town’s misapplication of rules, at least not in time to make a difference.
And so the salvaged wood sits in a storage facility awaiting its transformation, awaiting its opportunity to transform. If the town issues a permit, the winter does not stretch too far into April (as it sometimes does in these parts), and it is possible to remobilize the reconstruction team in the spring, we may yet find a replica of Thoreau’s cabin standing on our campus. If it does get built it will be as much an emblem of how accurate Thoreau’s characterizations of our society were (and are) as a triumph of experiential learning.
Even if the project becomes another one of those good ideas that run afoul of the sclerotic bureaucracies that too often hamper creativity, however, those of us most closely involved have learned, as Thoreau had, that we often forge our own chains. Our experience has confirmed the essential truth that we are almost all conformists, bound by rules and conventions we seldom question and even more rarely challenge. We need rules; who would want to live in a modern society without the rule of law? But we need to consciously consider and reconsider both the rightness of a given rule and the proper application of it. Throughout Walden Thoreau — sometimes gently, sometimes stridently — admonishes us to defy convention and seek our own path, his way of considering and reconsidering the boundaries we set for ourselves. "How deep the ruts of tradition and conformity!" he lamented. He found the expression of original thought and belief "a phenomenon so rare that I would any day walk ten miles to observe it."
In his equally famous essay “Resistance to Civil Government” (now commonly called “Civil Disobedience”), Thoreau writes, “The mass of men serve the state... not as men mainly, but as machines... . In most cases there is no free exercise whatever of the judgment or of the moral sense; but they have put themselves on a level with wood and earth and stones; and wooden men can perhaps be manufactured that will serve the purpose as well.” Thoreau condemned servility in the face of state immorality on a grand scale — in his time, this immorality was slavery and the war of aggression against Mexico that was a product of the debate over slavery.
Yet he clearly also believed that our submissiveness in the face of injustice — or, in our case in the face of intractable bureaucracy — begins with the habits of mind we cultivate in our day-to-day activities. In a provocative passage from Walden on clothes Thoreau writes, “I am sure that there is greater anxiety, commonly, to have fashionable, or at least clean and unpatched clothes, than to have a sound conscience.” Conscience may, at times, require clean clothes (I doubt if Thurgood Marshall would have gotten very far in his legal career without them), and Thoreau himself counseled that a person should “maintain himself in whatever attitude he find himself through obedience to the laws of his being, which will never be one of opposition to a just government, if he should chance to meet such.” But when social acceptance becomes the guiding principle of one’s life, when we blindly follow the spoken and unspoken rules of our culture, the world becomes a blander, less just place.
I am in no way trying to raise our impeded attempt to reconstruct Thoreau’s cabin on a privileged college campus to the level of injustice embodied by apartheid or Jim Crow, to name but two of the oppressive systems defied by people operating under Thoreau’s influence (though I like to imagine Thoreau being summoned to the town office for code violations). But I do wonder what it says about our society when we adhere so assiduously to rules and permits for things like a humble cabin while at the same time multinational corporations operate with virtual impunity. Whether it is the oil catastrophe in the Gulf of Mexico, or the poisoning of millions of gallons of water through natural gas extraction by hydraulic fracturing in my part of the country, or factory farms in Iowa that have hens laying eggs over two-year-old fecal piles, the absence of meaningful rules and regulations has profoundly compromised human and ecosystem health on a staggering scale. And what of the carte blanche given investment banks, where the absence of oversight brought the entire global financial system to its knees? But the cabin must have its sprinkler system or public safety will be jeopardized!
We too often regulate the small things inflexibly, while ignoring the behaviors and habits of thought that pose genuine threats to our — and the organisms with which we share this planet’s — very survival. Or, worse, we allow corporations to buy themselves exemptions from oversight, either through the now-legalized bribery of massive campaign contributions or in less visible ways (for just one example, see the behavior of the Minerals Management Service under the Bush administration, behavior that directly contributed to the Deepwater Horizon disaster). The result is what can seem like the worst of all possible worlds: common folk feeling oppressed by regulations that seem omnipresent and inflexible while the wealthy and powerful can often get away with murder.
Despite his reputation as a curmudgeon, Thoreau finishes Walden on an optimistic note, most famously telling us "that if one advances confidently in the direction of his dreams ... he will meet with a success unexpected in common hours." We tell our students some variant of this sentiment from the moment they arrive on campus until the last echo of the commencement speech. Our confidence may have faltered over the past few weeks as we advanced toward our modest little dream of reconstructing Henry’s cabin on campus. There remain innumerable bureaucratic hurdles to surmount before we can build the version of the cabin we envision — sans sprinkler system. Perhaps students will yet wield chisels, froes, handsaws, augers, and hammers, and in so doing develop their poetic faculties as they contemplate the meaning of the rough-hewn, handmade cabin they have built on a modern college campus.
Michael Smith teaches history and environmental studies at Ithaca College.
Next week, Crown Publishers will issue President George W. Bush’s memoir Decision Points, covering what the former president calls “eight of the most consequential years in American history,” which seems like a fair description. They were plenty consequential. To judge from the promotional video, Bush will plumb the depths of his insight that it is the role of a president to be “the decider.” Again, it’s hard to argue with his point -- though you have to wonder if he shouldn’t let his accumulated wisdom ripen and mellow for a while before serving it.
Princeton University Press has already beat him into print with The Presidency of George W. Bush: A First Historical Assessment, edited by Julian E. Zelizer, who is a professor of history and public affairs at Princeton. The other 10 contributors are professors of history, international relations, law, and political science, and they cover the expected bases -- the “War on Terror,” the invasion of Iraq, social and economic policy, religion and race. It is a scholarly book, which means that it is bound to make everybody mad. People on the left get angry at remembering the Bush years, while those on the right grow indignant that anyone still wants to talk about them. So the notion that they were consequential is perhaps not totally uncontroversial after all.
The contributors make three points about the Bush administration’s place in the history of American conservatism that it may be timely to sum up, just now.
In the introduction, Zelizer writes that Bush’s administration “marked the culmination of the second stage in the history of modern conservatism.” The earlier period, running from the 1940s through the ‘70s, had been a time of building an effective movement out of ideological factions (fundamentalists, libertarians, and neoconservatives, among others) “none of which sat very comfortably alongside any other.” Following Reagan's victory in the 1980 election, “conservatives switched from being an oppositional force in national politics to struggling with the challenges of governance that came from holding power.”
This summer, Zelizer published a valuable review-essay on the recent historiography of the American right. It can be recommended to anyone who wants more depth than the following (admittedly schematic) remarks will manage.
(1) In the chapter called “How Conservatives Learned to Stop Worrying and Love Presidential Power,” Zelizer points to a tendency among earlier generations of American conservatives to be suspicious of the executive branch. He traces this back to polemics against FDR during the 1930s, when conservatives painted the New Deal as akin to Hitlerian dictatorship or Stalinist five-year planning. And he quotes the early neoconservative intellectual James Burnham saying, in 1959, that “the primacy of the legislature in the intent of the Constitution is plain on the face of that document.” A strong executive meant growing central power, while delegates to Congress had an incentive to protect local authority.
This sensibility changed in the course of the cold war, writes Zelizer, and particularly under the leadership of Nixon and Reagan. Distrust of executive power gave way to increasing conservative reliance on it. Concentrating executive authority in the hands of the president (rather than spreading it out among various agencies) would promote efficiency and coordinate decision-making -- so the argument went. But just as importantly, it would mean that a conservative president could curb the regulatory powers of the state.
The claims for executive authority intensified under the War on Terror -- yielding what Zelizer calls the Bush administration’s “defiant, if not downright hostile [attitude] about any kind of congressional restrictions whatsoever." But this was not just something that “the decider” decided. It reflected a decades-long reorientation in conservative ideology. "The Right cannot legitimately divorce itself from strong presidential power,” writes Zelizer. “[A]n expanding historical literature … is attempting to revise our knowledge about conservatism by demonstrating how conservatives have had a more complex and less adversarial relationship with the modern state than we previously assumed.”
(2) There was a time when manufacturing "stood atop the commanding heights of the U.S. political economy,” writes Nelson Lichtenstein -- a professor of history at the University of California at Santa Barbara -- in his chapter “Ideology and Interest on the Social Policy Home Front.” He identifies this epoch as running from 1860 until 1980. The Bush presidency belongs to the era of "retail supremacy," in which the employment trend is low-wage and high-turnover. As of 2008, there were five times as many jobs in the service sector as in “the ‘goods-producing’ industries that once constituted the core of the U.S. economy” such as agriculture, construction, and manufacture.
Free-market principles were a basic part of conservative ideology in both eras. But the beneficiaries have changed. Once upon a time, advocates of laissez-faire would sometimes find themselves accused of being mouthpieces of the National Association of Manufacturers, averse to trade regulation and price controls. But by the Bush years, that was a thing of the past. “As employers of many low-wage workers,” writes Lichtenstein, “most retailers favored the lightest possible regulatory hand, especially when it came to welfare-state mandates such as those covering employee health insurance, retirement pay, and health and safety issues.”
The gaps created by stagnating wages and shrinking benefits were plugged – for a while anyway – by "cheap imports, easy credit, an overpriced dollar, and an array of new financial products that widened the range of assets (mainly houses) that both homeowners and bankers could borrow against."
An older style of right-wing thought lauded the free market as a merciless field of combat -- a way to test the entrepreneur’s self-control and the manufacturer's commitment to increasing productivity. But the form of conservatism taking its place has freed itself from notions of delayed gratification or expanding domestic output. Wal-Mart capitalism in the Bush years claimed only to deliver the goods cheaply, no matter where they might come from.
(3) In the 1970s, conservatives liked to say that their ranks were filling up with “liberals who had been mugged by reality.” That phrase suggested that reality is one tough dude -- totally indifferent to anybody’s mere opinion.
It was quite another matter when a leading Bush administration official (unnamed but often assumed to be Karl Rove) told a reporter for The New York Times Magazine in 2004: “We’re an empire now, and when we act, we create our own reality.” Nor was this merely the judgment of a solitary solipsist. In his chapter "Creating Their Own Reality,” David Greenberg -- an associate professor of history, journalism, and media studies at Rutgers University -- maintains “the Right under Bush found itself promoting a view of knowledge in which any political claim, no matter how objectively verifiable or falsifiable, was treated as simply one of two competing descriptions of reality, with power and ideology, not science or disciplined inquiry, as the arbiters.” (Or deciders, if you will.)
There was no reality, only interpretations of reality -- and the existence of weapons of mass destruction was a function of who controlled the narrative. Little surprise that there were jokes about the rise of conservative postmodernism during the ‘00s. If Fox denied that climate change was taking place, who had the right to insist otherwise? Not some elitist, anyway.
Greenberg traces the right's “forays into epistemological relativism” back to the influence of networks of right-leaning think tanks and journalists. He quotes a contributor to The Weekly Standard from 2003, on how the right had created “a cottage industry” for spin: “Criticize other people for not being objective. Be as subjective as you want. It’s a great little racket.” And going a step beyond what Greenberg describes, we see another development along the same reality-averse lines: the growing importance of conservative political figures whose authority within the movement comes primarily, or even exclusively, from their status as mass-media celebrities.
The former president did not create any of these tendencies. He simply took them over as legacies from what has been, for 30 years now, the strongest and most disciplined force in American politics.
Several passages in The Presidency of George W. Bush were obviously written in the flush of assumptions that the election of 2008 was a major turning point in the country's history -- the point at which the conservative movement had not just lost any chance at constructing a "permanent Republican majority" but condemned itself to wander in the electoral wilderness for a long season. Well, nobody should expect historians to be prophets, or political scientists to be bookies.
Reflecting on the recent Humanities and Technology conference (THAT Camp) in San Francisco, what strikes me most is that digital humanities events consistently tip toward the logic-structured, digital side of things; they are less balanced out by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now. What is the missing contribution from the humanities?
I think this digital dominance revolves around two problems.
The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.
The showy and often very visual products built on such data and the ease with which information contained within them is intuitively understood appear, at first blush, to be a triumph of quantitative thinking. The pretty, animated graphs or fluid screen forms belie the fact that boring spreadsheets and databases contain the details. Humanities scholars, too, often recoil from the presumably shallow grasp of a subject that data visualization invites.
For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.
Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.
Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.
Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.
In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which is itself a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a worldwide conversation on just what the digital humanities is. As the folks at WordPress say, code is poetry.
I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC. Back then, I was willing to decipher code patiently only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. For avid readers of literature, the characters of a story come alive, laying bare our own feelings or moral inclinations in the process.
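For readers who never wrote such an assignment, here is a minimal, hypothetical sketch of the kind of exercise described above: a few lines of structured text that, handed to the right software, become something visual. Python's standard turtle module stands in for the BASIC of the anecdote, and the function name and parameters are illustrative only, not anything from the original homework.

import math
import random
import turtle

def burst(pen, x, y, rays=24, length=80):
    # Draw one "firework": straight rays radiating out from a single point.
    for i in range(rays):
        angle = 2 * math.pi * i / rays
        pen.penup()
        pen.goto(x, y)
        pen.pendown()
        pen.goto(x + length * math.cos(angle), y + length * math.sin(angle))

screen = turtle.Screen()
screen.title("Fireworks")

pen = turtle.Turtle()
pen.hideturtle()
pen.speed(0)  # draw as fast as possible

# Scatter a handful of bursts across the canvas.
for _ in range(5):
    burst(pen, random.randint(-150, 150), random.randint(-100, 150))

turtle.done()

The point is less the program than the experience it stands for: the reward for patient, line-by-line attention is the picture that appears at the end, the fireworks in the code.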
Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in a text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities can be the ability to invest a generation of "users" in the techniques, and the practiced, meticulous attention to detail, required to become a scholar.
Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.
Over the weekend -- while busily procrastinating here in the main library of the University of Texas at Austin -- I stumbled across something suitable for the Intellectual Affairs column running just before Thanksgiving. Inside Higher Ed has a growing international readership. Still, I hope it will not be too provincial to call attention to a long-forgotten essay called “In Praise of the Americans” by Stephen Leacock, a Canadian political scientist and economist who was also one of the best-known humorists of his day. The essay appeared during the Great Depression, as the final word in an anthology intended to explain the United States to European readers.
The volume in question, America as Americans See It, was published by the Literary Guild in 1932. It contains more than 40 essays by various eminent and near-eminent figures of that era, plus dozens of photographs and cartoons. The editor, Fred J. Ringel, says in the introduction that he intended to prepare a study of the national culture after he arrived in the United States. (From where, he doesn’t indicate, and this seems to be his one major publication.) But he gave up and decided to edit an anthology instead. Among the better-remembered contributors are W.E.B. Du Bois and Upton Sinclair. There is also an essay by one Clare Boothe Brokaw, an editor at Vanity Fair, on the rituals and pretenses of high society. This author would become somewhat better known when she changed surnames after marrying Henry Luce, and her observations would be recycled into more memorable form in a play called The Women.
Although Ringel doesn’t mention it, many readers of the time would have recalled a similar volume called Civilization in the United States, published in 1922. I'd guess it was not a total coincidence that America as Americans See It appeared on the earlier volume's tenth anniversary. The editor of the previous collection was Harold Stearns, who had also published a volume of his own writings called America and the Young Intellectuals (1921) that looks, with hindsight, like a sort of opening salvo in the culture wars. In Civilization in the United States, he joined forces with H.L. Mencken, Lewis Mumford, Van Wyck Brooks, and others to produce a landmark work of social criticism.
On the day it was published, Stearns boarded a ship to Paris – where, in celebration of having escaped Prohibition, he promptly got drunk, staying that way pretty much continuously for the next decade. Legend has it that when Stearns ended up sleeping in the gutter, one expatriate pointed him out to another and said, “There’s Civilization in the United States.” (In The Sun Also Rises, Hemingway bases a character on Stearns at his booziest.)
All of which forms part of the backdrop for America as Americans See It, and in particular for Stephen Leacock’s essay, which the editor retitled “A Neighbor Looks at America.” A short biographical headnote in the anthology notes that Leacock’s syndicated articles were reaching “as many as nine million readers in the United States and Canada alone,” while some of his work had worldwide circulation. His first volume, Elements of Political Science (1906), was for a long time the standard undergraduate textbook and was translated into 19 languages, including Chinese and Urdu. But he was best known for his humorous writings, which also fed the great demand for him as a lecturer. These extracurricular activities earned him about five times as much as his salary as head of the department of political economy at McGill University in Montreal.
When his first collection of anecdotes and satires appeared in 1910, it became an international best-seller and earned him comparisons to Mark Twain, who had just died. “Theirs were common gifts for broad burlesque, grotesque exaggeration, juxtaposition of irrelevant ideas, and casual shifting from one comic pose to another,” points out Leacock's biographer David M. Legate. I’d say there is also some resemblance to Robert Benchley and Ring Lardner.
Leacock wrote an enormous amount -- there were one or two collections of his essays every year until his death in 1944. Much of it doesn't hold up very well, after all this time. When you need a footnote to get a joke, it’s not really a joke any more; it is a fossil. But his observations on the civilization just south of Canada are another matter. Apart from a couple of topical references, they still apply after almost eighty years.
“Americans are a queer people,” he writes. “They can’t rest…. They rush up and down across their continent as tourists; they move about in great herds to conventions, they invade the wilderness, they flood the mountains, they keep the hotels full. But they can’t rest. The scenery rushes past them. They learn it, but they don’t see it. Battles and monuments are announced to them in a rubber neck bus… So they go on rushing until the Undertaker gathers them to the last convention.”
The same state of distracted haste prevails in the educational system and in publishing. Americans “have more schools,” Leacock writes, “and better schools, and spend more money on schools and colleges than all of Europe. They print more books in one year than the French print in ten. But they can’t read. They cover their country with 100,000 tons of Sunday newspapers every week. But they don’t read them. They’re too busy. They use them for fires and to make more paper with.” Today, of course, we publish everything digitally, then ignore it.
If they ever bothered to read anything, Americans would probably be unhappy. But we don’t. So (as the quintessential American phrase now goes) it’s all good: “All the world criticizes them and they don’t give a damn….Moralists cry over them, criminologists dissect them, writers shoot epigrams at them, prophets foretell the end of them, and they never move. Seventeen brilliant books analyze them every month; they don’t read them .… But that’s all right. The Americans don’t give a damn; don’t need to; never did need to. That is their salvation.”
That is the last word of his essay -- but also of America as Americans See It, which, like other such volumes, Leacock treats as a symptom of American overproduction, destined to meet American indifference. His notion of total indifference as a basis for salvation is, no doubt about it, ironic. But you can do worse than to run a political campaign on that basis.
Either way, the man clearly had our number. The more things change, the more they stay the same. So happy Thanksgiving.
Late last month, following a protest by House G.O.P. leader John Boehner and the Catholic League president William Donohue over its imagery of ants swarming over a crucifix, the National Portrait Gallery removed a video called “A Fire in My Belly” by the late David Wojnarowicz from an exhibition. (See this report in IHE.) Over the past week, the Museum of Contemporary Art in Los Angeles painted over a mural it had commissioned from an artist named Blu; the mural showed rows of coffins draped in dollar bills. MOCA explained that the work was “inappropriate” given its proximity to a VA hospital and a memorial to Japanese-American soldiers, but has invited the muralist to come back and try again.
All of this in the wake of last spring's furor over the cartoon series South Park’s satirical depiction of Muhammad (or rather, its flirtation with that depiction). I didn’t pay all that much attention to the controversy as it was occurring, since I was still getting angry e-mail messages from Hindus who objected to a scholarly book for its impiety towards their gods. It felt like I had absorbed enough indignation to last a good long time. But there’s always plenty more where that came from. People feel aggrieved even during the holiday season. Actually, just calling it “the holiday season” is bound to upset somebody.
So it may not make sense to use the word “timely” to describe Stefan Collini’s new book That’s Offensive! Criticism, Identity, Respect (published by Seagull and distributed by the University of Chicago Press). The topic seems perennial.
A professor of intellectual history and English literature at the University of Cambridge, Collini is also the author of Absent Minds: Intellectuals in Britain (2006) and Common Reading: Critics, Historians, Publics (2008), both from Oxford University Press. His latest volume is part of a new series, “Manifestos for the 21st Century,” published in association with the internationally renowned journal Index on Censorship. As with his other recent work, it takes as its starting point the question of how criticism functions within a society.
The word “criticism” has a double meaning. There is the ordinary-usage sense of it to mean “fault-finding,” which implies an offended response almost by definition. Less obviously tending to provoke anger and defensiveness is criticism as, in Collini’s words, “the general public activity of bringing some matter under reasoned or dispassionate scrutiny.” Someone may find it absurd or perverse that there are critics who think Milton made Satan the real hero of Paradise Lost, but I doubt this interpretation has made anyone really unhappy, at least within recent memory.
Alas, this distinction is not really so hard and fast, since even the most dispassionate criticism often involves “a broader analysis of the value or legitimacy of particular claims or practices.” And it is sometimes easier to distinguish this from fault-finding in theory than in practice. “Such analysis,” writes Collini, “will frequently be conducted in terms other than those which the proponents of a claim or the devotees of a practice are happy to accept as self-descriptions, and this divergence of descriptive languages then becomes a source of offense in itself.”
Not to accept a self-description implies that it is somehow inadequate, even self-delusional. This rarely goes over well. An artist showing coffins draped with dollar bills, rather than flags, is making a polemical point -- in ways that a scholar analyzing the psychosexual dimension of religious narratives probably isn’t. But offense will be taken either way.
Such conflicts are intense enough when the exchange is taking place within a given society. When questions about “the value or legitimacy of particular claims or practices” are posed across cultural divides, the possibilities for outrage multiply -- and the problem arises of whether critique amounts to an act of aggression.
Let me simply recommend Collini’s book, rather than try to synopsize his argument on that score. But it seems like a good antidote to both clash-of-civilizationists and identity-politicians.
“Criticism may be less valued or less freely practiced in some societies than in others,” he writes, “but it is not intrinsically or exclusively associated with one kind of society, in the way that, say, hamburgers or cricket are. And anyway, different ‘cultures’ are not tightly sealed, radically discontinuous entities: they are porous, overlapping, changing ways of life lived by people with capacities and inclinations that are remarkably similar to those we are familiar with. While there are various ways to show ‘respect’ for people some of whose beliefs differ from our own, exempting those beliefs from criticism is not one of them.”
As a corollary, this implies cultivating a willingness to listen to critiques of our own deeply embedded self-descriptions. No easy thing -- for "so natural to mankind," in the words of John Stuart Mill, "is intolerance to what it really cares about." Amen to that.
For this week’s column (the last one until the new year) I asked a number of interesting people what book they’d read in 2010 that left a big impression on them, or filled them with intellectual energy, or made them wish it were better known. If all three, then so much the better. I didn’t specify that it had to be a new book, nor was availability in English a requirement.
My correspondents were enthusiastic about expressing their enthusiasm. One of them was prepared to name 10 books – but that’s making a list, rather than a selection. I drew the line at two titles per person. Here are the results.
Lila Guterman is a senior editor at Chemical and Engineering News, the weekly magazine published by the American Chemical Society. She said it was easier to pick an outstanding title from 2010 than it might have been in previous years: “Not sleeping, thanks to a difficult pregnancy followed by a crazy newborn, makes it almost impossible for me to read!”
She named Rebecca Skloot’s The Immortal Life of Henrietta Lacks, published by Crown in February. She called it an “elegantly balanced account of a heartbreaking situation for one family that simultaneously became one of the most important tools of biology and medicine. It was a fast-paced read driven by an incredible amount of reporting: A really exemplary book about bioethics.”
Neil Jumonville, a professor of history at Florida State University, is editor of The New York Intellectual Reader (Routledge, 2007). A couple of collections of essays he recently read while conducting a graduate seminar on the history of liberal and conservative thought in the United States struck him as timely.
“The first is Gregory Schneider, ed., Conservatives in America Since 1930 (NYU Press, 2003). Here we find a very useful progression of essays from the Old Right, Classical Liberals, Traditional Conservatives, anticommunists, and the various guises of the New Right. The second book is Michael Sandel, Liberalism and Its Critics (NYU Press, 1984). Here, among others, are essays from Isaiah Berlin, John Rawls, Robert Nozick, Alasdair MacIntyre, Michael Walzer, a few communitarians represented by Sandel and others, and important pieces by Peter Berger and Hannah Arendt.”
Reading the books alongside one another, he said, tends both to sharpen one's sense of the variety of political positions covered by broad labels like “liberal” and “conservative” and to point out how the traditions may converge or blend. “Some people understand this beneficial complexity of political positions,” he told me, “but many do not.”
Michael Yates retired as a professor of economics and labor relations at the University of Pittsburgh at Johnstown in 2001. His most recent book is In and Out of the Working Class, published by Arbeiter Ring in 2009.
He named Wallace Stegner’s The Gathering of Zion: The Story of the Mormon Trail, originally published in 1964. “I am not a Mormon or religious in the slightest degree,” he said, “and I am well aware of the many dastardly deeds done in the name of the angel Moroni, but I cannot read the history of the Mormons without a feeling of wonder, and I cannot look at the sculpture of the hand cart pioneers in Temple Square [in Salt Lake City] without crying. If only I could live my life with the same sense of purpose and devotion…. It is not possible to understand the West without a thorough knowledge of the Mormons. Their footprints are everywhere."
Adam Kotsko is a visiting assistant professor of religion at Kalamazoo College. This year he published Politics of Redemption: The Social Logic of Salvation (Continuum) and Awkwardness (Zero Books).
“My vote," he said, "would be for Sergey Dolgopolski's What Is Talmud? The Art of Disagreement, on all three counts. It puts forth the practices of Talmudic debate as a fundamental challenge to one of the deepest preconceptions of Western thought: that agreement is fundamental and disagreement is only the result of a mistake or other contingent obstacle. The notion that disagreements are to be maintained and sharpened rather than dissolved is a major reversal that I'll be processing for a long time to come. Unfortunately, the book is currently only available as an expensive hardcover.”
Helena Fitzgerald is a contributing editor for The New Inquiry, a website occupying some ambiguous position between a New York salon and an online magazine.
She named Patti Smith’s memoir of her relationship with Robert Mapplethorpe, Just Kids, published by Ecco earlier this year and recently issued in paperback. “I've found Smith to be one of the most invigorating artists in existence ever since I heard ‘Land’ for the first time and subsequently spent about 24 straight hours with it on repeat. She's one of those artists who I've long suspected has all her big secrets hoarded somewhere in her private New York City. This book shares a satisfying number of those secrets and that privately legendary city. Just Kids is like the conversation that Patti Smith albums always made you want to have with Patti Smith.”
Cathy Davidson, a professor of English and interdisciplinary studies at Duke University, was recently nominated by President Obama to serve on the National Council on the Humanities. She, too, named Patti Smith’s memoir as one of the books “that rocked my world this year.” (And here the columnist will interrupt to give a third upturned thumb. Just Kids is a moving and very memorable book.)
Davidson also mentioned rereading Tim Berners-Lee's memoir Weaving the Web, first published by HarperSanFrancisco in 1999. She was “inspired by his honesty in letting us know how, at every turn, the World Wide Web's creation was a surprise, including the astonishing willingness of an international community of coders to contribute their unpaid labor for free in order to create the free and open World Wide Web. Many traditional, conventional scientists had no idea what Berners-Lee was up to or what it could possibly mean and, at times, neither did he. His genius is in admitting that he forged ahead, not fully knowing where he was going….”
Bill Fletcher Jr., a senior scholar at the Institute for Policy Studies, is co-author, with Fernando Gapasin, of Solidarity Divided: The Crisis in Organized Labor and A New Path Toward Social Justice, published by the University of California Press in 2009.
He named Marcus Rediker and Peter Linebaugh’s The Many-Headed Hydra: The Hidden History of the Revolutionary Atlantic (Beacon, 2001), calling it “a fascinating look at the development of capitalism in the North Atlantic. It is about class struggle, the anti-racist struggle, gender, forms of organization, and the methods used by the ruling elites to divide the oppressed. It was a GREAT book.”
Astra Taylor has directed two documentaries, Zizek! and Examined Life. She got hold of the bound galleys for James Miller’s Examined Lives: From Socrates to Nietzsche, out next month from Farrar, Straus and Giroux. She called it “a book by the last guy I took a university course with and one I've been eagerly awaiting for years. Like a modern-day Diogenes Laertius, Miller presents 12 biographical sketches of philosophers, an exploration of self-knowledge and its limits. As anyone who read his biography of Foucault knows, Miller's a master of this sort of thing. The profiles are full of insight and sometimes hilarious.”
Arthur Goldhammer is a senior affiliate of the Center for European Studies at Harvard University and a prolific translator, and he runs an engaging blog called French Politics.
“I would say that Florence Aubenas' Le Quai de Ouistreham (2010) deserves to be better known,” he told me. “Aubenas is a journalist who was held prisoner in Iraq for many months, but upon returning to France she did not choose to sit behind a desk. Rather, she elected to explore the plight of France's ‘precarious’ workers -- those who accept temporary work contracts to perform unskilled labor for low pay and no job security. The indignities she endures in her months of janitorial work make vivid the abstract concept of a ‘dual labor market.’ Astonishingly, despite her fame, only one person recognized her, in itself evidence of the invisibility of social misery in our ‘advanced’ societies.”
The book that made the biggest impression on Rubin this year was Judith Giesberg's Army at Home: Women and the Civil War on the Northern Home Front, published by the University of North Carolina Press in 2009. “Too often,” Rubin told me, “historians ignore the lives of working-class women, arguing that we don't have the sources to get inside their lives, but Giesberg proves us wrong. She tells us about women working in Union armories, about soldiers' wives forced to move into almshouses, and African Americans protesting segregated streetcars. This book expands our understanding of the Civil War North, and I am telling everyone about it.”
Siva Vaidhyanathan is a professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything: (And Why We Should Worry), will be published by the University of California Press in March.
He thinks there should have been more attention for Carolyn de la Pena's Empty Pleasures: The Story of Artificial Sweeteners from Saccharin to Splenda, published this year by the University of North Carolina Press: “De la Pena (who is a friend and graduate-school colleague) shows artificial sweeteners have had a powerful cultural influence -- one that far exceeds their power to help people lose weight. In fact, as she demonstrates, there is no empirical reason to believe that using artificial sweeteners helps one lose weight. One clear effect, de la Pena shows, is that artificial sweeteners extend the pernicious notion that we Americans can have something for nothing. And we know how that turns out.”
Vaidhyanathan noted a parallel with his own recent research: “de la Pena's critique of our indulgent dependence on Splenda echoes the argument I make about how the speed and simplicity of Google degrades our own abilities to judge and deliberate about knowledge. Google does not help people lose weight either, it turns out.”
Michael Tomasky covers U.S. politics for The Guardian and is editor-in-chief of Democracy: A Journal of Ideas.
“On my beat,” he said, “the best book I read in 2010 was The Spirit Level (Bloomsbury, 2009), by the British social scientists Richard Wilkinson and Kate Pickett, whose message is summed up in the book's subtitle, which is far better than its execrable title: ‘Why Greater Equality Makes Societies Stronger.’ In non-work life, I'm working my way through Vasily Grossman's Life and Fate from 1959; it's centered around the battle of Stalingrad and is often called the War and Peace of the 20th century. I'm just realizing as I type this how sad it is that Stalingrad is my escape from American politics.”
I was a graduate student in the 1980s, during the heyday of the so-called “culture wars” and the curricular attacks on "Western civilization." Those days were punctuated by some Stanford students chanting slogans like "Hey hey, ho ho, Western Civ has got to go," and by fiery debates about Allan Bloom’s book The Closing of the American Mind, which appeared in 1987, toward the end of my years in graduate school. Back then the battle lines seemed clear: conservatives were for Western civilization courses and the traditional literary canon, while liberals and progressives were against those things and for a new, more liberating approach to education.
In retrospect I find that decade and its arguments increasingly difficult to comprehend, even though I experienced them firsthand. I ask myself: What on earth were we thinking? Exactly why was it considered progressive in the 1980s to get rid of courses like Western civilization (courses that frequently included both progressives and conservatives on their reading lists)? And why did supporting a traditional liberal arts education automatically make one a conservative — especially if such an education included philosophers like Jean-Jacques Rousseau and Karl Marx?
A quarter of a century later, with the humanities in crisis across the country and students and parents demanding ever more pragmatic, ever more job-oriented kinds of education, the curricular debates of the 1980s over courses about Western civilization and the canon seem as if they had happened on another planet, with completely different preconceptions and assumptions than the ones that prevail today. We now live in a radically different world, one in which most students are not required to take courses like Western civilization or, in most cases, courses in foreign languages or cultures, or even the supposedly more progressive courses that were designed to replace them. And whereas as late as the 1980s English was the most popular major at many colleges and universities, by far the most popular undergraduate major in the country now is business.
The battle between self-identified conservatives and progressives in the 1980s seems increasingly like rearranging the deck chairs on the Titanic. While humanists were busy arguing amongst themselves, American college students and their families were turning in ever-increasing numbers away from the humanities and toward seemingly more pragmatic, more vocational concerns.
And who can really blame them? If humanists themselves could not even agree on the basic value, structure, and content of a liberal arts education — if some saw the tradition of Western civilization as one of oppression and tyranny, while others defended and validated it; if some argued that a humanistic education ought to be devoted to the voices of those previously excluded from "civilized" discussion, such as people of color and women, while others argued that such changes constituted a betrayal of the liberal arts — is it any wonder that students and their families began turning away from the humanities?
After all, economics and business professors did not fight about the basic structure of business or economics majors, even though there were differences between Keynesian and Friedmanite economists, for instance, over monetary policy. And physics professors did not engage in fundamental debates about physics curriculums — which should one teach, quantum mechanics or relativity? — in spite of Einstein’s problems with quantum mechanics ("God does not play dice with the universe"). In the 1980s the humanities as a whole seemed to be the only field where even experts were unable to agree on what constituted the appropriate object of study.
If I go to a doctor’s office and witness doctors and nurses fighting about whether or not I should take a particular medication, I’m likely to go elsewhere for my health care needs. I think something analogous happened to the humanities in the 1980s, and it is continuing to happen today, although by now the humanities are so diminished institutionally that these changes no longer have the overall significance they had in the 1980s. In the 1980s the humanities still constituted the core of most major universities; by now, at most universities, even major ones, the humanities are relatively marginal, far surpassed, in institutional strength, by business, medical, and law schools.
One of the core functions of the humanities for centuries was the passing down of a tradition from one generation to the next. The idea behind Western civilization courses was supposed to be that students needed them in order to understand the origins and development of their own culture. In the 1980s three developments worked against that idea. The first was an educational establishment that was no longer content simply to pass knowledge down from one generation to the next, and that wanted to create new knowledge. The second development, which dovetailed with the first, was the emergence of new approaches to the humanities that examined structures of oppression and domination in traditions previously viewed as unimpeachable. One could examine women's history, for instance, or non-Western cultures. The third development, which dovetailed with the first and second, was the increasing demand for “relevance” in higher education, with "relevance" being understood as present-oriented and pragmatic, i.e. job-related.
The conflation of these three developments led to the widespread perception — and not just among self-proclaimed progressives — that anything traditional or old was also, almost by definition, conservative, fuddy-duddy, and impractical. In essence those three developments have now long since triumphed, and the educational world of today is largely the result of that triumph.
Unfortunately, however, traditions that are not passed on from one generation to the next die. If an entire generation grows up largely unexposed to a particular tradition, then that tradition can in essence be said to be dead, because it is no longer capable of reproducing itself. It does not matter whether the tradition in question is imagined as the Western tradition, the Christian tradition, or the Marxist tradition (and of course both Christianity and Marxism are part of the Western tradition). Traditions are like languages: if they are not passed on, they die. Most traditions, of course, have good and bad elements in them (some might argue for Christianity, some for Marxism, relatively few for both), and what dies when a tradition dies is therefore often both good and bad, no matter what one’s perspective. But what also dies with a tradition is any possibility of self-critique from within the tradition (in the sense that Marxism, for instance, constituted a self-critique from within the Western tradition), since a tradition’s self-critique presupposes the existence of the tradition. Therefore the death of a tradition is not just the death of the oppression and tyranny that might be associated with the tradition, but also the death of progressive and liberating impulses within the tradition.
We all know, of course, that nature abhors a vacuum, and for that reason when a tradition dies, what fills in the vacuum where the tradition used to be is whatever is strongest in the surrounding culture. In our culture we know quite well what that is: the belief in money, in business, in economics, and in popular culture. That is our real religion, and it has largely triumphed over any tradition, either progressive or tyrannical. It is no more a coincidence that business is the most popular major in the United States today than it was that theology was one of the major fields of the 1700s.
As a result of the triumph of relevance and pragmatism over tradition, the ivy-covered walls of academia, which once seemed so separated from what is often called the “real world,” now offer very little protection from it. In fact the so-called "real world" almost entirely dominates the supposedly unreal world of academia. It may have once been true that academia offered at least a temporary sanctuary for American students on their way to being productive, hard-working contributors to a booming economy; now, however, academia offers very little refuge to students on their way into a shaky, shell-shocked economy where even the seemingly rock-solid belief in the “free market” has been thrown into question. In 1987 Allan Bloom wrote: "Education is not sermonizing to children against their instincts and pleasures, but providing a natural continuity between what they feel and what they can and should be. But this is a lost art. Now we have come to exactly the opposite point." Over two decades later, it seems to me that Bloom was right, and that indeed we have come “to exactly the opposite point.” Unfortunately now, neither self-styled conservatives nor self-styled progressives are likely to want to defend a vision of education that even in Bloom’s view was long gone. And sadder still is the fact that few of our students will even realize what has been lost.
And so I think we owe an apology to our students. We humanists inherited a tradition more or less intact, with all its strengths and weaknesses, but it appears highly likely that we will not be able or willing to pass it on to them. That is a signal failure, and it is one for which we will pay dearly. No doubt there is lots of blame to go around, but instead of looking around for people to blame, it would be more constructive to save what we can and pass it along to the next generation. They are waiting, and we have a responsibility.
Stephen Brockmann is professor of German at Carnegie Mellon University and president of the German Studies Association.
“Only boring people get bored.” Google offers no settled judgment about who coined this aphorism, but it came to my attention via Arline Tehan, the Rodin scholar, who happens to be my mother-in-law. It was the law of her household, in decades past, giving the kids an incentive to use the library, since otherwise some bit of housework could always be found to occupy their attention. All of this was well before the advent of the Internet, of course -- and I’m told that the TV set was off-limits except during specific evening hours. Then again, staring at a screen never relieves boredom, only anesthetizes it.
The propensity of children to get bored is well-known. Peter Toohey notes in Boredom: A Lively History (Yale University Press) that many adults will insist, out of pride, that they never succumb to it. But thinking of boredom as childish is too simplistic, he argues, while claiming immunity from it is seldom convincing. He admits to being bored “for very large tracts of my life,” and so one may regard his subtitle with a degree of suspicion. If he is prone to boredom, wouldn’t that make him boring? How lively can the book be?
Plenty, as it happens -- though the subtitle is still a bit misleading. The book’s approach is historical only in part. Toohey, a professor of Greek and Roman studies at the University of Calgary, draws on research in such fields as neurochemistry (the relationship between boredom and low dopamine levels) and penology (prisoners in solitary confinement are pushed to the extremes of tedium). He is skeptical, though not dismissive, of the trend in much humanities scholarship of late that treats emotions as so deeply embedded in specific social and cultural contexts as to be inseparable from them.
The most emphatically historicist understanding of boredom, for example, treats it as one of the side effects of modernity, kicking in sometime after the middle of the 18th century. The word “bore” in this sense, whether as a noun or a verb, doesn’t appear in Samuel Johnson’s dictionary from 1755. It seems to enter the English language in the late 1760s, with no clear etymological provenance. And for the symptom-of-modernity interpretation, the timing of its arrival is no accident.
The possibility of boredom only emerges once enough people have the security, leisure, and comfort to complain that security, leisure, and comfort aren’t everything. This coincides with, and is reinforced by, the rapidly expanding market for novels, with their reminder that one’s life could be much more interesting than (alas) it usually is. By the late 18th century, then, the conditions existed for a new sort of unhappiness, requiring a new word to name it. Until then, boredom was not really a problem. Things like famine and religious warfare had made life altogether too exciting.
This insistence on historical context runs against the more commonplace understanding of emotions -- the assumption that they are essentially timeless and universal. To be sure, the factors eliciting admiration, fear, anger, etc., vary from culture to culture, and so does their expression. The disgraced samurai of the 16th century would commit seppuku; the disgraced American politician of the 21st calls a press conference. But the feelings themselves are the same -- and arguably, certain aspects of how we exhibit them cut across any cultural and historical barriers.
The late Silvan Tomkins argued that a few basic affects are hard-wired into us as organisms; they are part of how our nervous systems respond to signals from the environment. Disgust, for example, involves a rapid assessment that something is toxic or contaminating; this induces an involuntary impulse to pull away, with a tendency for the upper lip to curl, as when you smell something foul. The kinds of things inducing that feeling vary from society to society -- and within a society, for that matter. But the curled lip is nature, not culture, at work.
We are complex organisms -- our experience mediated by language and memory, not just sensory impressions -- and we are capable of more than one response to a given stimulus. And so the hard-wired affective tendencies identified by Tomkins interact in all kinds of subtle ways, creating a broad spectrum of human emotions.
Toohey doesn’t discuss affect theory. But he does mention the rather Tomkinsian speculation that boredom may not be a distinct feeling. Instead, it could result from a mixture of “frustration, surfeit, depression, disgust, indifference, apathy, and the feeling of being trapped or confined.” Marshaling evidence from art and literature over the centuries, Toohey makes the case that variations on this combination of feelings can be found well before modernity. Certain similarities in the bodily expression of boredom cut across cultural divides, such as a particular way of slumping while resting one's head on one's hands. Boredom is, he argues, a feeling akin to depression and anger, but also a kind of emotional signal telling you to remove yourself from a situation that might overwhelm you with depression and anger.
For what it's worth, I find this intuitively persuasive. At any movie where there turns out to be a car chase, for example, my response always involves “frustration, surfeit, depression, disgust, indifference, apathy, and the feeling of being trapped or confined.” And Toohey tends to confirm my mother-in-law's adage. If you wallow in boredom, or try to evade it by mind-numbing expedients -- rather than cultivating the skills needed to redirect your attention to something else -- there are other soul-depleting forces ready to kick in and make things worse. Clearly the author knows this; he's written an interesting book. And when you finish it, there's a recent volume on Rodin by another author that I'll recommend.