In 1917 John Dewey published “The Need for a Recovery of Philosophy.” The essay consists of a reflection on the role of philosophy in early 20th century American life, expressing Dewey’s concern that philosophy had become antiquated, “sidetracked from the main currents of contemporary life,” too much the domain of professionals and adepts. While taking pains to note that the classic questions of philosophy make contributions to culture both past and present, Dewey felt that the topics being raised by professional philosophers were too often “discussed mainly because they have been discussed rather than because contemporary conditions of life suggest them.”
Dewey soon traveled to China, where he delivered nearly 200 lectures on education and democracy to large crowds across a two-year stay. Back in America Dewey commented on the public questions of the day, a role that he inhabited until his death in 1952. Since then, however, professional philosophers have followed W.V.O. Quine’s path in treating philosophy as a technical exercise of no particular interest to the layman:
Think of organic chemistry; I recognize its importance, but I am not curious about it, nor do I see why the layman should care about much of what concerns me in philosophy.
But is philosophy really analogous to chemistry, a domain of expertise populated by specialists? Or are philosophical questions part and parcel of everyone’s life, as far from a specialist’s tasks as anything can be?
Nearly 100 years after Dewey’s essay, it’s time for another reconstruction of philosophy.
While it is possible to point to philosophers who work with (rather than merely talk about) the concerns of non-philosophers, among the mass of philosophers societal irrelevance is often treated as a sign of intellectual seriousness.
This is a shame, since we are surrounded by phenomena crying out for philosophic reflection. Today we are constantly confronted by philosophic questions, in many cases created by advances in science and technology. Open your computer and you can find thoughtful exploration of issues as varied as the creation of autonomous killing machines, the loss of privacy in a digital age, the remaking of friendship via Facebook, and the refashioning of human nature via biotechnology. In this sense philosophy abounds. But professional philosophers have remained largely on the margins of this growing cultural conversation.
It needn’t be this way. Take the subject matter of metaphysics. Every philosophy department teaches courses in metaphysics. But how is the subject handled? As evidenced by a sample of university syllabuses posted online, metaphysics classes are overwhelmingly exercises in professional philosophy. Just as Dewey complained, classes begin from the concerns of philosophers rather than from contemporary problems. This can be seen in the leading textbooks. Consider as magisterial a source as the Oxford Handbook of Metaphysics, Loux and Zimmerman, eds. Their introduction begins:
Its detractors often characterize analytical philosophy as anti-metaphysical. After all, we are told, it was born at the hands of Moore and Russell, who were reacting against the metaphysical systems of idealists like Bosanquet and Bradley…
The discussion is entirely framed in terms of the disciplinary concerns of philosophy – and only 20th century analytic philosophy at that. We find no reference to people’s actual lives, to the metaphysical issues tied to the births and transformations and deaths that we all endure, no acknowledgement that questions of metaphysics involve some of the most intimate and transcendent questions of our lives. Instead, metaphysics is a tale told in terms of professionals: Moore and Russell, Bosanquet and Bradley, Quine and Lewis.
We are not claiming that the matters addressed by such essays are insignificant. But it takes one adept in philosophy to extract the nut of existential meaning from the disciplinary shell. No wonder even the best students walk away.
Why do philosophers begin with insider topics when issues laden with metaphysics are in the news every day? A story in the May 25, 2014 issue of The Washington Post describes a patient taking heart pills that include ingestible chips: the chips link up with her computer so that she and her doctor can see that she has taken her medicine. The story also describes soon-to-be marketed nanosensors that live in the bloodstream and will be able to spot the signs of a heart attack before it occurs. These are issues that could fall under “Existence and Identity,” one of the sections of the Oxford Handbook: at stake here are metaphysical questions about the nature of self and the boundary between organism and machine.
This needs to change, for the health of our culture, and for the health of philosophy itself. Unless professional philosophy embraces and institutionalizes an engaged approach to philosophizing, working alongside other disciplines and abroad in the world at large, it will become a casualty of history.
In our opinion, the single greatest impediment to philosophy’s greater relevance is the institutional situation of philosophy. The early 20th century research university disciplined philosophers, placing them in departments, where they wrote for and were judged by their disciplinary peers. Oddly, this change was unremarked upon, or was treated as simply the professionalization of another academic field of research. It continues to be passed over in silence today. Like Molière’s Bourgeois Gentleman, who did not know that he had been speaking prose, philosophers seem innocent of the fact that they have been doing disciplinary philosophy, or that one might have reasons to object to this fact. And so even when their subject matter consists of something of real significance to the wider world, philosophers typically discuss the topic in a way that precludes the active interest of and involvement by non-philosophers.
Philosophers view themselves as critical thinkers par excellence who have been trained to question everything; but they have overlooked the institutional arrangements that govern their lives. The department is seen as a neutral space from which thought germinates, not itself the object of reflection. One finds no exploration of the effects that disciplining might have had on philosophical theorizing, or of where else philosophers could be housed, or of how philosophers, by being located elsewhere, might have developed alternative accounts of the world or have come up with new ways of philosophizing. In fact, the epistemic implications of the current institutional housing of philosophy are profound.
Philosophers once recognized that there is something problematic about treating philosophy as simply one discipline alongside the others. It was once understood that in addition to fine-grained analyses philosophy offered perspectives that undergirded, capped off, or synthesized the work of other disciplines such as physics or biology, and then connected those insights to our larger concerns. Such work lost favor in the 20th century – dismissed as Weltanschauung philosophy by analytic philosophers, and as foundationalism by continental philosophers. But reopen this perspective and questions abound: if philosophy is not, or not exclusively, a regional ontology, why are philosophers housed within one region of the university?
Why is peer-reviewed scholarship the sole standard for judging philosophic work, rather than also the effects that such work has on the larger world? And why is there only one social role for those with Ph.D.s in philosophy – namely, to talk to other Ph.D.s in philosophy?
Philosophers may have ignored their institutional placement, but for other disciplines critical reflection on the structures of knowledge production has become par for the course. Perhaps the most important site for such analysis is the interdisciplinary field of science, technology, and society studies (STS). One influential book in STS – Gibbons et al.’s 1994 The New Production of Knowledge – chronicles the shift in late 20th century science from “Mode 1” to “Mode 2” knowledge production. Mode 1 is academic, investigator-initiated, and discipline-based. By contrast, Mode 2 knowledge production is context-driven, problem-focused, and interdisciplinary. This framework is a good rough sketch of our basic point: we are tracing and promoting the 21st century development of Mode 2 philosophy.
But make no mistake. We are pluralists on this point. We believe Mode 1 or disciplinary scholarship should continue to have a central place in philosophy. But Mode 1 thinking needs to be counter-balanced by an equal focus within the philosophical community on conducting work that is socially engaged. In part this is simply recognizing a new reality: increasingly society is demanding that academics demonstrate their broader relevance. This demand has so far largely skipped over philosophy and the humanities, but this is unlikely to remain the case for much longer. Philosophy needs to demonstrate its bona fides by showing how it can make timely and effective contributions to contemporary debates. We believe that this is best done in a way that also shows that Mode 2 philosophizing is enriched by the insights of Mode 1 or traditional philosophy.
While Mode 1 philosophy is still the reigning orthodoxy, there is a growing heterodoxy within the ranks of philosophers, sometimes lumped under the title of “public philosophy.” We call our own version of Mode 2 work “field philosophy.” There are a number of similar approaches in areas such as environmental justice, critical race theory, feminism, and bioethics that we recognize as allies. We celebrate these diverse approaches to Mode 2 philosophizing, whether they go by the name of ‘public’, ‘applied’, or by some other title. But we believe that the lack of thought given to the institutional dimensions of philosophizing has limited the effectiveness of this work. A new philosophical practice, where philosophers work in real time with a variety of audiences and stakeholders, will lead to new theoretical forms of philosophy – once we break the stranglehold that disciplinary norms have upon the profession.
It will take a community to institutionalize Mode 2 practices. As it stands now, heterodox practitioners (however they self-identify) exist on the margins and lead professional lives that run against the grain. As the feminist public philosopher Linda Martín Alcoff notes, many Mode 2 philosophers try to “walk a fine line between responsiveness to community needs and employment survival, pushing the boundaries of academic respectability even while trying to establish their credentials in conventional ways.” It is these “conventional ways” that must change. We have to invent a philosophy where responsiveness to community needs (not just disciplinary interests and imperatives) is an integral part of one’s employment and is viewed as academically respectable.
In practice, this will require many changes, from revised promotion and tenure criteria to alternative metrics for excellence and impact. As these changes are implemented, it will be important to consider at what point the chasm has been reduced to a suitably sized gap. After all, we don’t want to eliminate the space between philosophy and society altogether. Socrates was engaged, but still an outsider. He certainly was no pundit looking to score the most outrageous sound bite and rack up the most “likes” on Facebook. We need a people’s philosophy that reserves every right to be unpopular.
Robert Frodeman is a professor of philosophy and religion studies at the University of North Texas and director of its Center for the Study of Interdisciplinarity. Adam Briggle is an associate professor of philosophy and religion studies at the University of North Texas.
About 10 years ago, I was an admissions officer at a university in London, where (typical of the British system of admitting major by major) I read essays from those who wanted to study philosophy. To be honest, the essays were largely indistinguishable from one another, presumably because the applicants were all given identical advice about what they should say.
But my interest was piqued when the applicants mentioned what drew them to philosophy in the first place. Often, they cited a work of “popular” philosophy, perhaps Sophie’s World by Jostein Gaarder, or something by A.C. Grayling or Alain de Botton. The students would not be reading such works once they arrived to do their degree. Rather they would read the philosophical classics – Plato, Aristotle, Hume, Kant – and cutting-edge philosophical papers from the more recent past. They had been pulled in by popular philosophy, but at university they would experience professionalized philosophy, learning its special jargon, conceptual tools, and history.
There has long been a gulf between the public experience of philosophy and philosophy as it is pursued among the experts. Like other academics, philosophers focus on sharing research with colleagues, and draw on it when they teach the students who have shown enough aptitude (and paid or borrowed enough money) to get into their classrooms. Only a minority of academics try to speak to a broader audience, and when they do, the link to what they do in their professional life is presumed to be rather indirect. Knowledge trickles down from the ivory tower to the public sphere, but what comes out has typically been just that: a trickle.
This is beginning to change. The reason can be summed up in an unlovely, two-word phrase: “new media.” With tools like blogs and podcasts, platforms like iTunesU and “massive open online courses” (MOOCs), academics now have the opportunity to reach an enormous audience of people who need only an Internet connection and a modicum of curiosity. There are online interviews with leading philosophers (Radio 4’s “In Our Time,” “Philosophy Bites,” “Philosopher’s Zone,” “Elucidations”) and themed series like my History of Philosophy podcast. You can also find free philosophy instruction on YouTube and on iTunesU (traditional university lectures recorded and put online), while many conferences and professional lectures are also appearing on the internet (for instance from the Aristotelian Society, or the Center for Mathematical Philosophy in Munich).
It’s an unprecedented opportunity. So why don’t more academics take advantage of it? Many of the podcasters who host series on topics in history, for instance, are not university lecturers but independent scholars. I know, because I met them on Facebook (of course).
Of course there are plenty of practical explanations for this reluctance. New media projects require a certain degree of fearlessness when it comes to technology, and can be very time-consuming. With the heavy demands of teaching, research and administration, it’s no surprise that launching such a project may not rise to the top of an academic’s “to do” list. In theory, there could be rewards to balance the costs in time and energy. We are frequently asked to demonstrate the wider social “impact” of our work these days, on grant applications or in Britain’s Research Excellence Framework survey. But “impact” is a rather ill-defined notion. When I first launched my own podcast, I was warned that it would not necessarily make a good impact case study in the REF: how exactly does one document the “impact” of a podcast? In any case, hosting a podcast is unlikely to help your career as much as writing a good journal article or two, which could easily take less time.
Beyond the practical issues, I suspect most academics still assume that media projects are inevitably “popular,” in the pejorative sense of being strictly introductory. A podcast or blog isn’t the place to do real philosophy or history – this view holds – that happens in the classroom, or in the pages of peer-reviewed journals and monographs. But such worries miss the promise of new media. With no time limits and no editorial constraints, academics can make any ideas they choose freely available on the Internet. If that content isn’t for everyone, so what?
My own podcast covers the history of philosophy “without any gaps,” moving chronologically at a slow (some might say excruciatingly slow) rate. Obviously this sort of thing isn’t for everyone. But my listeners are not just fellow academics and undergraduates. They are commuters, truck drivers, homemakers, retirees, high school students – as I say, anyone with an Internet connection and curiosity about the subject. We should not underestimate how widespread that curiosity might be, even when it comes to rather recondite topics.
Furthermore, just as students in a university setting help their teachers to see things in a new way, the audience for a new media project will respond with corrections, comments, and other sorts of feedback. So there is a chance here for a democratic and open conversation in which knowledge is shared among many more people, not just those in the academic community. I believe that more and more academics will seize that chance, even if the use of new media raises questions about the role of universities and academic experts.
Why, for instance, should students pay high tuition to learn the same things they could be downloading for free? Yet this worry too, I think, is misplaced.
If anything, following a blog, taking a MOOC, or subscribing to a podcast will bring potential students to fields of study they would not otherwise have considered. I don’t read admissions essays anymore, but I like to imagine that some of the applicants say they’ve been inspired to pursue philosophy because of something they found online.
In December, the journal Brain Connectivity published a paper called "Short- and Long-Term Effects of a Novel on Connectivity in the Brain," based on a study conducted at Emory University. The researchers did MRI scans of the brains of 21 undergraduate students over a period of days before, during, and after they read a best-selling historical page-turner called Pompeii over the course of nine evenings. A significant increase of activity in "the left angular supramarginal gyri and right posterior temporal gyri" occurred during the novel-reading phase of the experiment, which fell off rapidly once they finished the book -- the gyri being, the report explained, regions of the brain "associated with perspective taking and story comprehension."
Not a big surprise; you'd figure as much. But the researchers also found that an elevated level of activity continued in the bilateral somatosensory cortex for some time after the subjects were done reading. In the novel, a young Roman aqueduct engineer visiting the ancient vacation resort of Pompeii "soon discovers that there are powerful forces at work -- both natural and man-made -- threatening to destroy him." Presumably the readers had identified with the protagonist, and parts of their brains were still running away from the volcano for up to five days after they finished the book.
So one might construe the findings, anyway. The authors are more cautious. But they raise the question of whether the experience of reading novels "is sufficiently powerful to cause a detectable reorganization of cortical networks" -- what they call a "hybrid mentalizing-narrative network configuration." Or to put it another way, a long-term rearrangement of the mind's furniture.
It isn't a work of fiction, and I am but a solitary reader without so much as access to an electroencephalograph, but A Philosophy of Walking by Frédéric Gros, a French best-seller from 2011 just published in English by Verso, seems to have been setting up its own "hybrid mentalizing-narrative network configuration" within my head over the past few days. Maybe it's the weather. After so many months of cold weather and leaden skies, Gros's evocation of the pleasures of being outside, moving freely, in no particular hurry, stirs something deep within.
The author, a professor of philosophy at the University of Paris, has, among other things, edited volumes in the posthumous edition of Michel Foucault's lectures at the Collège de France. But the authority Gros brings to his reflections on walking comes only in part from knowing the lives and writings of ambulatory thinkers across the millennia, beginning in ancient Greece. He is a scholar but also a connoisseur -- someone who has hiked and wandered enough in his time, over a sufficient variety of terrains, to know at first hand the range of moods (ecstasy, monotony, exhaustion) that go with long walks.
The book is a work of advocacy, and of propaganda against sedentary thinking. The first of Gros's biographical essays is on Nietzsche, who took up walking in the open air while suffering from migraine headaches, eyestrain, and late-night vomiting spasms. It did not cure him, but it did transform him. He might be the one spending time at health resorts, but it was contemporary intellectual life that manifested invalidism.
"We do not belong to those who have ideas only among books, when stimulated by books," Nietzsche wrote. "It is our habit to think outdoors -- walking, leaping, climbing, dancing, preferably on lonely mountains or near the sea where even the trails become thoughtful. Our first questions about the value of a book, of a human being, or of a musical composition, are: Can they walk? Even more, can they dance?" Long, solitary hikes such as those taken by Nietzsche -- and also by Rousseau, the subject of another essay -- are only one mode of philosophical pedestrianism. The precisely timed constitutional that Kant took each day, so regular that his neighbors could set their watches by it, has gone down in history as an example of his extreme rigor (one easily recognized even by the layman who can't tell his a posteriori from his elbow). Gros adds a telling detail to this otherwise commonplace biographical fact: Kant took care to walk at a measured, even pace, since he was profoundly averse to sweating.
At the other extreme was the ancient philosophical school known as the Cynics, with its cultivation of an almost violent indifference to comfort and propriety. The Cynics were homeless vagrants on principle. They denied themselves, as much as possible, every luxury, or even convenience, taken for granted by their fellow Greeks.
That included footwear: "They had done so much walking," Gros says, "that they hardly needed shoes or even sandals, the soles of their feet being much like leather." When the Cynics showed up in a town square, their constant exposure to nature's elements gave a jagged edge to the harangues in which they attacked commonplace ideas and values. Gros sees walking, then, as the foundation of the Cynics' philosophical method:
"Philosophers of the type one might call sedentary enjoy contrasting the appearance with the essence of things. Behind the curtain of tangible sights, behind the veil of visibilities, they try to identify what is pure and essential, hoping perhaps to display, above the colors of the world, the glittering, timeless transparency of their own thought…. The Cynic cut through that classic opposition. He was not out to seek or reconstruct some truth behind appearances. He would flush it out from the radical nature of immanence: just below the world's images, he was searching for what supported them. The elemental: nothing true but sun, wind, earth and sky; their truth residing in their unsurpassable vigor."
Walking is not a sport, Gros takes care to emphasize. You don't need any equipment (not even shoes, for an old-school Cynic) nor is any instruction required. The skill set is extremely limited and mastered by most people in infancy. Its practice is noncompetitive.
But in a paradox that gives the book much of its force, we don't all do it equally well. It's not just that some of us are clumsy or susceptible to blisters. Gros contrasts the experience of a group of people talking to one another while marching their way through a walking tour (an example of goal-driven and efficiency-minded behavior) and the unhurried pace of someone for whom the walk has become an end in itself, a point of access to the sublimely ordinary. And so he has been able to give the matter a lot of thought:
"Basically, walking is always the same, putting one foot in front of the other. But the secret of that monotony is that it constitutes a remedy for boredom. Boredom is immobility of the body confronted with emptiness of mind. The repetitiveness of walking eliminates boredom, for, with the body active, the mind is no longer affected by its lassitude, no longer drawn from its inertia into the vague vertigo of an endless spiral.… The body's monotonous duty liberates thought. While walking, one is not obliged to think, to think this or that. During that continuous but automatic effort of the body, the mind is placed at one's disposal. It is then that thoughts can arise, surface or take shape."
As for the clumsiness and blisters, I hope they will disappear soon. It's the practice of walking, not reading about it, that makes all the difference. But no book has rewired my bilateral somatosensory cortex so thoroughly in a long while.
Nothing sharpens memory quite like regret, so I cannot help noting the anniversary of a tossed-off phrase that has come back to haunt me many times over the past 10 years.
In early 2004, I began writing an occasional series of two- or three-paragraph squibs on the latest publications and doings of the Slovenian thinker Slavoj Žižek for The Chronicle of Higher Education, where it ran under the title "Žižek Watch." In the subhead for one such mini-article, I referred to him as "the Elvis of cultural theory." The expression took wings and has been repeated on more occasions than any sane person could track. (As of this writing, it gets 79,000 returns from Google.)
The phrase will outlive me. Last year it appeared in an article in the journal Critical Inquiry, as well as in a Canadian dissertation on the concept of totalitarian evil in the work of Hannah Arendt. Someone will eventually write a book using it as a title. Remembering the line always makes me cringe, as if from mild food poisoning. For the most salient quality of "the Elvis of cultural theory" -- judged, by any standard, as a characterization of Žižek's work or career -- is its near perfect meaninglessness, verging on hopeless and absolute stupidity.
Unless you know the inside joke, anyway. By my count, roughly two people in the world are in on it. So to mark the anniversary, it is time finally to put the backstory on the record.
The idea for "Žižek Watch" came from my editor at the time, Richard Byrne, an estimable playwright and cultural journalist with family roots in the Balkans. These days Rich is at the helm of the University of Maryland Baltimore County's UMBC Magazine, of which he is the founding editor. We shared a fascination with Žižek's close but complex relationship with the Slovenian post-punk band Laibach and the avant garde movement around it. Given the pace of his output (two or three books a year, just in English) and the growing frequency with which he had begun appearing in odd corners of the mass media, it felt like a matter of time before he graced The National Enquirer, or at least Weekly World News.
So it was through a chain of associations that "Žižek Watch" alluded -- very much in passing -- to the definitive song about the improbable ubiquity of a tabloid phenomenon: "Elvis is Everywhere" by Mojo Nixon & Skid Roper.
And the rest is, if not history, at least a decade-long lesson in the sliding of the signifier across the greased skids of digital-age publicity.
A footnote in one article from 2005 did trace "the Elvis of cultural theory" back to its first appearance, albeit without identifying the origins of the phrase as such. But by now, context is irrelevant. The expression has long since escaped meaning. And even though nobody seems to get it, does not the very circulation of my remark participate in what Žižek identifies as the "mystery" of jokes -- that they seemingly appear "all of a sudden out of nowhere," produced by "the anonymous symbolic order" through "the very unfathomable contingent generative power of language"?
So writes Elvis, or somebody, in the introduction to Žižek's Jokes (Did you hear the one about Hegel and negation?), published by MIT Press. It is an anthology of the theorist's shtick, not an analysis of it. The cover describes it as "contain[ing] every joke cited, paraphrased, or narrated in Žižek's work in English (including some in unpublished manuscripts), including different versions of the same joke that make different points in different contexts." The sources of the collected passages are given in the book's endnotes, followed by a brief yet oddly repetitive afterword by a novelist and songwriter from Scotland who lives in Japan and writes under the pen name Momus.
The claim to be exhaustive is difficult to credit, and so is the rationale offered for its existence: "The larger point being that comedy is central to Žižek's seriousness." Along with his frequent digressions into popular culture, Žižek's use of jokes has lent his books an appearance of accessibility that accounts for his fame with a broad audience. But that quality is misleading. Žižek practices a form of what Freud called "wild psychoanalysis," with contemporary culture as the analysand. The remarks, quoted earlier, about the free-floating and anonymous nature of jokes are just Žižek's paraphrase of a point made in Jokes and Their Relation to the Unconscious, where Freud interpreted the erotic and aggressive drives manifested through manipulation of the funny bone.
By spelling that out, I've just told you more about why "comedy is central to Žižek's seriousness" than Žižek's Jokes ever does. In the afterword, Momus speculates that "the joke has become for Žižek what algebra is for his old ally and rival Badiou: the most concise way Žižek knows how to sum up a universal situational shape." The idea might well be developed further, preferably by someone who knows that Badiou's interest is in formalized set theory rather than algebra. But as formulated it is more a gesture than an insight.
A gesture serving mainly to distract attention from two striking things about the book. The first is that Žižek's Jokes makes unavoidably obvious something that it was still possible to overlook 10 years ago: the dynamic role of cut-and-paste in Žižekian production.
Žižek once said that his completed theoretical edifice, spanning several volumes, would amount to a Summa Lacanica rivaling Aquinas's Summa for both scope and cohesion. But along the way, he has met the growing demand for his work from the editors of books, magazines, and newspapers by tearing off suitably sized chunks of whatever manuscript he had in progress. Sometimes he tweaked things to make them appear to be freestanding essays or topical news commentary. And sometimes he did not, though publication was almost certain either way. (I know of one case where the author of a book tried, without success, to have the introduction commissioned from Žižek removed, since it had nothing to do with the volume in question.)
Over time, reading Žižek became an experience in déjà vu, with passages from one volume reappearing in others or, in one case, twice in the same book. Žižek's Jokes takes this to a new level. He wrote nothing new for it. Even his two-page introduction consists of one long paragraph from an earlier book. It is a remarkable accomplishment and I do not imagine he will be able to surpass it.
The other striking feature of Žižek's Jokes is how grim the experience of reading it quickly proves to be. In accord with Freudian principles, the jokes revolve almost entirely around sex and/or aggression, often involving racist or misogynist sentiments. All of which is fine when they appear as specimens in a cultural critique -- where they might even elicit a laugh, given the incongruity of seeing them in a context where Hegel or Heidegger have set the terms for analysis. But running through them one after another, in the service of no argument, is deadening. It ceases to be shocking. It just seems lame. Maybe he should be known as "the Jay Leno of cultural theory"? (If, you know, Leno had Tourette's.)
Of course it's also possible that Žižek has a hidden agenda -- that he's sick of being considered hilarious by people who aren't really interested in Hegel, et al., and so has decided to destroy that reputation in the most efficient way possible. And I'm not even joking about that. It makes a certain amount of sense.
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.