Empires are not the only entities that rise and fall. So too do academic fields of study.
When I was a doctoral student, many of the foremost U.S. historians were intellectual historians: David Brion Davis at Cornell and Yale, George Fredrickson at Northwestern and Stanford, John Higham at Michigan and Hopkins, and Henry May at Berkeley, among others. Other leading scholars, including Bernard Bailyn, Eric Foner, Winthrop Jordan and Gordon S. Wood, foregrounded ideas (or, in the parlance of the time, ideology) in their interpretations of slavery, the causes of the American Revolution, the drafting of the Constitution and the coming of the American Civil War.
Even though American intellectual history as a research field persists, replete with journals, professional societies, exemplary scholars like Jennifer Ratner-Rosenhagen and big books, it has certainly shifted from the center to the periphery of the discipline. Leading practitioners, like Robert Abzug and James Kloppenberg, have retired, while others, like Richard Wightman Fox and Jackson Lears, near the end of their illustrious professional careers.
It’s possible, of course, to argue that intellectual history hasn’t faded at all. Some would claim that it flourishes under a new name: cultural history. It might also be argued that intellectual history has been absorbed by various subfields. After all, there’s a flourishing field of African American and U.S. women’s intellectual history.
But I’m convinced that something real has occurred. No longer are large departments like mine committed to having a slot in American intellectual history or looking for scholars with expertise in the history of ideas.
American intellectual history isn’t the only field to experience decline. A few years ago, Robert B. Townsend, now the director of the American Academy of Arts & Sciences’ humanities, arts and culture programs, charted shifts in the discipline of history over 40 years. The growth areas included women’s history and cultural history, with legal, economic, diplomatic and intellectual history fading.
I suspect that many students interested in diplomatic history gravitated toward degrees in international relations or area studies. The decline in economic history no doubt underscored a more general failure within the discipline to train students in social science methodologies, in demography and statistics as well as econometrics. Flagging interest in legal history is harder to explain, since the field aligns so well with students’ pre-professional interests in law and public policy.
As for American intellectual history, I’d speculate that the field is much too frequently and wrongly dismissed as elitist, ethnocentric and excessively abstract, as the study largely of a white, male intelligentsia. In fact, anyone who has followed the field knows that it has become increasingly democratic and now includes many examples of intellectual history from below (for example, studies of the ideas of the working class), more global and comparative and more wide-ranging (including the intellectual history of sexuality).
I guess it’s no surprise that pragmatic people view fancy ideas as immaterial.
But the field has also suffered from a deeper problem: certain assumptions that undergirded intellectual history were cast into doubt:
- Generalizability
Many of the greatest American historians of the 1960s and 1970s argued that intellectual history allowed later scholars to understand the logic of earlier generations, the conceptual lenses through which they made sense of events and the values that drove their decisions. Later scholars questioned whether the ideas found in the writings of political elites (for example, about republicanism) could be extended to a much broader population.
- Accessibility
Over time, there was a growing sense that serious intellectual history—with its preoccupation with political languages, linguistic paradigms and discursive conventions—was becoming increasingly arcane and inaccessible to all but the initiates. No longer was intellectual history simply the study of the development and modification of constellations of ideas over time or of the cultural discourses of a particular era or the close reading and contextualization of individual texts. Intellectual history, in the eyes of some critics, was more an adjunct to philosophy or political theory or postmodern hermeneutics than immediately recognizable as history.
- Ideas as drivers of change
It is certainly the case that because human beings have minds, all human actions are inevitably filtered through people’s perceptions, emotions and values. But, critics charged, such an idea-centered framework downplays a host of other factors, demographic, economic, political, social and structural, that invariably shape people’s behavior.
Those criticisms are certainly exaggerated, and I write here not to criticize American intellectual history but to argue that it’s more important than ever.
Take the example of Robert Abzug’s eloquent, profoundly moving biography of the pioneering humanistic psychologist Rollo May. Once a household name, May is now relegated to a netherworld of vague familiarity along with such near contemporaries as Erich Fromm, Carl Rogers and Harry Stack Sullivan and such theologians as Reinhold Niebuhr and Paul Tillich.
Abzug’s book actually succeeds in accomplishing what Philip Rieff’s evocatively titled 1966 classic work of social criticism promised: it traces the triumph of the therapeutic, as psychological language and concepts radically reshaped the practice of religion and as a culture based on faith gave way to a me culture centered on the self’s needs and desires. More than that, Abzug’s biography traces the Americanization of existentialism and the ways this set of ideas became a kind of common sense absorbed into popular self-help literature in a simplified, sanitized form.
To turn to another example: Mark Greif’s The Age of the Crisis of Man is a pathbreaking study of American thought and fiction from the Great Depression to the early 1970s.
I used to quip that those of us at Yale in the early 1970s were studying with the greatest minds of the 1940s. There was a grain of truth in that sophomoric wisecrack. Luminaries like C. Vann Woodward, Robert Penn Warren and John Hersey shared certain traits. All were morally serious and psychologically observant. All were concerned, in their own way, with “the crisis of man”: not just the crisis of the liberal state or of the capitalist economy or the world order, but something deeper, “the demolition of the certainties about human nature, which had been pillars of certainty for optimistic thinkers for two centuries.”
Greif reminds his readers of the number of books that wrestle with that crisis: Ralph Ellison’s Invisible Man, Saul Bellow’s Dangling Man, Flannery O’Connor’s A Good Man Is Hard to Find, Niebuhr’s Moral Man and Immoral Society and The Nature and Destiny of Man.
To my mind, few books better exemplify Greif’s thesis than All the King’s Men. Much more than a fictionalized study of the assassination of a charismatic but corrupt populist Southern politician, the book wrestles with human nature and Willie Stark’s claim that “Man is conceived in sin and born in corruption and he passeth from the stink of the didie to the stench of the shroud. There is always something.”
The novel also grapples with the ways that the past haunts and inevitably intrudes upon and shapes the present, as well as with the moral implications of behaviorist psychology for personal responsibility and what the author calls “The Great Twitch”: that “all the words we speak meant nothing and there was only the pulse in the blood and the twitch of the nerve, like a dead frog’s leg in the experiment when the electric current goes through.”
As Greif points out, the discourse on the crisis of man aged poorly. By the 1960s, such language struck many as overly grave, excessively serious and much too earnest and humorless. The use of the word “man” pointed to the problem: such works tended to obscure the lived realities and the subjective interior of class, gender and race.
Words like those in William Faulkner’s Nobel Prize speech, with its famous phrase “man will not merely endure, he will prevail,” came to sound as dated as when Hemingway, a generation earlier, had written in the wake of World War I, “Abstract words such as glory, honor, courage or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the dates.”
Intellectual history is invaluable because ideas matter. Lord Keynes put it best in his General Theory of Employment, Interest and Money:
“Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.”
We live in cynical times, and it’s easy today to think of ideas as the Russian-born syndicated columnist Max Lerner did, as weapons—pragmatic tools used sometimes to enlighten but also to manipulate, incite or deceive. To be sure, in today’s age of social media, fake news and misinformation, ideas are widely used tactically and strategically.
But as the eminent Berkeley sociologist Claude S. Fischer observed, ideas often influence people as much as “material circumstances such as economic incentives, physical constraints and military force.” In addition, ideas are the way that humans as thinking beings make sense of the world around them. Ideas “inform, create fuel for thought and inspire actions.”
Intellectual history is among the few fields that take the power of ideas and discourse seriously. We neglect that field at our peril.
Steven Mintz is professor of history at the University of Texas at Austin.