
Recently, a leading editor at a major academic press said bluntly that she wasn’t aware of any breakthrough scholars in U.S. history under the age of 50. In response, I named several younger scholars I greatly admire. The editor wasn’t moved.

For the sad facts are these: the major U.S. history journals are receiving fewer submissions, and fewer of their published articles have had significant impact on the field. Leading younger U.S. historians are much less prolific than their predecessors. Not only are fewer new subfields of U.S. history opening up, but there have been very few radical reinterpretations of major issues or topics. When, after all, have you or I read a fresh interpretation of the Revolution, the age of Jackson, the Civil War, the Gilded Age, the Progressive Era or the New Deal—or even of slavery, reform or family, immigration, legal or urban history?

This brings to mind the apocryphal quote attributed to Henry L. Ellsworth, the first commissioner of the U.S. Patent Office, that “the work of the patent department must soon come to an end, because the inventive power of the human mind had reached its limit.”

Have we, in some sense, reached the end of (U.S.) history? Has the discipline exhausted itself? Has the time come for the field’s scholars to move in a very different direction?

This wouldn’t be the first time a scholarly discipline has disappeared, undergone significant transformations or merged with other disciplines. Examples include natural philosophy and cybernetics—not to mention those archaic fields now dismissed as especially unscholarly, like alchemy, astrology, eugenics and phrenology.

My fellow U.S. historians might well respond that historical inquiry is limitless and dynamic, not static; that previously neglected topics need to be re-examined and re-interpreted; that eventually new subfields will inevitably emerge; that even fields of study that seem well explored can yield fresh insights as new theoretical frameworks and new evidence arise.

Perhaps. Still, research productivity appears to have slowed even though there are more highly trained academic historians than ever before.

In a recent essay in Aeon, Rachael Scarborough King, a professor of English at the University of California, Santa Barbara, and Seth Rudy, an associate professor of English at Rhodes College, suggest that the time has come for certain “knowledge projects to confront their ends.” After all, it’s not enough for U.S. historians to accumulate more information on previously neglected topics. That’s merely scholasticism and antiquarianism. If these historians fail to revise our broader understanding of the past, if they merely regurgitate earlier interpretations using the language of today, then U.S. history as a distinct scholarly project has truly reached its end.

Perhaps, then, we are indeed witnessing (U.S.) history’s end. This disturbing thought brings to mind F. Scott Fitzgerald’s observation about his generation: “grown up to find all Gods dead, all wars fought, all faiths in man shaken.”

Rudy and King's edited volume, The Ends of Knowledge: Outcomes and Endpoints Across the Arts and Sciences, argues that humanities disciplines need to rethink their aims and ask themselves, frankly and forthrightly, whether their existing purpose is now complete. Has the time come to close up shop? After all, “The disciplines as we currently occupy them are artefacts of the 19th-century origins of the research university.”

U.S. history may not be the only field experiencing a period of torpor, unproductivity and an apparent lack of progress. King cites a number of other examples, including analytic philosophy, cultural studies and even her own field, literary studies, which may help explain the drift toward scholarly activism. These fields of study are institutionally entrenched—after all, they serve as administrative units, are “reified by architecture and [campus] geography,” and are maintained by various graduation requirements. But that may not be enough.

Since 2012, when I first arrived in Austin, my campus, arguably the nation’s wealthiest public university, has shrunk the history department from 80 professors and instructors to just 54 today—about a one-third decline. When the baby boomers finally retire, we might well be far smaller.

Universities have responded to the insularity of existing fields of study by creating a host of interdisciplinary centers and programs. Area studies and diasporic, ethnic, gender and sexuality studies are among the most obvious examples. But as King quite rightly observes, “such efforts are frequently additive rather than interactive.” I don’t see any evidence that such programs have made my campus more integrated or multidisciplinary.

George Eliot begins Middlemarch’s conclusion with these evocative words: “Every limit is a beginning as well as an ending.” Perhaps the end of U.S. history, as I knew it, can be “a launchpad for new ideas, methods and paradigms.” King calls for “new systems and organisations of knowledge production” that are “reoriented around emergent ends rather than inherited structures.” I’m certainly sympathetic to the view that we need to organize knowledge production in ways that are more integrative and synergistic—as well as more self-critical and self-reflective and more focused on ethical issues.

I can’t tell you what form that history might take. But I can suggest some possibilities. These possibilities will require U.S. historians to shed the insularity that has, since the 1970s, severed much of U.S. historical scholarship from anthropology, demography, economics, political science and sociology. See, for example, differences in how historians and economists explain the causes and severity of the Great Depression or the economics of slavery.

It’s past time, I am convinced, for U.S. historians to once again embrace social science history, with our muse, Clio, in dialogue with her sister muses.

Let me suggest an example of what I mean. U.S. historians are especially well positioned to speak to issues involving contemporary childhood: how child rearing, family life, play, schooling, peer relations, media and product consumption have changed over time, and what the behavioral, emotional and psychological consequences of those shifts have been.

I myself am especially interested in what historians might contribute to an understanding of the surge in diagnosed psychological and behavioral disorders among children and of the variation in rates across cultures. However, if historians are to speak with authority about such issues as the increased diagnosis of anxiety and depressive disorders, of autism spectrum and attention-deficit/hyperactivity disorders, or of reading, writing and mathematics disorders (including dyslexia and dyscalculia), then their training needs to become much more cross-disciplinary.

This country has a propensity to treat conditions such as autism or attention deficit disorder as essentially neurological, genetic or biological, even though the underlying causes remain obscure. I’d argue that even if these conditions’ origins are, at root, bio-neurological or genetic, the ways that these disorders develop, manifest themselves and are diagnosed, managed and treated are certainly influenced by social, cultural and environmental factors—the factors that historians study.

Currently, there is strong resistance to alternative explanations among many practitioners, arising, at least partly, out of a concern that social and cultural explanations are reductionist and might cast blame on parents. This concern doesn’t arise out of nowhere. It grows out of facile and profoundly pejorative explanations in the past that blamed “refrigerator mothers” for autism or maternal overprotection (“momism”) for producing sons who were tied to their apron strings, contributing, supposedly, to homosexuality, schizophrenia and identity diffusion.

Yet I am convinced that the medical, psychological and brain sciences would benefit enormously from taking historical research into account. I have been struck, for instance, by historical scholarship indicating that U.S. and Swedish researchers and practitioners have long understood and treated dyslexia differently, with American authorities stressing brain dysfunction and their Swedish counterparts emphasizing brain plasticity.

Somewhat similarly, Matthew Smith, the leading authority on the history of attention deficit hyperactivity disorder, asks why biological explanations and drug treatment are preferred in the United States and why alternative explanations, focusing on diet, educational practices and other variables, have failed to achieve legitimacy.

Then there is a recent shift in the understanding of the pronounced increase in allergies, especially among young children, with medical researchers increasingly recognizing the crucial role of epigenetics: how environmental changes—a historical topic—influence gene expression.

My takeaway: diagnosticians and treatment providers need to supplement neuro-biological and genetic explanations by also viewing these conditions through the social, cultural, dietary, environmental, cognitive and developmental lenses to which historians can contribute.

Let me suggest another area where historians might contribute to social scientific understanding: by giving a historical dimension to the study of social and cultural psychology. Social cognition—how people form impressions and make inferences—changes over time. Ditto for how attitudes and identities are formed, change and influence behavior. Group dynamics, including prosocial and aggressive behavior, as well as attitudes toward conformity, compliance, obedience and social norms, have also undergone far-reaching shifts. However, the historical study of such topics will only be meaningful if historians actively engage with colleagues outside their discipline.

I was recently asked by a journalist at The Atlantic to comment on a recent trend among 20-somethings to live alone, as opposed to marrying, cohabitating, living with roommates or returning to their childhood home.

Should we think of that trend as the sociologist Eric Klinenberg and the psychologist Bella DePaulo generally do—as liberatory? Or should we view this trend more negatively, as physically, socially and psychologically isolating? Is this trend a manifestation of a cultural drift toward a hyperindividualism that has been accompanied by a diminished capacity for intimacy, empathy and engaged social interactions?

Inquiring minds would like to know.

We must, of course, be cautious in making any generalizations. To paraphrase Twelfth Night, some young people are single by choice; others have singleness thrust upon them. Intersectional identities, rooted in class, ethnicity, gender, race and sexuality, render any sweeping statements problematic. For instance, Klinenberg’s important and influential 2012 book Going Solo focuses primarily on the upper reaches of the middle class, not on the poor or marginalized who experience singleness less as a benefit than as a burden.

As I told The Atlantic’s journalist, my research suggests that we may be seeing an increase in autistic-like behaviors on a societal level. I make that claim cautiously, fully aware that it can easily be misconstrued. But it does seem to me that many of my students find face-to-face communication difficult and uncomfortable. A significant number are anxiously “living inside their heads.” Rather than feeling free from arbitrary obligations and unyielding parental or cultural expectations, all too many are experiencing disconnection, loneliness, social isolation and anomie—which has, in turn, contributed to a variety of dysfunctional coping strategies, including social media and video game addiction and reliance on prescription and illicit drugs.

I speculated that these behaviors may well be related to the ways that parents raise their children in today’s hyperindividualistic, highly commodified postfamily society. Ours is a society in which many more children than in the past are growing up without siblings. More of their time is spent alone and much less time is spent in free, unstructured, unsupervised outdoor group play. More of their social interactions are technology mediated. Their school experiences are more stressful and less joyful. Parents and children alike have grown more risk averse. These trends, in turn, may well be intensified by the growth of social media, streaming and the influencer economy.

I am well aware of the risks and unintended consequences that might accompany a shift toward a more explicitly social science history. This might well intensify the disturbing trend toward ignoring the more distant past. There’s also a danger that a heightened focus on theory, structures, systems, institutions, processes, trends, organizations and movements might lead historians to downplay the “pastness” of the past and the forms of cultural expression that help us understand how our predecessors understood their world and articulated their identities.

But change inevitably entails trade-offs. If U.S. history is to remain relevant, it must do more to connect the past to the present in meaningful ways; to address big conceptual, theoretical and ethical questions; to place U.S. history in comparative perspective; and to engage students more actively in the learning process with activities and assignments that strike them as important, pertinent and meaningful.

It’s not enough, in my view, to apply social science theories, models and methodologies to past events—even though I certainly favor using historical data to test and refine economic, political science and sociological theories.

All of us who teach U.S. history do address topics that speak to the social sciences, from the treatment of Indigenous peoples to slavery and its legacies, expansion and its connections to American nationalism and imperialism, labor and the rise of industrial America, the struggles of various groups for civil rights, social movements and cultural change, and economic inequality and social justice. But let’s also produce a history that is more comparative and more attuned to theoretical issues, even at the expense of narrative and detailed description.

History provides the empirical grounding essential to understanding change over time. As a result, the discipline can contribute significantly to our understanding of the origins, dynamics, patterns and processes of social, cultural, institutional and political change and of shifts in social practices, beliefs and relationships. We may be living through the end of a certain kind of history, but the contours of a new history that is less blinkered, narrow, detached and atheoretical are, I believe, poised to emerge.

Steven Mintz is professor of history at the University of Texas at Austin.
