
I don’t know what you think, but I’m convinced that college faculty members ought to know much more than they do about the history and distinctiveness of American higher education. I think it’s fair to say that very few professors and virtually no undergraduates know much at all about how colleges and universities have changed over time or how they differ from their foreign counterparts.

Why does this matter? Because ignorance and insularity breed complacency and narrowness. Those who lack historical or comparative perspective can’t effectively defend the defining features of American higher education or debate which characteristics might be jettisoned, modified, reformed or improved.

I’m well aware of the many ways that the notion of American exceptionalism has been abused and misused. Too often, the notion of national singularity contributes to arrogance, smugness and self-congratulation. But there are many ways that American higher education is unique—for good and ill.

I have no illusions that most faculty members will ever read extensively about the history of American higher education. But there are certain facts that I think they should know. Let me offer a few examples of the ways that American colleges and universities differ from their international counterparts.

First of all, American higher education is distinctive in the sheer number of colleges and universities—over 4,000. It’s also unique in the variety of institutions. It encompasses two- and four-year schools, public and private institutions, secular and religious colleges, residential and commuter institutions, research and liberal arts campuses, technical institutes, military academies, tribal colleges, HBCUs, fully online institutions, and more.

The United States is also distinguished by the share of the population that attends college, which now exceeds 70 percent of recent high school graduates, and by the share of students who transfer from one institution to another—over 40 percent. But the country is also striking in the share of college goers who drop out, a figure exceeding 30 percent overall and, at many institutions, exceeding 40 percent. Then there’s the variety of students who seek higher education, including many working adults and family caregivers.

Then there’s cost. Even public colleges and universities generally cost at least twice as much as their international counterparts. A big reason: the size, scope and scale of campuses’ administrative and nonteaching staff, the multiplicity of functions these institutions serve and the range of services they offer, and the number of subunits they encompass.

It’s hard to imagine a brick-and-mortar campus elsewhere with the extensive services that American colleges and universities provide: career services, disability services, health services, psychological services, technology services, transportation services, housing, dining, tutoring, food pantries, student life offices and much more—all of which seek to fill gaps in this country’s social services net or make the student experience more immersive and supportive.

Nor can one easily envisage a serious university without centers for basic and applied research, technology transfer, and entrepreneurship acceleration, functions that, in many countries outside the United States, are assigned to specialized research centers and institutes that exist independently of universities. 

The distinctiveness of American higher education also extends to the curriculum, with its staunch commitment to a lower-division liberal arts core and its extensive electives intended to maximize student options.

Then there’s the distinctive American emphasis on student life: on dorms and dining halls, fraternities and sororities, intercollegiate athletics, and campus-sponsored extracurriculars, including a host of clubs and student organizations. These are elements that generally have no counterpart elsewhere, but that greatly contribute to a college’s popularity.

Also, I should add, the United States differs from many other countries in its faculty-hiring practices and especially in the belief that faculty should be hired in a highly competitive search process involving a nationwide (or even broader) pool of applicants.

Especially noteworthy is the ever-increasing breadth of the college curriculum, as colleges add new majors and fields of study, now encompassing cannabis studies, computer security, game design, health informatics, human-computer interaction and nanotechnology.

Whereas the colonial colleges, like their English counterparts of the time, sought to educate gentlemen and leaders in various realms of life, and the Humboldt-inspired German universities sought to train civil servants, administrators and scholars, American institutions’ purpose has long been quite diffuse. In addition to training those aspiring to enter the learned professions, from early on these schools also trained businesspeople, the new professionals needed by an industrial society (like accountants, architects, chemists, engineers and managers) and those who would enter the helping professions and the arts.

Then there’s one other distinctive feature of American higher education that mustn’t be ignored: its highly stratified, hierarchical nature, which has grown increasingly important as talented students have become more willing to travel long distances to attend a more prestigious college or university. It’s not just that the American higher ed landscape is more status conscious, but that the institutions with the highest reputations have limited admissions rather than expanded in response to population growth and an increase in applications.

It’s also a tiered system that is highly inequitable in its allocation of resources and in the composition of the student body. Those students with the greatest financial need are concentrated in the least-resourced institutions, which are all too often unable to provide the academic and nonacademic supports that these students need to graduate.

The exceptionalism of American higher education is, of course, a product of a distinctive history. From the early 19th century onward, as the great historian and sociologist David F. Labaree has shown, American colleges—which were often founded for noneconomic reasons by religious denominations and local boosters—were enmeshed in a competitive marketplace, vying for students, faculty, resources and reputation. The result: these institutions were more responsive than their foreign counterparts to market pressures, which influenced their admissions practices, their curriculum, their emphasis on campus life and, in many instances, their early embrace of coeducation.

These institutions’ tuition dependence, in turn, gave students unusual power. From the mid-18th century onward, this power was reflected in various campus upheavals and protests, beginning with Harvard’s 1766 Butter Rebellion. In the early 19th century, some of the protests turned violent (a Harvard professor lost an eye, and a student murdered a University of Virginia professor), and the 1820s saw the appearance of extracurricular organizations (including literary and debating societies and the first fraternities) controlled by the students themselves. By the century’s end, institutions had succeeded in suppressing much of this unrest, combining sticks, like letter grades, with carrots, evident in the colleges’ embrace of intercollegiate athletics, liberalization of requirements and curricular expansion.

American colleges and universities also enjoyed a greater independence from state control, establishing a principle of institutional self-governance and internal shared governance that has persisted, despite repeated challenges, to this day.

It’s certainly the case that precedents established over 200 years ago continue to shape higher education today. These include the four-year bachelor’s degree, the combination of living and learning, the fairly rigid division between town and gown, and strong presidents and boards of trustees.

Yet change has been as important as continuity in American higher education’s history: including the sharp growth of public institutions; the demise of the classical curriculum; the emergence of the modern research university; the expansion of graduate and professional education; the introduction of departments, majors, electives, credit hours and gen ed requirements; and, more recently, the sharp decline of the small liberal arts college.

An awareness of higher education’s history is filled with lessons that we ignore at our peril. For example, academic tenure and academic freedom are relatively recent developments. As recently as 1960, even many elite institutions hadn’t yet adopted tenure. Far from falling like manna from heaven, tenure arose initially as a way to improve the quality of the professoriate and to attract talent at a time when institutions were struggling to keep up with rapidly expanding enrollments and to improve their competitive profile.

Today’s drift away from tenure—evident in the sharp increase in the number of adjuncts and the increased reliance on postdocs, visiting faculty, lecturers and professors of practice—is in part a response to resource constraints and the “overproduction” of Ph.D.s, but also to the failure of tenured faculty to insist on adequate course staffing, including their willingness to embrace very large lecture classes without substantial faculty-student interaction. The fact is that the share of institutional budgets spent on instruction, between roughly 20 and 30 percent, has remained constant over time.

Perhaps our current campus priorities make sense, but if enhanced student learning is our goal, the instructional budget must increase—as must the faculty’s responsibility for student success.

One of American higher education’s historic legacies is the notion that American colleges and universities should be able to define their mission, admissions standards, curriculum, requirements and accountability measures independently, subject only to oversight by accrediting agencies whose goal is to ensure that institutions meet peer standards and engage in a process of self-evaluation and continuous improvement.

A commitment to institutional autonomy was a bedrock principle of American higher education even before the Supreme Court’s landmark 1819 Dartmouth College v. Woodward decision, but it has had negative as well as positive consequences. For example, it has produced barriers to transfer that cause a substantial loss of credits for those who move from one college to another. It has also made it difficult to hold to account nonprofit institutions with exceedingly low graduation rates.

The history of American higher education can be viewed from a variety of conflicting perspectives, whether by scholars or the broader public.

This history is sometimes understood as a Whiggish story of progress—as elite education gradually gives way to mass higher education and then to near-universal higher education, and as the curriculum has expanded in scope, rote learning has declined and the professoriate has become more specialized and research-focused.

Many of higher ed’s critics, conversely, regard this history as a story of decline—as an intellectually serious education focused on character development has been diluted by grade inflation, faddish and trendy courses, and shrinking reading and writing requirements, and as college has allegedly become a five- or six-year holiday from adulthood that pampers and indulges students and wastes resources on an ever-increasing number of services and country club amenities.

There are other ways to conceptualize American higher education’s history. Some scholars treat this history as a gradual, incremental, evolutionary process, while others stress disjunctive change, as the intensely hierarchical, white male–only colonial colleges and their classical curriculum gave way to radically different institutions.

Other scholars emphasize the transformational role of reformers, innovators, philanthropic foundations and donors and the ideas they advocated, while still others focus instead on the ways that colleges and universities have been incentivized and pressured to adapt to an evolving capitalist economy, in which academic credentials loomed ever larger and colleges, depending on their level of resources and selectivity, have taken on the role of socializing their student body for membership in a particular social and occupational niche.

Then there are still other approaches to higher ed’s history. One focuses on higher education’s long history of exclusion, discrimination and inequality, apparent in vast disparities in institutional resources and in the demographic and socioeconomic makeup of today’s colleges and universities. This story also stresses elite higher education’s role in promoting various, often illiberal, ideologies, from pseudo-scientific racism to eugenics to neoliberalism.

But I think it’s best to think about the history of American higher education as a story of conflict, struggle and contestation. This history is an ongoing battle to define the fundamental purpose of college and the best ways to fulfill that purpose.

Is our goal to produce well-rounded graduates who have been exposed to the liberal arts and who will be knowledgeable and responsible citizens? To provide a coming-of-age experience and help adolescents make the transition to an independent adulthood? To ignite students’ interests, passions and intellect and help them develop a philosophy of life? Or, as many legislators believe, to contribute to human capital formation and economic development?

You might well respond, it’s all of the above and more, and I’d basically agree. But then we need to ask whether we are actually accomplishing those goals. Are we graduating students with writing and oral communication proficiencies and the cultural, scientific and social science literacies we expect of a bachelor’s degree holder? Are we producing career-ready graduates? Are the educational and other campus experiences we offer adequately preparing students for postcollege life?

I, for one, don’t think so. If those are our goals, we need to rethink the education we offer, rebalance our institutional priorities, re-engineer our administrative processes and support services. We need to:

  • Reaffirm a commitment to educating the whole student, not just academically but in the other ways that matter—physically, morally and socially.
  • Become more accountable for student outcomes, both in terms of knowledge and skills and postgraduation employment.
  • Help students develop more mature interpersonal relationships; acquire resilience in the face of setbacks; cope with anxiety, stress and pressure; and define a sense of purpose and direction in life.
  • Do more to prepare students for the 21st-century economy and society, which demands much higher levels of facility with data, diversity and information (as well as misinformation) than in the past.
  • Enhance access to the high-impact learning experiences that involve active, experiential and project-based learning and that give students opportunities to apply skills to authentic problems, preferably in real-world settings.
  • Produce graduates who are conversant with and thoughtful about the defining issues of our time, including those that involve equity, identity, language and power, who are globally aware and who have at least a basic level of understanding of the frontiers of science, social science and humanistic inquiry.

Given the very large number of transfer students and gross inequities in the distribution of institutional resources, I believe it’s imperative that American higher education act more like an integrated system. For example, to combat credit loss, campuses need to agree on the competencies that students need to demonstrate for credits to transfer and ensure that these apply to major as well as gen ed requirements. To better support underfunded institutions, wealthier campuses need to share instructional tools and content and welcome neighboring academics to their lectures and libraries and archives. Cross-campus collaboration, including some course sharing, also makes sense.

A famous line in Shakespeare’s Tempest reads, “What’s past is prologue.” That’s certainly true: the past has set the stage for all that’s to come. The past’s dead hand can, of course, be an impediment to change. Past decisions and precedents can constrain and manipulate us from the grave.

But if history can be a burden, it can also be a goad and an inspiration. Whatever else the history of American higher education has been, it’s also a great democratic story—a story of increasing access and inclusion, of broadening fields of study and of highly productive and dedicated teacher-scholars spreading across the entire higher ed landscape. Digitized and online resources mean that students and scholars everywhere can retrieve information once confined to the best-funded research libraries.

But the next steps in democratizing American higher education lie ahead. It’s up to us to bring many more undergraduates to academic and postgraduation success, especially in the challenging fields that will shape the future. Several chapters in the history of American higher ed may have ended, but the story is far from over. It’s up to us, to you, to write it.

Steven Mintz is professor of history at the University of Texas at Austin.
