We live in a society acutely conscious of age. Ours is also an intensely age-segregated society, one that denigrates whole groups of people based on their age.

Age can be a milestone and a guidepost—nothing wrong with that. But age can also play a much more negative role. It can be exclusionary: an artificial barrier that separates one age group from another. It can also be coercive: telling you what you can or can’t do based on your age. We tell people that they’re too old to continue working or to live alone.

Between the colonial era and the late 20th century, age became the dominant instrument for organizing society. Even more than gender, it became the chief way to understand the process of maturation and to allot legal status and categories of responsibility. In recent years, however, age has lost some of its prescriptive power, and colleges, as well as other social institutions, need to adapt to this pivotal development.

It was only beginning in the middle of the 19th century that Americans became preoccupied by age. Over the past century and a half, age became institutionalized, rigidly dividing age cohorts from one another.

Before the mid-19th century, age was not an especially significant marker in American society. It comes as a bit of a shock to discover that many Americans a century and a half ago had no idea of their birth date and didn’t celebrate their birthdays. Some school classrooms contained children as young as 2 and as old as 25. Colonial colleges admitted students as young as 14.

In colonial and early-19th-century America, as the historians Joseph F. Kett and Howard P. Chudacoff have shown, the language of age was vague. Infancy was the period when a young girl or boy was under a mother’s care, a period that typically stretched from birth to the age of 5 or 6. The word “child” applied to those as young as 2 or 3 and as old as 10, 11 or 12. Youth encompassed the age range from 10 or 12 to the mid-20s. Size and strength mattered more than chronological age.

Educators took the lead in identifying age as a crucial way of organizing schools, creating the first age-graded classrooms during the 1840s. “Child savers”—reformers committed to child protection and uplift—upset by the indiscriminate mixing of ages in almshouses and jails, created a host of specialized institutions for the young: orphan asylums, Sunday schools and houses of refuge. Around the middle of the century, the first children’s hospitals appeared, as did a branch of medicine focused on children and their diseases, pediatrics.

Meanwhile, over the course of the 19th century, judges, attorneys, legal scholars and legislators developed a host of new concepts—including the tender years doctrine, the best interests of the child standard and in loco parentis—that treated childhood as a distinct legal category. The legal system also identified status offenses that only applied to the young (like truancy). It established new legal arrangements, including adoption; raised the age of consent and marriage; and imposed the first compulsory school attendance laws and restrictions on childhood labor. New legal institutions, like the juvenile reformatory and the juvenile court, emerged. Each of these developments made age more salient than it had been before.

The late 19th century witnessed the rise of the child study movement, a campaign by educators, educated mothers and pioneering child psychologists to closely observe and scientifically study children’s development. The effect was to identify age norms and create new age categories.

Especially influential was America’s first psychologist and the man who brought Freud to the United States, G. Stanley Hall. In 1904, he popularized a new age category in a book entitled Adolescence. During the 1920s, the first child guidance clinics appeared to tackle the problems posed by nondelinquent youth, like moodiness and rebelliousness. At the same time, developmental psychologists and pediatricians like Arnold Gesell established age norms, expectations about how young people at certain ages are supposed to behave. Infants and toddlers were either on or off schedule. Age-linked generalizations, like the “terrible twos,” appeared.

From Hall and Freud to Piaget and Erikson, age became central to various theories of children’s physical, emotional and intellectual development.

The rapid spread of high schools in the early 20th century helped make the peer group the chief way that young people socialized. At the same time, age became a crucial legal category, defining when one could work, drink, smoke, marry, enter the military and even have sexual relations.

At first, age categories were largely applied to the young. But soon age was applied to the elderly, generally in a pejorative way.

  • During the 19th century, attitudes toward the elderly shifted from respect for their experience and wisdom to disrespect and hostility.
  • Old age became associated with debility, dependency, disease, degeneration, frailty and a lack of adaptability, evident in disparaging, derisive words and phrases like “geezer” or “old fogie.”
  • The term “senile” began to be used to denote mental deterioration.

Old age, like childhood and youth, was gradually institutionalized. The early 20th century saw the emergence of retirement as an expected stage of life. Pensions began to proliferate, as did old age homes.

Advertisers were especially significant in establishing age norms. The most notable example is the emergence of the “toddler stage” as a creation of department stores that in the 1930s were seeking ways to expand sales of clothing and other children’s goods. Marketers subsequently identified other categories, like teenyboppers and tweens, along with distinct categories of goods, like young adult fiction.

From the 1950s into the 1980s, colleges and universities played a pivotal role in reinforcing a generational divide. Their focus on the traditional college-age population extended age segmentation from the teens into the 20s.

In recent years there has been a breakdown of age norms, as growing numbers of Americans refuse to “act their age.” Middle-class children have grown more knowledgeable and socially and electronically connected. Young adults have delayed many of the traditional trappings of adulthood. And many seniors continue to work and remain physically active well after the traditional age of retirement.

The breakdown of firmly entrenched age norms is especially evident on broad-access college campuses, where nontraditional undergraduates over the age of 25—including veterans, parents and working adults—make up an increasing share of the college-going population. Unfortunately, most four-year institutions have not adapted sufficiently to this new reality. It’s been online institutions, not brick-and-mortar campuses, that have done the most to adjust their schedules, course delivery modes and curricula to accommodate these students’ needs.

Today, it is common to think of age segregation as “natural” and a product of personal preference. We take it for granted that people want to hang out with others of their own age. In fact, however, age segmentation is not a timeless reality; it is, as such historians as W. Andrew Achenbaum, Corrine T. Field, William Graebner, Nicholas L. Syrett and others have shown, a product of the era of the Industrial Revolution.

Nor is age segregation benign. Age segmentation fosters distrust, stereotypic thinking and cross-generational misunderstanding. It accentuates competition over public priorities: whether public resources should be devoted to Social Security, Medicare and services to seniors or to education and childcare. The growth of age consciousness was accompanied by the growth of ageism: disparaging groups of people based on their age—from adolescents to the elderly.

Age segregation is not inevitable. Nor is it intrinsically desirable.

We must find ways to promote generational equity in the distribution of resources and seek to bridge the generational divide.

Colleges and universities need to step up to the plate and assume a special role and responsibility for mitigating age segregation. There are many ways to do this, but certainly a first step is to enroll more older students, whether these are transfer students, veterans, stop-outs or adults who are eager to acquire a degree, upskill, retool or simply learn. Invite more adults, including alumni, to campus to share practical tips and advice.

I think our campuses will discover that the presence of more nontraditional students can do a lot to alleviate some of college life’s worst features, including a juvenile culture of extended adolescence whose consequences include excessive drinking and a lack of academic seriousness. Confining a single age cohort to our campuses has reinforced immaturity in thought and conduct.

Over the course of the past two centuries, rigid age categories became integral parts of what Max Weber called modern society’s iron cage—the dehumanizing system of bureaucratic organization, rational calculation, institutionalized power and economic efficiency that traps individuals and prevents them from reaching their full potential. Today’s colleges and universities should play a role in breaking free from the iron cage.

Ryan Craig, among the shrewdest and most perceptive observers of higher education, recently wrote a pointed critique of High Point University entitled “When the College of Last Resort Becomes a Resort.” High Point is perhaps best known for its luxury amenities: its manicured grounds; Doric columned, cupola-capped buildings; high-end steak house and car wash.

But let’s not delude ourselves. High Point is only the most extreme example of a campus vision that too many colleges and universities aspire to: a sort of Club Med or summer camp or Disneyland for those in their late teens and early 20s, distinguished not by a serious and demanding intellectual or cultural and artistic life but by its comforts, physical beauty and services.

The alternative, we often hear, is a more practical, applied, pre-vocational or career-focused education. Surely, we can define an alternative vision: a learning-centered institution that is developmental across multiple dimensions, that has a transformational goal that goes beyond job training and that doesn’t confine late adolescents and young adults in a bubble, but rather strives to integrate them into adult society.

Steven Mintz is professor of history at the University of Texas at Austin.
