Essay on the nature of change in American higher education
Massive open online courses (MOOCs) have captured the nation’s imagination. The notion of online classes enrolling more than 100,000 students is staggering. Companies are springing up to sponsor MOOCs, growing numbers of universities are offering them, and the rest of America’s colleges are afraid they will be left behind if they don’t.
But MOOCs alone are unlikely to reshape American higher education. When history looks back on them, they may receive no more than a footnote. They do, however, signal a revolution in higher education that is already under way and will continue.
America is shifting from a national, analog, industrial economy to a global, digital, information economy. Our social institutions, colleges and universities included, were created for the former. Today they all seem to be broken. They work less well than they once did. Through either repair or replacement — more likely a combination — they need to be refitted for a new age.
Higher education underwent this kind of evolution in the past as the United States shifted from an agricultural to an industrial economy. The classical agrarian college, imported from 17th-century England with a curriculum rooted in the Middle Ages, was established to educate a learned clergy to govern the colonies. This model held sway until the early 19th century.
In the years before the Civil War, the gap between colleges and society grew larger. European higher education modernized, creating models that would inspire America to develop its own. Innovations, mostly small, were attempted; many failed. During and after the war, the scale of experimentation increased with the founding of universities such as Cornell and Johns Hopkins, and, a few decades later, the University of Chicago. Other institutions, such as Harvard University, remade themselves. The innovations spread. By the mid-20th century a new model of higher education for an industrial era had coalesced. It was codified in California's 1960 master plan, balancing selectivity with access and workforce development.
This transition brought new institutions that better met the needs of an industrializing America.
- An entity called the university was imported from Germany, with what would become a mission of teaching, research and service. It offered instruction in professions essential to an industrial society, organized knowledge into relevant specialties, and hired expert faculty in those areas. It not only transmitted the knowledge of the past, but advanced the frontiers of knowledge for the future.
- The federal government created the land-grant college to bridge the old agrarian America and the emerging industrial one. Now found in all 50 states, the land-grant college was designed to provide instruction in agriculture and the mechanic arts without excluding classical studies.
- Specialized institutions emerged. Some, like the Massachusetts Institute of Technology, were modeled on the European polytechnics; they promoted industrial science and technology and prepared leaders in these fields. Others, the normal schools, sought to provide more and better teachers as the evolving economy demanded more education of its citizenry.
- The two-year college — originally called a junior college, later a community college, sometimes Democracy’s College — was initially established to offer lower-division undergraduate education in the local community.
As these institutions emerged, the curriculum changed. Graduate studies were introduced. New professional schools in fields like engineering, business and education became staples. Continuing education and correspondence courses were added. Elective courses and majors arose. Disputation, recitation, and memorization, the teaching methods of the agrarian college, gave way to lectures, seminars, and laboratories.
The colleges that persisted adopted many of the era’s changes, and the classical curriculum largely disappeared.
This is the history of higher education in America. Change has occurred by accretion. The new has been added to the old and the old, over time, modernized. Change occurs with no grand vision of the system that the future will require. New ideas are tried; some succeed, many fail. By successive approximations, what emerges is the higher education system necessary to serve the evolved society.
Social change is a constant, and so is the need for higher education to adapt to it. When the change in society is deleterious, as in the McCarthy era, it is the responsibility of higher education to resist it and right the society. This adaptation is a natural process, almost like a dance. However, in times of massive social change like the transformation of America to an information economy, a commensurate transformation on the part of higher education is required.
We are witnessing precisely that today. MOOCs, like the university itself or graduate education or technology institutes, are one element of the change. They may or may not persist or be recognizable in the future that unfolds.
What does seem probable is this. As in the industrial era, the primary changes in higher education are unlikely to occur from within. Some institutions will certainly transform themselves as Harvard did after the Civil War, but the boldest innovations are likelier to come from outside or from the periphery of existing higher education, unencumbered by the need to slough off current practice. They may be not-for-profits, for-profits or hybrids. Names like Western Governors University, Coursera, and Udacity leap to mind.
We are likely to see one or more new types of institution emerge. As each economic and technological revolution creates new needs for higher education, unique institutions emerge to meet them. In the agrarian era, only a tiny percentage of the population needed higher education, and the college served these elite few. When industrial America required more education, more research, and mass access to college, two major institutions were established: the university and the community college.
The information economy, which requires a more educated population than ever before in history, will seek universal postsecondary education and is likely to create new institutions to establish college access for all at low cost. These institutions will operate globally, not locally, which will dictate a digital format. Because information economies emphasize time-variable, common outcomes — unlike the industrial era’s common processes and fixed times (think assembly lines) — universal-access institutions will offer individualized, time-variable instruction, rooted in mastery of explicit learning outcomes. Degrees and credits are likely to give way to competency certification and badges.
Traditional higher education institutions — universities and colleges — will continue, evolving as did their colonial predecessors. Their numbers will likely decline. At greatest risk will be regional, part-time commuter universities and less-selective, low-endowment private colleges, particularly in New England, the Mid-Atlantic, and the Midwest. The future of the community college and its relationship to the universal-access university is a question mark. It is possible that sprawling campuses will shed real estate in favor of more online programs, more compact learning centers and closer connections with employers and other higher education units.
In this era of change, traditional higher education — often criticized for low productivity, high costs, and limited use of technology — will be under enormous pressure to change.
Policy makers and investors are among those forces outside of education bringing that pressure to bear. It’s time for higher education to be equally aware and responsive.
Arthur Levine, a former president of Teachers College, Columbia University, is president of the Woodrow Wilson National Fellowship Foundation.