
I, perhaps like you, am a sucker for articles with titles like “14 spectacularly wrong predictions” or “Wrong again: 50 years of failed doomsday predictions” or “Oops! Failed predictions from history.”

In 2013, in The Chronicle of Higher Education, I identified 15 innovations that were likely to transform the higher education landscape. Now, nine years later, it’s high time to look backward and see where I was right and where my crystal ball proved cloudy and distorted.

In that essay, entitled “The Future Is Now,” I argued that profound transformations reshaped the higher education landscape at roughly 50-year intervals. These included:

  • The first stage in the democratization of higher education, with a proliferation of small colleges, founded by religious denominations and local boosters, and the appearance of the first public universities in the early 19th century.
  • The emergence of the earliest alternatives to the classical curriculum and of the first federal support for higher education with the Morrill Act, accompanied by a growing number of courses in agriculture, modern history and foreign languages, the natural and social sciences, and technology.
  • The late-19th-century rise of the modern research university, of college majors and elective courses, and of “new” professional schools in architecture, business and engineering.
  • The Progressive Era emergence of the Wisconsin Idea, that public universities should serve the public, along with the development of extension services and junior colleges.
  • The post–World War II transformation of normal colleges into regional public universities, the end of legal segregation of public higher education in the South, the advent of state and federal financial aid, and the sharp increase in federal support for university-based research.

If that pattern persisted, then the 2010s, I thought, would witness yet another era of transformation. The decade certainly did, but not necessarily for the reasons or in the ways that I imagined.

My basic argument was that a series of long-term developments—demographic, economic and technological—would fuel or foment transformation. These included the need to:

  • Tap new sources of revenue to meet the ever-rising costs of new programs, information technology, student life and support services, utilities, facilities maintenance and more.
  • Better serve the growing number of nontraditional students, whether working adults, family caregivers, part-timers, commuters, first-generation college students or students with disabilities.
  • Compete with the online for-profit and nonprofit providers who threatened traditional institutions’ monopoly over credentialing, including at the master’s level.
  • Exploit the potential of digital technologies to control costs, serve more diverse student markets, raise completion rates and improve student learning and employment outcomes.

I also argued that among the most significant drivers of change was a mounting political challenge: the argument that graduation rates were too low, that levels of student engagement and learning outcomes were unacceptably poor, and that a college education did not provide good value for the money.

All that was true, but in one respect I was wrong, or, if not wrong, premature. I was convinced that even then, students, in growing numbers, were embracing or poised to embrace faster and cheaper alternative paths to attainment, through providers such as MOOCs, boot camps and various skills academies.

So what were the transformations that I thought lay ahead?

1. E-Advising

At the time, I was thinking largely about predictive analytics and course recommendation tools, like Austin Peay’s Degree Compass, Purdue’s Course Signals and the Bill & Melinda Gates Foundation’s InBloom, a $100 million initiative to aggregate student data. It turned out instead that the future lay in data-driven advising. Georgia State would serve as the model to emulate: monitoring student engagement, sending out automated warnings and signaling faculty and academic advisers about impending trouble, thus helping to ensure that students remained on a path to graduation.

2. Evidence-based pedagogy

I was convinced that higher education was poised to adopt insights from the learning sciences and would place a greater emphasis on learning objectives, mastery of key competencies and assessments closely aligned to learning goals. I also thought instructors would adopt more social learning, more active learning and more real-world assessments. Certainly, many instructors did incorporate more evidence-based practices into their teaching. Nevertheless, the instructor-centered classroom, and the lecture, the seminar and the cookie-cutter lab, remain instructional mainstays.

And yet, I do think that the long-term trend is toward more inquiry-, case-, project- and team-based learning and more experiential learning, including more applied learning, service learning, field-based learning and maker spaces.

3. The decline of the lone-eagle approach to teaching

I thought, mistakenly, that we’d see much more resource and course sharing, along with a greater embrace of collaboratively developed interactive courseware, simulations and virtual labs. To be sure, instances of team teaching persist, but resistance to a more collaborative approach to course development remains more intense than I expected.

4. Optimized class time

When I wrote in 2013, the flipped classroom was still an emerging idea. Yet despite the efforts of figures like Harvard’s Eric Mazur, the earlier model, in which the instructor-centered classroom is supplemented by various kinds of homework, remains dominant.

5. Seamless credit transfer

Given the growing attention to the student swirl—the movement of students from one institution to another—and the expansion of access to Advanced Placement courses and the emergence of early-college/dual-degree programs, I thought, again in error, that we’d see a much stronger embrace of efforts to make credit transfer automatic, not only toward gen ed but toward degree requirements. Despite pioneering models, including the Interstate Passport and CUNY’s Pathways program, barriers to credit transfer, of course, remain.

6. Fewer large lecture classes

Whew, was I mistaken. I thought colleges and universities would follow the example of medical schools and adopt new ways to offer foundational courses, for example, by developing self-paced, self-directed introductory courses, or competency-based modules, or adopting wholly new online or hybrid formats. This hasn’t happened yet.

7. New frontiers for online learning

Here, I was referring to more collaborative learning (along the lines of the c-MOOCs, which create communities of inquiry surrounding a topic of interest), immersive learning environments (modeled on Second Life), hands-on simulations and serious games. Innovations like these always seem to lie five years in the future.

I also thought that many more instructors would quickly embrace approaches to assessing student learning beyond the traditional research paper, lab report and exam. Some have. There are a growing number of examples of learning assessments based on digital stories, collaboratively developed class websites, student-written annotated texts and encyclopedias, and multimedia projects like virtual tours or podcasts. But this frontier remains, to my regret, far too barren.

8. Personalized adaptive learning

I was dazzled by the prospect of tailoring education to better meet individual student needs. I thought by now we’d have many examples of interactive courseware that provides personalized learning pathways, customized content and embedded remediation and that adjusts pace to students’ learning needs.

It turns out that developing personalized adaptive learning tools is far harder than I thought, and the demand for such tools hasn’t grown as rapidly as I expected. This, I suspect, is an area whose time will come.

9. Competency-based learning and credit for prior learning

I thought that pressure to accelerate time to degree, better measure student learning, and place a greater emphasis on student skills and learning outcomes would lead to an embrace of a competency-based approach that allowed students to advance based on their ability to demonstrate mastery of a particular skill or competency. True, most institutions do offer credit by examination, but that wasn’t what I meant.

It turns out that despite isolated efforts like the American Historical Association’s Tuning Project, U.S. colleges and universities, accreditors and scholarly societies have not sought to follow the example of Europe’s Bologna Process, which has resulted in a series of international agreements to ensure course quality and credit transfer.

10. Data-driven instruction

I thought that by now instructors (and students) would have ready access to data dashboards that would make it easy to track student engagement and areas of student confusion and therefore allow faculty members to focus instruction to better meet student needs and to improve courses over time.

I also thought department chairs and executive committees would have the information needed to conduct equity audits, exposing to scrutiny variations in grading, withdrawal rates and performance in subsequent classes.

The tools to embrace data-driven instruction already exist, but in the absence of pressure to make use of these tools, practices are unlikely to change.

11. Aggressive pursuit of new revenue streams

This has certainly occurred. Departments have become much more entrepreneurial. And yet, I remain struck by lost opportunities. I, for one, don’t see sufficient incentives for faculty to pursue external funding to strengthen outreach in admissions or to enrich the curriculum or to offer summer programs for high school, undergraduate and graduate students from underrepresented groups.

12. Online and low-residency undergraduate degrees at flagships

I should have known better than to think that many selective institutions, including publics, would risk “diluting” or “diminishing” their brand by aggressively expanding access. But maybe, just maybe, these institutions will take alternative steps to increase enrollment. For example, flagship and land-grant universities might greatly expand off-campus learning opportunities, including study abroad, making it possible for these institutions to admit perhaps as many as 25 percent more students.

13. More certificates and badges

Alas, in most cases alternate credentials have not been viewed as a way to broaden undergraduates’ education or to build essential, career-aligned skills, but, rather, as a way for institutions to make a quick buck by partnering with the big tech companies or with various boot camps and skills academies.

14. Free and open textbooks

Pressure to adopt open educational resources is intense, and I am certainly not alone in only assigning readings that are available for free. The range of open textbooks, offered by providers like OpenStax, is extraordinary.

But let’s be honest and recognize that this shift has only marginally reduced the cost of a higher education while devastating the market for scholarly monographs. It has, almost certainly, contributed to a reduction in the amount of assigned reading. Worse yet, the pursuit of free textbooks has meant that the kinds of instructional materials that we really need—that are highly immersive and interactive and personalized and make extensive use of advanced simulations—aren’t produced because there is no way for writers or publishers to recoup the development and production costs.

15. Public-private partnerships

I originally wrote at a time when many ed-tech firms considered themselves disrupters, capable of upending and displacing insufficiently innovative incumbent institutions. In the years since, these firms have touted themselves as educational partners capable of providing a stack of services that existing institutions can’t. Among the services they provide are enrollment management, data analytics, technology platforms, online program management and even experiential learning opportunities.

Far too often, institutions, unable to build internal campus capacities, become heavily dependent on these partners, entering into contracts that are difficult to break, ceding control over institutional data and, to our horror, letting OPMs not only define standards for admission into online programs but design the programs themselves. In short, we’ve learned a great deal over the past decade about the downsides of public-private collaboration.

In the years since my Chronicle article appeared, higher education has undergone far-reaching transformations for good and ill. On the positive side of the ledger, access has increased and completion rates have risen. In addition, student bodies have grown increasingly diverse. But, more negatively, the ecosystem has become more stratified not only in terms of prestige or reputation, but in resources, facilities, the range of majors, student qualifications, the undergraduate experience, student support services and even the availability of financial aid.

In a recent Washington Post opinion piece, the conservative columnist George Will makes an argument that our colleges and universities ignore at their peril. The column questions a series of self-serving assumptions that higher education has propagated, but that increasingly draw a skeptical response:

  • That ever-higher college enrollments are necessary for a healthy economy.

Will notes that according to the Federal Reserve, 41 percent of college graduates hold jobs that do not require a college degree.

  • That a degree is necessary for a fulfilling life.

As he observes wryly, 62 percent of American adults do not hold degrees, and many are quite contented.

  • That undergraduate degrees have a high return on investment.

Here, he cites recent reports that 40 percent of college graduates earn no more than the average high school graduate a decade after leaving school.

  • That master’s degrees, often financed by heavy student borrowing, are a sound financial investment.

Many of these programs, Will argues, are motivated not by a demonstrated return on investment but, rather, by greedy institutions eager to siphon off “the ocean of cash available through subsidized student loans.”

Before you dismiss these assertions out of hand, do remember this: those who ignore widely held opinions are like those policy makers and military officers who ignore intelligence assessments. They set themselves up for a fall.

Steven Mintz is professor of history at the University of Texas at Austin.
