Follow-up stories are not the news media’s forte. Headlines blare, innovations are hyped, then silence ensues.

Think of the classic mysteries that remain unsolved. Inquiring minds want to know whatever happened to Judge Crater or Jimmy Hoffa, or to the artworks taken in the Isabella Stewart Gardner Museum heist.

Then there are somewhat more recent stories that cry out for follow-up. Were the paintings stolen from the Kunsthal museum in Rotterdam in 2012, including major works by Picasso, Matisse and Monet, burned? Why hasn’t the CIA yet released all its records about the assassination of President John F. Kennedy? Why did the incidence of reported peanut allergies begin to surge in the United States after 1997?

The pundit Ross Douthat mentioned a few fresher stories that beg for elaboration:

  • Who blew up the Nord Stream pipelines?
  • What secrets did Jeffrey Epstein’s death conceal?
  • What exactly happened between Brett Kavanaugh and Christine Blasey Ford?

A failure to follow up is not a victimless crime. It leaves a lot of misimpressions in its wake. For example, many on the right and the left believe mistakenly that the wars on poverty and cancer failed. But as the economist Noah Smith has argued persuasively, the evidence shows that they’re dead wrong. When one takes into account the impact of Medicare, Medicaid and food stamps or looks at sharply falling rates of lung cancer, colorectal cancer, pancreatic cancer, breast cancer, prostate cancer and liver cancer, it’s clear that genuine progress was made.

Follow-up stories, in short, can sate readers’ curiosity, provide balance and tie up loose ends. Follow-up studies are especially important in policy matters. Such studies can:

  • Validate or invalidate initial claims.
  • Monitor a particular intervention’s impact.
  • Contribute to a more comprehensive understanding of an issue.
  • Help stakeholders decide whether a specific strategy is worth pursuing or needs to be adjusted.

A recent piece by Goldie Blumenstyk on the demise of a highly touted credentialing collaboration between colleges and employers illustrates the value of follow-up stories. The Digital Tech Credential program involving 19 colleges between Baltimore and Richmond held out the promise of better meeting pressing workforce needs in data analysis, visualization and cybersecurity. But despite a $5 million Digital Tech Credential Scholarship for women and underrepresented students of color, the initiative failed.

Understanding why this career-readiness initiative bit the dust is essential if future efforts to promote alternative industry-recognized credentials are to succeed. Was it because:

  • The curriculum wasn’t well aligned with job market needs?
  • Engagement between the institutions and industry was inadequate?
  • Campuses failed to make students aware of the program’s potential payoff?
  • The initiative’s outcomes weren’t well documented or substantiated?

The most important follow-up study that I have read recently is an evaluation of the impact of college career-preparation programs by the Harvard economist David J. Deming and his colleagues. Entitled Delivering on the Degree: The College to Jobs Playbook, this report finds that:

  • Colleges and universities rarely make career outcomes or economic mobility a top institutional priority, nor do they hold departments responsible for achieving this goal.
  • College-to-career interventions are often implemented inequitably, with students of color and those from low-income backgrounds underrepresented.
  • Poor coordination and a lack of accountability characterize partnerships between colleges and employers.
  • There is a dearth of research about which programs work and which do not.

The study looks at 13 specific career-readiness interventions that involve career exploration, skill building and job immersion. These include apprenticeships, internships, job shadowing, skills-specific boot camps, individualized career coaching and mentorship, guided pathways, experiential coursework, and industry-recognized credentials.

It’s noteworthy that most of the research and examples that the study reviews focus on two-year, not four-year, institutions.

Also striking is the fact that many of the initiatives offer minimal evidence of effectiveness in terms of job placement or future earnings. Thus, for example, career coaching and mentoring, job shadowing and last-mile boot camps have only a limited short- or long-term economic impact. Interestingly, such programs do contribute to other key student outcomes: academic persistence and performance, course completion and degree attainment, and to career decision-making skills, self-efficacy, career confidence, vocational identity and career satisfaction, especially among students of color.

The career-preparation programs that do have the greatest economic impact are apprenticeships, internships and co-ops.

What lessons should colleges and universities take away from the Deming study? Here are eight takeaways.

  1. Career guidance needs to become a higher campus priority. Rather than treating career advising as optional, relegated to a stand-alone office, campuses need to treat it as an essential service that is integrated across the undergraduate experience. Bunker Hill Community College, which integrates career counseling along with academic and financial advising, might serve as an example.
  2. Campuses need to do a much better job of exposing new students to job market trends and making them more aware of the kinds of careers that are available and the skills and coursework needed to enter into those careers. A promising model is the City University of New York Guttman College Ethnographies of Work courses, in which freshmen observe and analyze workplace culture, expectations and dynamics.
  3. Campuses also need to more firmly embed professional identity formation and career-specific and personal skills-building within the existing curriculum, including training in industry-aligned software and in project management.
  4. Departments need to create degree maps that lead to in-demand careers and better align course content and major requirements with career success.
  5. Faculty need to integrate more experiential, project-based, hands-on activities with real-world applicability into their courses.
  6. Campuses need to work with industry, businesses, government agencies and nonprofits to expand the number of structured, paid career-immersion experiences, including internships, microinternships and service-learning opportunities.
  7. Institutions need to ensure that women and students of color receive equal access to paid internships and other college-to-career programming.
  8. Administrators and accreditors need to hold departments and programs accountable for their students’ economic outcomes, including disparities across demographic lines.

I consider each of these recommendations eminently reasonable and well worth implementing.

I understand that many fellow academics look at existing trends in higher education and are deeply disheartened. I share their concerns about the post-pandemic surge in disengaged students, downsized humanities programs regarded as service departments, growing numbers of instructors working outside the tenure system and the growth of asynchronous online programs lacking regular, substantive interaction with a teacher-scholar and classmates.

In a gloss on Gayle Greene’s recently published book, Immeasurable Outcomes: Teaching Shakespeare in the Age of the Algorithm, Johann Neem writes, “The classroom is threatened by false understandings of what can and should be assessed, by online education and by the world’s distractions. It needs to be protected. It is, in Greene’s words, ‘a site of resistance against the dehumanization that’s hollowing out our lives.’”

I’d be the last to disagree. But I also worry that all too many of my students—who come from the top 6 percent of their high school graduating class—exit the university without the essential knowledge, the cultural, historical and technological competencies, and the life skills that I’d expect from a college degree holder.

That’s not a surprise: you don’t learn what you’re not taught.

In a recent opinion essay, Mark Garrett Cooper, a professor of film and media studies at the University of South Carolina at Columbia, pushes back against hyperbolic fears about “skillification” and “higher ed’s grim, soulless, ed-techified future.” As Cooper notes, “Anyone paying attention to the nonacademic job market will know that skills, rather than specific majors, are the predominant currency.” He adds that at the entry level, “Employers express relative indifference with respect to undergraduate major, but relative precision with respect to required skills.”

Rather than treating a heightened emphasis on skills and employment outcomes “as a sneak attack” on higher education’s “heart and soul,” Cooper concludes, campuses need to do a better job of defining and promoting the value of the education they offer.

I wholeheartedly agree. But that need not mean focusing intensively on skills deemed “transferable” or “job-ready” or “workforce aligned.” The knowledge and skills we need to impart are far broader, but too often neglected.

Think of the things that largely aren’t taught: not just personal finances or leadership skills or listening and negotiation skills or public speaking, but mind-set, self-assessment, self-awareness, stress and emotional management, relationship skills, self-advocacy, networking, bouncing back from failure, and, yes, career planning.

Let’s teach what students need to know.

Steven Mintz is professor of history at the University of Texas at Austin.
