Arguably, no concept in higher education policy today is more popular, or commands broader consensus, than institutional “skin in the game”: the idea that colleges need to be on some sort of financial hook when their students don’t succeed.
Students and families are spending near-record amounts on postsecondary training, yet students are dropping out and defaulting on loans at disturbingly high rates. Mix in high-profile collapses like Corinthian Colleges and near-daily stories of college graduates struggling to find employment, and we get policy makers coming to the disheartening conclusion that our higher education institutions are incapable of doing the very thing we expect of them -- creating capable graduates -- unless threatened with financial sanctions.
Yet is this really the case? Colleges spend a lot to recruit and retain students, and every student who leaves without completing represents lost time, money and effort, as well as more recruitment and retention dollars spent on a replacement. Students who don’t finish, or who complete but struggle to find employment, create nothing but negative reputational outcomes that institutions must invest both time and resources to counteract. Plus, when those same students leave with loan debt and struggle to repay it, the institution may yet again spend dollars and effort on default prevention services.
Put it all together and it’s pretty clear that when students fall off a successful education path, institutions pay a very real financial price. But this is exactly what having skin in the game entails. So why are we pushing for policy and regulation to accomplish what’s already taking place?
Making colleges pay a second time for poor outcomes doesn’t make much sense, although critics will say that market-driven financial penalties are obviously just not doing enough to change institutional behavior. To believe that, however, we have to believe institutions, as producers, actually prefer to see some of their education outputs fail.
That’s awfully strange. If institutions could control how much students learn, then why would they consciously choose to send unprepared graduates into the labor market where they struggle to find and keep employment? And if they could control who graduates and who doesn’t, what economic rationale do they have for producing a mix of graduates and dropouts? If they really had a choice, why would they ever produce anything other than graduates?
Colleges today face a continuous barrage of criticism about whether they provide value for money, and so we’re left to ask under what circumstances colleges that capably control learning, degree completion and postgraduate employment outcomes would actually opt to produce substandard products. Is there a business model that thrives on threats of greater regulatory scrutiny? Does a “student failure” model that brings additional enrollment management, default prevention and reputational costs make operational sense?
It’s pretty obvious that if institutions could control the types of outcomes that skin-in-the-game proposals want to see improved, they’d already be doing so. What college or university wouldn’t benefit from high graduation rates, stellar job placement statistics and graduates who earn enough money to comfortably pay off their student loans?
It’s also why the argument that the financial costs institutions already face just aren’t harsh enough doesn’t make much sense. It’s like suggesting that my dog doesn’t speak English because I’m just not spending enough time teaching him. The outcome and process we’re trying to link don’t fit the way we think they do.
What’s missing from the equation is the idea that academic success is a two-way street: students’ academic preparation, motivation and effort do as much to shape the outcomes we care about as the resources institutions provide them. In its absence, the obvious consequence of policies that hold colleges alone accountable for outcomes they only share control over is that colleges put their effort into the things they can control -- which, in this case, is picking the students they think are most likely to succeed.
All of this means that the losers from skin-in-the-game proposals end up being students who have less academic preparation and who come from underresourced school districts. We actually end up creating undermatching by putting greater pressure on colleges to pick “winners” and discouraging them from taking chances on individuals who may benefit the most from the type of education they offer.
It’s also likely to hurt institutions that have open admissions policies and that currently enroll larger percentages of minority and nontraditional students. Community colleges, with their limited state budgets and high transfer rates, would suffer most, but so would any college drawing large populations of students from disadvantaged communities. In the long run, those institutions could face unsustainable financial and reputational costs.
There’s certainly a place for risk sharing in higher education, which is why institutions currently pay the real financial costs I described earlier. But if what we care about is making institutions more responsive to students’ long-run needs and expectations, then the solution lies in policies and practices that make such goals their focus.
Income-share agreements (ISAs) -- whereby colleges finance their students’ education in return for a fractional share of those students’ future income -- are a good example. They create not only financial penalties but also financial rewards for institutions that help students achieve long-term, sustained success. Driving more institutional revenues through ISA-style agreements also discourages the kinds of deceptive marketing practices that policy makers believe institutions engage in, since colleges and universities would, over time, end up having to financially absorb the costs of misrepresenting their programs’ job placement prospects.
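To make that incentive structure concrete, here is a minimal sketch -- with purely illustrative share, term and income figures that are assumptions, not parameters of any actual program -- of how an ISA ties what a college ultimately collects to how its graduates fare:

```python
# Minimal ISA sketch with assumed, illustrative numbers: the college recovers
# the cost it financed only as a fixed share of the graduate's later income.

def isa_net_revenue(cost_financed, income_share, annual_incomes):
    """Net amount the college earns (or loses) over the ISA's payment term."""
    collected = sum(income_share * income for income in annual_incomes)
    return collected - cost_financed  # positive = reward, negative = penalty

# A graduate with steady, well-paid work rewards the institution...
print(isa_net_revenue(30_000, 0.05, [60_000] * 12))   # 6000.0
# ...while one who struggles in the labor market leaves the college absorbing a loss.
print(isa_net_revenue(30_000, 0.05, [25_000] * 12))   # -15000.0
```

Under those assumed numbers, the same design that penalizes poor placement outcomes also pays the college more when its graduates thrive -- exactly the alignment between institutional revenue and student success described above.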
The fact is that it’s easy to think that simply imposing penalties on bad actors will fix the problem, yet the logic has to be there to justify the approach. The basis on which risk sharing proposals today are being crafted doesn’t meet the standards of sound policy. We owe it to both colleges and students to craft policies that work toward, not against, the system’s overall objectives.
Carlo Salerno is a Washington, D.C.-based education economist and private consultant.
We need only two things to convince our communities, public officials, local employers and parents of students and prospective students about the value of a degree in the humanities: stories and data.
In the humanities, we have always used stories well. We can assemble lots of anecdotes about our graduates and how, now that they’re gainfully employed, they use what they learned in our classes. Anecdotes are clearly not enough, however. We’re definitely not winning the public relations contest about what aspects of public higher education are worth investing in. So how can we supplement our good stories with good data, while keeping the discussion firmly rooted in the humanities?
In an effort to share strategies and to get better at making the case for the value of humanities education, a group of about 40 humanities faculty members and administrators, local employers, and public humanities representatives in southern New England got together recently. We talked about what student success in the humanities looks like, how we could measure what it gives students and how we would know when we’ve helped students to achieve it.
The question of student success is on everyone’s radar these days, and the discussion usually refers to retention and graduation rates. Our discussion in New England pointed a different way, however. We wanted to bring employers into the conversation to help them to understand what our students are learning and to help us to learn what they value in new employees. That is especially important for those of us who take issues of racial and economic diversity seriously. As Karen Cardozo, assistant professor of interdisciplinary studies at Massachusetts College of Liberal Arts, pointed out at the meeting, if we can show that humanities degrees have value in the workplace, we can assure working-class students, first-generation students and students of color that following a passion for history, philosophy, literature or music can lead to a good job, too.
Here’s how our meeting went:
First, we assembled by tables, trying to make sure an employer and a public humanities representative were at each table. (Public humanities representatives include those who work at museums, state National Endowment for the Humanities affiliates, cultural councils and the like.) Employers from publishing, local government and local small businesses also participated. (We hope to involve some larger employers next time we meet.) We also mixed in representatives of two- and four-year colleges, as well as public and private institutions.
Each table considered one question at a time, and we then discussed our answers in the group as a whole. Here are the questions:
What can a humanities graduate do?
What (else) should a humanities graduate be able to do?
How can we make sure students graduate with this knowledge or these skills?
How can we measure or assess whether they can do what we say they can do?
It was great to have employers at each table, and we moved them around between groups for each question so the tables could get different perspectives. Some of the employers were already savvy about what a humanities education delivers; others weren’t sure what exactly constitutes the humanities.
Together, we compiled a list of the skills that we think graduates have cultivated in their humanities education:
Writing skills, with style
Cultural competencies, intercultural sensitivity and an understanding of cultural and historical context, including on global topics
As part of our list, we also agreed that graduates should have the ability to:
Construct complex arguments
Provide attention to detail and nuance (close reading)
Ask the big questions about meaning, purpose, the human condition
Communicate in more than one language
Understand differences in genre (mode of communication)
Identify audiences and communicate appropriately to each
Be comfortable dealing with gray areas
Think abstractly beyond an immediate case
Appreciate differences and conflicting perspectives
Identify problems as well as solve them
Read between the lines
Receive and respond to feedback
Then we asked what we think our graduates should be able to do but perhaps can’t -- or not as a result of anything we’ve taught them, anyway. The employers were especially valuable here, highlighting the ability to:
Use new media, technologies and social media
Work with the aesthetics of communication, such as design
Perform a visual presentation and analysis
Identify, translate and apply skills from course work
Perform data analysis and quantitative research
Be comfortable with numbers
Work well in groups, as leader and as collaborator
Identify processes and structures
Write and speak from a variety of rhetorical positions or voices
Support an argument
Identify an audience, research it and know how to address it
Know how to locate one’s own values in relation to a task one has been asked to perform
They also mentioned a need for better technological, project-management and conversational and interview skills.
We also discussed creating tables that would link the knowledge, skills and aptitudes of the first two questions to the kinds of work students might do after graduation, task by task. We’ve assigned that work to the participating employers.
To make sure that our students can graduate with the knowledge and skills we want to see, we know we would have to make some changes to the way our degrees are structured. Some of the changes we talked about at the meeting were:
Providing more faculty development to help professors be more explicit and intentional in language about the skills being taught
Creating a one-credit course on the relation of humanities to work and the professions
Using required courses (general education) and events (orientation) to introduce the need to connect courses and skills
Being intentional about double majoring, adding minors that enable students to pair professional training with humanities
Using successful alumni in programming
Integrating student employment with academics, through course work or portfolio reflection
Infusing reflective writing into courses
Encouraging community engagement with the curriculum
Providing avenues for student creativity to demonstrate higher-order skills
Taking on the idea of maker space -- what are the humanities making?
Giving students self-assessment skills
Developing portfolios that include both work and reflection linking course work to other kinds of engagement, such as employment and student activities
Structuring work-shadowing opportunities
Creating local employer/faculty advisory groups to determine workforce needs and establish a common language
Building reflection, work, community engagement and shadowing into the credit structure
Capitalizing, in four-year colleges and universities, on work already being done at two-year institutions
The final task at our meeting was to come up with ways to measure whether we are doing what we say we are doing -- both now and after we pursue the changes we want to make. We developed the following list:
Alumni surveys, to determine short- and long-term impact of humanities education
Student surveys, at entry and exit, about how their ways of thinking have changed
Internship supervisor surveys
Determining whether local employers hire our graduates, why or why not, and whether those graduates have the needed knowledge and skills
Using capstone courses to assess ways students have been asked to combine humanities and work
Gathering information that can contribute to big data. Who else is collecting what we seek, and how can we combine their data with ours?
That was the most difficult assignment, and it’s the shortest list that our group developed. That, of course, was not surprising. Assessment has always been challenging, as any regional accreditation team can tell you.
But we had started the afternoon asserting that we want the general public to support humanities education and to understand the value of what we do, and so we knew we had to find good ways to collect evidence. That’ll be a topic at our next meeting.
We agreed that the next step, when we reconvene in May, will be for all of us to have made some progress on our own campuses toward both adding education in the new skills we think humanities students need and finding ways of measuring our success.
If you’re working on humanities student success initiatives, what tactics are you trying? With whom are you working? Are you getting any traction in your institution or region?
Making the case for the humanities can start on the campus, but it ultimately has to convince funders, parents and employers, too. We’re hoping to make southern New England the first Humanities Success Zone in the country -- where an employer with some job openings asks, “What kind of person would add some real value to our company beyond the specific skills we need for this job?” We want the answer that springs to mind to be: a humanities graduate.
Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts. On Twitter, she is @PaulaKrebs.
It has long been a truism in American higher education that the junior and senior years sit at the top of the curricular pecking order. That is when students take their majors and, frankly, that is where most of our senior faculty really prefer to teach.
The first year, on the other hand, is seen by many of us as less important. And because of this, guess who is often assigned general education and introductory courses? Adjuncts, graduate assistants and our most junior faculty.
It’s almost as though the introductory and general education courses that define the first two years of college are something to get through as quickly as possible so that students can get to the good stuff in their third and fourth years -- that is, upper-level courses and the major.
But this view is out of sync with what many prospective college students and their parents are thinking. For a book I recently wrote about the transition from high school to college, I interviewed high school seniors and their parents, and virtually all of them hoped that the first year of college would be a major step up from high school. They are often disappointed.
At many colleges and universities, first-year students take large introductory courses in classes of 100 or more. Teaching usually consists of an instructor lecturing at the front of the classroom while students dutifully take notes to be regurgitated later on a quiz. There is very little class participation involving discussion and debate. Writing anything over a few pages is unusual.
Arizona State University has gone even further. It offers a Global Freshman Academy that allows first-year students to take their courses through MOOCs (massive open online courses). Students won’t even have to leave the comfort of home to complete their first year! The first year is seen as a means to an end, with the end being upper-level courses and the major.
But I would argue that the first year of college is far more important than this -- perhaps, in some ways, just as important as the final years of college.
Why do I believe this?
The first year is when college students -- especially those who go on to vocational majors like engineering or nursing -- get a sound, cross-disciplinary grounding in the liberal arts and sciences. The liberal arts are where they learn how to think critically and how to communicate effectively, skills that are crucial for a generation that will have many different careers in their lifetimes.
The transition from the first year to the sophomore year is when attrition is at its highest. When I was a college president, 20 percent of first-year students at my institution didn’t return for their sophomore year. Some transferred, but many dropped out of college altogether. Why does this happen? In far too many exit interviews I have seen, dropouts say that they found their first-year classes meaningless.
I will never forget the admissions tour I took at a well-known university with my youngest daughter. We were in the university’s amazing library, and the tour guide, a sophomore, was bragging about the fact that most of his teachers were graduate assistants. “They’re really cool,” he said, “and understand our generation,” whereupon a mother standing next to me uttered sotto voce (but loud enough for everyone to hear), “Why am I paying a small fortune to have my child taught by someone who is only a couple years older than she is?”
That parent was articulating what many parents I interviewed for my book were saying: for $50,000 or more per year, the expectation is that their children will be taught by experienced faculty with the requisite credentials, not by part-time employees or graduate students.
Of course, many of the instructors assigned to introductory or general education courses, including adjuncts and graduate students, are quite capable teachers. But I believe that first-year students could really benefit from also being taught by senior faculty members who excel in the classroom. In many ways -- and I know this is heretical -- assistant professors who have just completed their Ph.D. dissertations are probably the most capable of teaching courses in the major, which require up-to-date knowledge of their discipline. Senior faculty members, on the other hand, who through wisdom and experience have a wider view of the world, are, in my opinion, the most qualified to teach general education courses designed to give first-year students a broader perspective on human knowledge and, in the process, excite them about what will come later.
Increasingly, colleges are coming to see the crucial importance of the first year. At one college I feature in my book, the freshman writing seminar is largely taught by the college’s most distinguished and experienced senior faculty, who are handpicked because they are also master teachers. First-year advising is also being given a new emphasis. At far too many colleges, advising is relegated to new faculty who have limited knowledge of the curriculum or to adjuncts who have equally limited office hours. But many colleges, realizing that solid advising reduces attrition, are assigning experienced faculty who are skilled at advising or professional advisers to first-year students.
For these colleges and universities, the first year has been given a new priority.
I’d like to end by saying that there is money to be raised by rethinking the first year, which should make presidents who are reading this article happy. I believe that philanthropic individuals and foundations, concerned about the cost of higher education and the human waste when students prematurely drop out and don’t graduate, will respond enthusiastically to programs that support first-year students and keep them in college. I’m talking about:
Innovative first-year general education programs that challenge and excite first-year students through active learning (including discussion, debate and writing) so that they don’t want to leave college.
Endowed writing centers and other support systems that can save kids who come to college with academic deficiencies.
Endowed first-year opportunity programs that keep underserved and first-generation students in college.
Attrition is enormously expensive. A college of 2,000 students like my own that loses 20 percent of the first-year class potentially forgoes $5 million or more in tuition, room and board, which for many colleges is more than the development office raises each year in the annual fund.
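As a rough sketch of that arithmetic -- the cohort size and per-student charges below are illustrative assumptions, not figures from any particular institution -- the loss adds up quickly:

```python
# Back-of-the-envelope attrition cost, using assumed figures: a college of
# about 2,000 students enrolling roughly 500 new students a year and charging
# in the neighborhood of $50,000 for tuition, room and board.

first_year_class = 500      # assumed entering cohort at a 2,000-student college
attrition_rate = 0.20       # 20 percent don't return for sophomore year
annual_charges = 50_000     # assumed tuition, room and board per student

students_lost = first_year_class * attrition_rate       # 100 students
forgone_each_year = students_lost * annual_charges      # $5,000,000
print(f"Tuition, room and board forgone each year: ${forgone_each_year:,.0f}")
```

Under these assumptions, a single cohort’s attrition costs about $5 million in the following year alone, before counting the additional years those students would otherwise have remained enrolled.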
In summary, by putting more energy and resources into the first year, I believe we can keep more of our students in college and thereby cut down on the enormous human waste when otherwise good students prematurely leave college with outsize debts they can’t pay back because they are unemployable. At the same time, we improve our bottom line by not losing so much in tuition dollars. Most important, we graduate students for whom education from the very beginning is a pleasure, not a hardship to be endured.
Roger Martin is president emeritus and professor of history at Randolph-Macon College. He is the author of Off to College: A Guide for Parents. This essay is based on a presentation at the Council of Independent Colleges’ Institute for Chief Academic and Chief Advancement Officers.