Assessment

Let's use available income data to judge value of college degrees (essay)

From health care to major league baseball, entire industries are being shaped by the evolving use of data to drive results. One sector that remains largely untouched by the effective use of data is higher education. Fortunately, a recent regulation from the Department of Education offers a potential new tool that could begin driving critical income data into conversations about higher education programs and policies.

Last year, the Department of Education put forward a regulation called gainful employment, designed to crack down on bad actors in investor-funded higher education (sometimes called for-profit higher education). It set standards for student loan repayment and debt-to-income ratios that institutions must meet in order for their students to remain eligible for federal funds.

To implement the debt-to-income metric, the Obama administration created a system by which schools submitted Social Security data for a cohort of graduates from specific programs. As long as the program had more than 30 graduates, the Department of Education could then work with the Social Security Administration to produce an aggregate income figure for the cohort. Department officials used this to determine a program-level debt-to-income metric against which institutions would be assessed. This summer, the income data was released publicly along with the rest of the gainful employment metrics.

Unfortunately, the future of the gainful employment regulation is unclear. A federal court judge has effectively invalidated it. We at Capella University welcome being held accountable for whether our graduates can use their degrees to earn a living and pay back their loans. While we think that standard should be applied to all of higher education, we also believe there is an opportunity for department officials to take the lemons of the federal court’s ruling and make lemonade.

They have already created a system by which any institution can submit a program-level cohort of graduates (as long as it has a minimum number of graduates in order to ensure privacy) and receive aggregate income data. Rather than letting this system sit on the shelf and gather dust while the gainful employment regulations wind their way through the courts, they should put it to good use. The Department of Education could open this system up and make it available to any institution that wants to receive hard income data on their graduates.

I’m not proposing a new regulation or a requirement that institutions use this system. It could be completely voluntary. Ultimately, it is hard to believe that any institution, whether for-profit or traditional, would seek to ignore this important data if it were available to them. Just as importantly, it is hard to believe that students wouldn’t expect an institution to provide this information if they knew it was available. 

Historically, the only tool an institution has had to understand the earnings of its graduates is the self-reported alumni survey. While we at Capella did the best we could with surveys, they are at best educated guesswork. Now, thanks to gainful employment, any potential student who wants to get an M.B.A. in finance from Capella can know exactly what graduates of that program earned on average in the 2010 tax year: in this case, $95,459. Prospective students can also compare this and other programs, whose graduates may earn very different incomes, against those at competing institutions.

For those programs where graduates are earning strong incomes, the data can validate the value of the program and drive important conversations about best practices and employer alignment. For those programs whose graduates are not receiving the kinds of incomes expected, it can drive the right conversations about what needs to be done to increase the economic value of a degree. Perhaps most importantly, hard data about graduate incomes can lead to productive public policy conversations about student debt and student financing across all higher education.

That said, the value of higher education is not measured only by the economic return it provides. For example, some career paths that are critical to our society do not necessarily lead to high-paying jobs. All of higher education needs to come up with better ways to measure a wide spectrum of outcomes, but just because we don’t yet have all those measurements doesn’t mean we shouldn’t seize a good opportunity to use at least one important data point. The Department of Education has created a potentially powerful tool to increase the amount of data around a degree’s return on investment. The department should put this tool to work for institutions and students so that everyone can drive toward informed decisions and improved outcomes.

It should become standard practice for incoming college students or adults looking to further their education to have an answer to this simple question: What do graduates from this program earn annually? We welcome that conversation.

Scott Kinney is president of Capella University.

6th Annual Sloan-C/MERLOT Emerging Technologies for Online Learning International Symposium

Date: 
Tue, 04/09/2013 to Thu, 04/11/2013

Location

Planet Hollywood Resort, 3667 Las Vegas Boulevard South
Las Vegas, Nevada 89109
United States

Institute for the Development of Excellence in Assessment Leadership

Date: 
Tue, 08/06/2013 to Fri, 08/09/2013

Location

Baltimore, Maryland
United States

Study tracks European tracking of university students

Universities and governments on the continent exhibit many of the same data limitations as U.S. colleges in gauging student outcomes, study shows.

International educators debate mass vs. elite higher education

International educators at a meeting of the Organization for Economic Cooperation and Development put a new twist on an old debate, prodded by New York University's provocative president.

Pulse podcast examines Blackboard Analytics for Learn

This month's edition of The Pulse podcast features a conversation with Mark Max, vice president of Blackboard Analytics for Learn.

Data show key role for community colleges in 4-year degree production

Study shows that 45 percent of bachelor's degree recipients studied at two-year institutions first -- as many as three-quarters in some states.

Why assessment isn't about certainty (essay)

When I first floated the idea of writing a weekly column from my perch as director of institutional research and assessment at my college, everyone in the dean’s office seemed to be on board.  But when I proposed calling it “Delicious Ambiguity,” I got more than a few funny looks.  
 
Although these looks could have been a mere byproduct of the low-grade bewilderment that I normally inspire, let’s just say for the sake of argument that they were largely triggered by the apparent paradox of a column written by the measurement guy that seems to advocate winging it. But strange as it may seem, I think the phrase “Delicious Ambiguity” embodies the real purpose of Institutional Research and Assessment. Let me explain why.
 
This particular phrase is part of a longer quote from Gilda Radner – a brilliant improvisational comedian and one of the early stars of “Saturday Night Live.” The line goes like this:
 
“Life is about not knowing, having to change, taking the moment and making the best of it, without knowing what’s going to happen next. Delicious Ambiguity.”
 
For those of you who chose a career in academia specifically to reduce ambiguity – to use scholarly research methods to discover truths and uncover new knowledge – this statement probably inspires a measure of discomfort. And there is a part of me that admittedly finds some solace in the task of isolating statistically significant “truths.” I suppose I could have decided to name my column “Bland Certainty,” but – in addition to single-handedly squelching reader interest – such a title would suggest that my only role is to provide final answers – nuggets of fact that function like the period at the end of a sentence.
 
Radner’s view of life is even more intriguing because she wrote this sentence as her body succumbed to cancer. For me, her words exemplify intentional – if not stubborn – optimism in the face of darkly discouraging odds. Over the last several years, I have seen this trait repeatedly demonstrated by many of the faculty and staff members I know, as they have committed themselves to helping a particular student even when that student seems entirely uninterested in learning.
 
Some have asserted that a college education is a black box; some good can happen, some good does happen – we just don’t know how it happens. On the contrary, we actually know a lot about how student learning and development happens – it’s just that student learning doesn’t work like an assembly line.  
 
Instead, student learning is like a budding organism that depends on the conduciveness of its environment, a condition that emerges through the interaction between the learner and the learning context. And because both of these factors perpetually influence each other, we are most successful in our work to the degree that we know which educational ingredients to introduce, how to introduce them, and when to stir them into the mix. The exact sequence of the student learning process is, by its very nature, ambiguous because it is unique to each individual learner.
 
In my mind, the act of educating is deeply satisfying precisely because of its unpredictability.  Knowing that we can make a profound difference in a young person’s life – a difference that will ripple forward and touch the lives of many more long after a student graduates – has driven many of us to extraordinary effort and sacrifice even as the ultimate outcome remains admittedly unknown.  What’s more, we look forward to that moment when our perseverance suddenly sparks a flicker of unexpected light that we know increases the likelihood – no matter how small – that this person will blossom into the lifelong student we believe they can be.
 
The purpose of collecting educational data should be to propel us – the teacher and the student – through this unpredictability, to help us navigate the uncertainty that comes with a process that is so utterly dependent upon the perpetually reconstituted synergy between teacher and student. The primary role of institutional research and assessment is to help us figure out the very best ways to cultivate – and in just the right ways – manipulate this process.  
 
The evidence of our success isn’t a result at the end of this process. The evidence of our success is the process. And if, pooling our collective expertise, we focus on cultivating the quality, depth, and inclusiveness of that process, it isn’t outlandish at all to believe that our efforts can put our students on a path that someday just might change the world.
 
To me, this is delicious ambiguity.

Mark Salisbury is director of institutional research and assessment at Augustana College, in Illinois. This essay is adapted from the first post on his new blog.

Program Assessment Workshop

Date: 
Sat, 10/27/2012

Location

Embassy Suites Hotel San Diego Bay - Downtown, 601 Pacific Highway
San Diego, California 92101-5914
United States

Program Assessment Workshop

Date: 
Sat, 09/22/2012

Location

Tod Wehr Conference Center, Milwaukee School of Engineering (MSOE), 1025 North Broadway
Milwaukee, Wisconsin 53202
United States
