"Student success" programs of various types -- learning communities, first-year experience programs, and the like -- have proliferated on college campuses, driven by the reality that it's easier to keep current students than recruit new ones. The programs are popular, but as is true of just about all campus efforts these days, they are open to scrutiny about their effectiveness -- and their cost effectiveness.
Given that climate, many student affairs officials would probably be wary of looking too closely at what their programs cost and whether they provide a meaningful return on that "investment," for fear that the data, if they don't look good, might be used against them in the fight for resources. But putting those trepidations aside, 13 colleges -- as part of a project sponsored by Jobs for the Future and the Delta Project on Postsecondary Education Costs, Productivity and Accountability -- agreed to examine both the full costs of first-year retention efforts focused on first-generation and low-income students, and the extent to which their success in keeping students enrolled produces enough revenue to help the programs pay for themselves.
A report on the Investing in Student Success effort, published this week by the two organizations, suggested that a majority of the programs had produced gains in retention that went a long way toward offsetting their costs. Most of the others could not complete the analysis, using a "cost return calculator" that includes a wide range of data, because they didn't have all the necessary cost and retention statistics (for students in the programs and for a comparison group).
But by looking at their first-year programs through the prisms of cost and return on investment, all 13 of the participating colleges experienced what the sponsors of the initiative call a "change in conversation" about those programs -- with some saying that they planned to apply similar scrutiny to other academic programs on their campuses. Officials at several of the campuses said they believed it was important to take financial considerations into account in assessing these programs, as long as they didn't overwhelm other, more qualitative factors, such as the programs' quality and the beneficial impact on students.
"It really meant that there was a new way of thinking about these programs," said Jennifer Poulos, senior program manager at Jobs for the Future.
What that meant differed somewhat depending on the campus. At Valencia Community College, in Orlando, campus officials had been trying to decide whether to expand a mandatory student life skills course for academically at-risk students that they had developed as part of the Lumina Foundation for Education's Achieving the Dream initiative. Valencia required the course for all students who tested into developmental classes in reading, writing and mathematics, and the skills class was deemed successful in helping students advance to credit-bearing work.
As Valencia considered requiring the course for the larger pool of students who tested into remedial courses in two of the three areas, some campus officials were wary because "there was a general sense that expanding the program would be prohibitively costly to the school," said Kurt Ewen, assistant vice president for academic learning support at Valencia. This was especially concerning, he said, at a time when the economy had turned down and Valencia's resources were stretched increasingly thin.
Through its grant from the Jobs for the Future/Delta program, which was sponsored by Lumina and the Walmart Foundation, Valencia was able to get a much clearer picture of the program's costs and returns, and "putting it on the table dispelled a lot of the concerns," Ewen said. Valencia officials have opted not to expand the study skills course to the larger pool of students at this time for other reasons, he said, but the cost-return review was deemed so useful that the college has decided to apply the Delta project's template within the college-wide process for reviewing other types of programs.
"We discovered that a conversation about the real costs for programs ought to be part of our ongoing conversation," Ewen said. "For a discussion about the cost/benefit of student success oriented programs to have meaning, we need to have comparisons to other types of programs."
Appalachian State University accepted Jobs for the Future's invitation to participate in the pilot project, but Cama Duke, director of learning skills services at the North Carolina institution, admits to having had "some sweaty palms" about how its student support services program for first-generation and low-income students would fare.
Duke was confident in the value of the federally and institutionally funded program, which provides orientation-to-graduation support to about 200 such students a year, but at a time when campus administrators are increasingly looking for ways to assess programs other than purely by whether they produce tuition revenue, she was nervous that the data might end up making the program look inefficient. "You don't want this to be a reason for people to say, 'Let's not fund it,' " she said.
When Appalachian State ran the numbers, "we looked good, and showed a return on investment," Duke said. "But what was best about it was that when I sat down with the vice provost and the CFO and presented the data, they made clear that they were committed to it, but we had a really good discussion about 'is this the best way to do it?'
"It's good to know how much you're spending, and you want to be realistic," she said. "You want to do both what's right and what can be sustained."
George D. Kuh, who as Chancellor’s Professor and director of Indiana University's Center for Postsecondary Research has written widely about the use and value of many of the student persistence efforts examined in the Investing in Student Success effort, said he agreed with the report's finding that for all the praise that student and academic affairs administrators heap on learning communities and first-year experience programs, "there is almost no mention about what these programs cost to operate."
He noted that the pilot project, like most inquiries that depend on existing higher education information, is constrained by the limitations of available data, such as the budget information in the federal Integrated Postsecondary Education Data System, on which the Delta/Jobs for the Future project leans. And Kuh said that this project, like other efforts at bolstering higher education accountability, has the potential to be used for disruptive as well as beneficial purposes.
"If the idea in using data like this is to that we're out to cut programs [just because they don't have a large return on investment], then it's not a good thing," he said. "But if we're doing it to try to get as big a bang of the buck as possible, then it's a very helpful way of looking at things."
"What seems to have happened on a lot of these campuses is that has changed the nature of the conversation, and that, in and of itself, is a great side effect. If you don't know what the investment you're returning is, that makes us look really silly."
Many participants in the project said they recognized that they were equipped to make much better decisions by injecting costs, and cost/benefit considerations, into the equation. But they also said, to a person, that they were much more interested in "data-informed" decision making than data-driven decisions, as Duke of Appalachian State put it.
"The kind of cost analysis that this project allowed us to begin is an essential component of the conversation we need to have about programs on our campuses," said Valencia's Ewen, who was also on an advisory committee that has helped Delta and Jobs for the Future on the project. "But it can't be the deciding voice in the conversation. Alongside the hard quantitative numbers, we have to consider more qualitative pieces like the mission-specific needs of the institution in question, and the human impact on students.
"This is a helpful piece of the conversation that we need to continue to refine."