New and improved data show that America is doing better on college completion than previously known, according to a major report from the National Student Clearinghouse Research Center. But plenty of work remains on degree production, and it’s unclear how the more reliable numbers will affect the “completion agenda” and a related push for more accountability in higher education.
The statistics released today shift some of the focus from colleges to students when it comes to tracking graduation rates. That’s a welcome change, experts said, because students move around so much these days. About 22 percent earn a degree from a different college than the one where they first enrolled, according to the report, and one-quarter of transfer students cross state lines.
The six-year completion rate for first-time students nationwide is 54 percent, according to the report, with 12 percent of that total being students who transferred. And one particularly positive finding from the research is that three-quarters of full-time students complete college within six years.
Both rates are higher than other published data on college completion. The numbers are based on “near-census national coverage,” according to the center, which tracks 94 percent of students at all types of institutions, except those that do not participate in federal aid programs.
Higher education leaders often complain about incomplete federal data, which are often used by policy makers to beat colleges up on their graduation rates and other performance indicators. In contrast to the newly released data, the federal government’s tabulations of degree production and graduation rates generally do not capture transfer or other student “swirl” factors. The Integrated Postsecondary Education Data System (IPEDS) also misses half of all students, because its institution-level data are based only on first-time, full-time students.
Community colleges in particular come out looking better in the new report.
“The clearinghouse report both provides a significantly more accurate and much more positive picture of community college completion than the graduation rates from the U.S. Department of Education,” said David Baime, senior vice president of government relations and research for the American Association of Community Colleges.
Some of the numbers commonly used to exhort America to produce more college degrees and certificates are also controversial.
Take the U.S.’s 15th-place ranking by the Organization for Economic Co-operation and Development (OECD) on degree production among developed countries. Clifford Adelman has attempted to dismantle that ranking with his research. And the former federal data guru, who is a senior associate at the Institute for Higher Education Policy, uses indelicate language that roughly translates to “meaningless” when describing the OECD data and a few other figures often used by the completion agenda’s advocates. (The OECD ranking looks at the national proportion of degree holders, which differs from the center’s analysis of completion rates.)
Adelman praised the new report, with a few quibbles, like the center’s decision to use 24 as the dividing age between traditional and adult students. (He would use a younger age.) Adelman called the data a big step up from IPEDS and said it should lead to a “more sophisticated discussion” about completion rates.
Stan Jones, president of Complete College America, a prominent completion-oriented group, agreed, saying the data were more fully formed and complete than other measures. “We give them straight As for accuracy,” he said.
But Jones and several other observers said the clearinghouse is doing a job that the federal government should already be doing. “This is not data that’s unreasonable to ask institutions to report,” he said.
Understanding the System
The clearinghouse is a nonprofit group that conducts verification and research services for 3,300 member colleges, which voluntarily hand over enrollment and degree information. Its research center began following almost 1.9 million students at those institutions in 2006. The National Center for Higher Education Management Systems did a review of the data and how they were collected, giving the center a thumbs-up for its methodology.
The group finished the six-year study earlier this year. Since then it has released a drumbeat of reports on the data. This one, however, is the big kahuna.
In addition to giving broad national completion rates, the data are broken down into many different categories and subcategories. Student performance is tracked by age, institution type and whether students enrolled part-time, full-time or a combination of the two. It even looks at whether students who transfer from community colleges to four-year institutions earned associate degrees before transferring.
“You start to get a sense of what’s really going on” by looking at the data, said Dewayne Matthews, vice president for policy and strategy for the Lumina Foundation.
For example, roughly 49 percent of all students at four-year public institutions graduated from the same institution where they started. But another 9 percent completed at a different four-year institution, and 3 percent completed at a community college, bringing the overall rate to about 60 percent. And 16 percent were still enrolled in college after six years.
At community colleges, 24 percent completed at their “starting institution” within six years. That number rose to 36 percent with completers elsewhere, with a whopping 20 percent of students still enrolled. The figures at four-year for-profits were 38 percent at the starting institution, with 43 percent overall completers and 14 percent still enrolled.
Representatives from the clearinghouse and center will discuss the report’s findings on Capitol Hill today, at an event that is sure to pique the interest of the completion agenda’s supporters, as well as its critics.
For its part, the center said it hopes the new data will help close the “information gap” and contribute to the national discussion around completion, as well as related policy decisions.
The college completion agenda and its “ambitious timeline,” the report said, “can succeed only with comprehensive and timely measures of student outcomes to inform all stakeholders about the progress made and to identify areas for further improvement.”
Several states have begun collecting similar data on their own, sometimes attached to the earnings of college graduates and students who failed to earn credentials. Florida, Virginia and Texas are among those leading the pack.
The new data from the center may factor into ongoing discussions about how the federal government should track completion, including the work of the federal Committee on Measures of Student Success. The approach also shares similarities with the scrapped proposal for a federal system of tracking students, dubbed the “unit record” system.
The clearinghouse had not made its data publicly available until recently, which had drawn some criticism. But the group drew wide praise for contributing what many see as a valuable resource for understanding how the nation’s higher education system works. State-level reports are also in the works, according to the center.
In addition to praising the data, several leaders of the completion push said the findings are far from being all good news, particularly for lower-income students and those from racial and ethnic minority groups.
“We’ve got lots of challenges here,” said Matthews.