Calculating the Cost of Dropouts

Report aims to put a price tag on students who don't complete the first year -- using a method others denounce.
October 11, 2010

Report after report of late has drawn attention to the extent to which many American colleges are failing to graduate their students, at a time when the Obama administration and leading foundations are trying to ramp up the number of Americans earning a postsecondary credential. In a new report, one well-known analyst seeks to shift the focus to how many students are falling off track in the first year -- but his approach is drawing a sharp rebuke from a former colleague.

In the report, published today by the American Institutes for Research, Mark D. Schneider, a vice president there and former commissioner of the Education Department's National Center for Education Statistics, cites data suggesting that 30 percent of first-year students at four-year colleges do not return to their original institution for a second year, and that states and the federal government provided more than $9 billion in aid to those students and their institutions.

The critique of the report -- from another leading researcher with a background at the Education Department -- centers on its focus on students who return to their original institution, thus ignoring the many transfer students on whom that first year is not "wasted," as the report suggests.

In addition to those national totals, the report, "Finishing the First Lap: The Cost of First-Year Student Attrition in America’s Four-Year Colleges and Universities," provides state-by-state figures designed to show legislators and taxpayers how much public institutions in their states are spending on students who are in college for only a year. The numbers run from nearly half a billion dollars in states like California, New York and Texas to just a few million in small states such as Delaware and Rhode Island.

"We believe that these numbers should alert taxpayers and their representatives to the high costs a state incurs when, as is unfortunately the case, large numbers of students fail to return to the college or university for a second year," Schneider writes in the report.

And the point of the report over all, he argues, is that "[t]he nation will have a difficult time reaching the administration’s policy goals unless we find ways to increase the number of students who return to complete their college degrees. In the meantime, we continue to spend far too much money on students who don’t even finish the first lap, let alone fail to cross the finish line."

Clifford Adelman, a research associate at the Institute for Higher Education Policy and a former colleague of Schneider's at the U.S. Education Department, says the issue Schneider is raising is legitimate -- but he is far less generous about Schneider's methods and conclusions, suggesting that the study is framed to grab headlines rather than reflect the truth.

According to Adelman, the Education Department's Integrated Postsecondary Education Data System, which Schneider used to determine the dropout rates, is incapable of providing accurate information to produce those numbers, because it covers only a narrow group of students and tracks them only to the extent they remain enrolled at their original institutions.

"You get only first-time, full-time students who enrolled in the fall semester (not winter or spring) who showed up at the same school (not somewhere else) as full-time students (not part-time) in the next fall semester (not winter or spring)," Adelman said via e-mail. "This data story, already distant from the realities of student attendance patterns by three galaxies, is further compounded by a state level analysis which pretends that students never, never cross state lines to attend a second school."

While other Education Department databases have their own flaws, Adelman argues, the transcript-based National Education Longitudinal Study suggests that 94 percent of first-time fall enrollees at four-year colleges enroll in some kind of postsecondary institution at some point the next year. There are many things those data don't reveal -- how many credits those students pursued at their second institution, for instance, or what happens at the state level -- and the numbers are old.

But "despite its shortcomings, it’s a lot better than IPEDS, and can give us an honest story, not swill," Adelman said.

That's among the kinder things he says about Schneider's study.
