
It's hard to think of a study in the last decade that has had a bigger impact on public discourse about higher education and the internal workings of colleges and universities alike than has Academically Adrift.

The 2011 book, among the most extensive analyses of the extent and quality of college-level learning in many years, found that many students showed no meaningful gains on key measures of learning during their college years. The findings fed a burgeoning critique of higher education -- adding doubts about student learning to existing qualms about rising costs and degree completion -- and led many campuses to take a closer look at their students' performance and at the rigor and depth of their curriculums.

Two recent reports by a prominent researcher purport to challenge Academically Adrift's underlying conclusions about students' critical thinking gains in college, and especially the extent to which others have seized on those findings to suggest that too little learning takes place in college. The studies by the Council for Aid to Education show that students taking the Collegiate Learning Assessment made an average gain of 0.73 of a standard deviation in their critical thinking scores, significantly more than that found by the authors of Academically Adrift.

"The notion that college doesn't matter is inaccurate," Roger Benjamin, president of the Council for Aid to Education, said in an interview. (The council produces the CLA.) In the paper and a recent presentation of its data, Benjamin said that CAE's findings contrast with Academically Adrift's, though in the interview he sought to play down the extent to which his findings undermine those of the book.

Not much at all, one of the authors of Academically Adrift said in an e-mail message upon reviewing the CAE studies. Richard Arum, professor of sociology and education at New York University, noted methodological differences in how the two sets of data were drawn -- most significantly that the CAE study does not follow the same group of students over time -- and questioned the figures from Academically Adrift that Benjamin used to draw his conclusions.

He also took issue with the suggestion that his study, with the University of Virginia's Josipa Roksa, questioned the contribution that college makes to student learning ("Neither Roksa nor I have made such a claim," he wrote) -- although certainly some of the book's champions have done so.

First of Its Kind

The 2011 publication of Academically Adrift was noteworthy for several reasons. First, it was among the first studies to make use of results from the CLA or similar instruments -- part of a relatively new class of assessment tools that seek to measure the "added value" colleges impart to their students' learning by allowing students' performance to be compared over time.

Second, it was injected into the public discourse at a time of intensifying political and public criticism of higher education on a range of fronts, tied especially to rising tuition and student debt levels and to the apparent decline of the American college system in international comparisons of postsecondary attainment among 25- to 34-year-olds.

Making use of transcript data from 25 selective colleges that had given the same cohorts of students the Collegiate Learning Assessment at various points over the course of their college careers, the study found that significant minorities of students "did not demonstrate any significant improvement in learning" over two or four years of college, and that students on average showed only very limited gains in learning.

The findings punched some data-driven holes in what historically has been the seemingly unassailable strength of American higher education -- the quality and rigor of the learning required by professors and gained by students.

"How much are students actually learning in contemporary higher education? The answer for many undergraduates, we have concluded, is not much," Arum and Roksa wrote.

The study came under some criticism, mostly for depending so heavily on the Collegiate Learning Assessment as evidence of whether students have learned. But the findings by the well-regarded researchers prompted significant soul searching on the part of many college administrators and professors, providing evidence to confirm their worst fears. And as pointed as the authors' assertions were, they emboldened much more sweeping condemnations by those whose philosophical world views (namely, that higher education is broken) were reinforced by actual data.

"Academically Adrift ... documents the ugly underbelly of American higher education -- a culture that is anti-intellectual and that often produces students who have neither the skills or knowledge they will need to succeed after graduation," Anne Neal, president of the American Council of Trustees and Alumni, wrote in early 2011.

"College leaders have long excused decades of relentlessly rising prices, exploding student-loan debt, and alarmingly high dropout rates with the assumption that students are learning," Kevin Carey, who is now director of the education policy program at the New America Foundation, wrote in 2012. "The prices are reasonable and the loans repayable, they say, because of the skills and knowledge that students acquire in exchange. And while dropouts are regrettable, we are told, that's an unavoidable -- nay, admirable -- consequence of maintaining high academic standards.

"Academically Adrift exposed the bankruptcy of those assertions."

It was in response to commentaries like those -- in many ways more than to the more measured findings of Academically Adrift itself -- that CAE aimed its new reports, Benjamin said.

The council's study examines a much broader group of students, drawn from more than 1,300 colleges and universities that had used the CLA to test the critical thinking skills of groups of freshmen and seniors between 2005 and 2012. While there were significant differences among institutions, the "more robust" analysis found average gains of 0.73 of a standard deviation over several test administrations, with colleges showing "similar levels of growth regardless of sector (public v. private), institution size, Carnegie Classification, selectivity, or minority-serving status," the authors write.

"This stands in contrast to the findings of Academically Adrift," the authors add. "They suggest that there is little growth in critical thinking as measured by CLA. They report an effect size of 0.18, or less than 20 percent of a standard deviation," though the CAE paper notes that Arum and Roksa use different methods of estimating the growth, including tracking it over two years (rather than four).
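To make those figures concrete: an effect size of this kind expresses a mean score gain in standard-deviation units. The sketch below is purely illustrative -- the 100-point standard deviation is an invented scale, not either study's actual score data:

```python
# Illustrative sketch (invented scale, not CAE's or Arum and Roksa's raw
# data): an effect size here is the mean score gain divided by the
# standard deviation of scores, so 0.18 means students improved by less
# than a fifth of a standard deviation on average.

def effect_size(mean_gain: float, score_sd: float) -> float:
    """Express a raw score gain in standard-deviation units."""
    return mean_gain / score_sd

# Hypothetical CLA-like scale with a standard deviation of 100 points:
sd = 100.0
print(effect_size(18.0, sd))  # the 0.18 two-year figure cited from Academically Adrift
print(effect_size(73.0, sd))  # CAE's 0.73 estimate
```

On this reading, the two estimates differ only in the size of the average gain relative to the spread of scores, which is why the choice of comparison group and time span matters so much.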

Those differences are not to be minimized, Arum said in response. He said he was perplexed as to why Benjamin and his colleagues used the two-year figure in their analysis rather than the four-year figure also included in Academically Adrift, which showed a gain of 0.47 of a standard deviation -- still significantly smaller than, but closer to, the CAE finding.

Arum, in his e-mail, also noted that rather than follow the same cohort of students throughout their collegiate careers -- allowing for a direct comparison of how the same students performed on the CLA over time -- the CAE report "relies on cross-sectional data comparing a random sample of freshmen and seniors at a college. Given the high rates of attrition in higher education, such a methodology can result in upward bias of estimates. Roksa and I strongly prefer identifying growth from observing individual gains using longitudinal data."

Alexander Astin, the Allan M. Cartter Professor Emeritus at the University of California at Los Angeles, seconded Arum's view that the use of a cross-sectional rather than longitudinal comparison is significant, because at most institutions significant numbers of the entering freshmen will have dropped out. "[L]iterally hundreds of research studies over the years have clearly demonstrated that dropouts are not comparable to degree completers in several crucial respects: they are less well prepared academically (i.e., their school grades and test scores are lower), less motivated, have poorer study habits, more likely to be commuters rather than residents, more likely to be underrepresented minorities, and more likely to come from lower SES families," Astin said via e-mail. "Since each of these qualities, in turn, is likely to be associated with lower CLA scores, the study's findings could be entirely attributable to this methodological flaw."
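The selection effect Arum and Astin describe can be sketched with a toy simulation. Every number below (the dropout rule, the assumed true gain, the sample size) is invented for illustration and drawn from neither study; the point is only the mechanism: when weaker entrants drop out, a cross-sectional freshman-versus-senior comparison overstates the gain the same students would show longitudinally.

```python
# Toy simulation (all numbers invented) of the attrition bias described
# above: weaker students drop out before senior year, so the surviving
# seniors started from a higher baseline than the full freshman class.
import random

random.seed(0)
TRUE_GAIN = 0.3  # assumed real learning gain, in standard-deviation units

# Freshman scores drawn from a standard normal (mean 0, SD 1).
freshmen = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Students below -0.5 SD drop out 70 percent of the time; stronger
# students all persist -- a crude stand-in for attrition patterns.
completers = [s for s in freshmen if s > -0.5 or random.random() < 0.3]

# Longitudinal view: the same students are tested twice, so the
# measured gain is just the true gain.
longitudinal_gain = TRUE_GAIN

# Cross-sectional view: senior mean (completers plus their gain) minus
# the mean of ALL entering freshmen, dropouts included.
senior_mean = sum(completers) / len(completers) + TRUE_GAIN
freshman_mean = sum(freshmen) / len(freshmen)
cross_sectional_gain = senior_mean - freshman_mean

print(f"longitudinal gain:    {longitudinal_gain:.2f}")
print(f"cross-sectional gain: {cross_sectional_gain:.2f}")  # larger than the true gain
```

Under these assumptions the cross-sectional estimate exceeds the true gain by roughly the amount that the completers' starting scores exceed the full freshman class's, which is the upward bias Arum's e-mail warns about.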

Benjamin said that studies by CAE and others had found the use of cross-sectional rather than longitudinal data to be immaterial.

But in an e-mail response, he sought to play down his studies' differences from those of Academically Adrift. "I decided to make these data public because they suggest, overall, student learning growth in colleges is significant.

"However, until we carry out numerous studies building upon Academically Adrift, we will not have a complete enough evidence-based picture of student learning in the American collegiate landscape to permit both finer grained generalizations about the quality of student learning and also what actions we can take to continue to improve it.

"I only singled out Academically Adrift because it was a seminal attempt to examine the question of student learning on our campuses. We need many more such serious efforts."
