
As one becomes deeply involved in an issue – becomes expert, if you will – it’s hard not to see all the ways the broader public discussion of that issue gets distorted and developed in unproductive ways.

More plainly, I’m weary of some common tropes in education writing/reporting and would love to see them stop. Today I’ll focus on one:

Citing Academically Adrift as definitive proof that students don’t learn anything in college.

This particular sin is committed frequently, and two of the most common perpetrators happen to be Kevin Carey and Jeff Selingo, two of the most widely read higher education writers in the country. They’re also both believers in the “college is broken” narrative,[1] and as such often find Academically Adrift handy evidence with which to indict higher education.

For example, in a recent Washington Post piece, “Undergraduate Education Is Broken. Solutions Start with Faculty and Rigor,” Selingo drops in Academically Adrift as dispositive proof of the brokenness of undergraduate education:

A seminal study in 2011 that resulted in the book “Academically Adrift” found that one-third of college students made no gains in their writing, complex reasoning, or critical-thinking skills after four years of college. “American higher education is characterized by limited or no learning for a large proportion of students,” wrote authors Richard Arum and Josipa Roksa. For many undergraduates, they wrote, “drifting through college without a clear sense of purpose is readily apparent.”[2]

For some, somewhere along the way, Academically Adrift has become a kind of gospel, applied collectively to all schools and all students.

What seems to have been lost is that the conclusions of Academically Adrift are pegged to a single metric: student performance on the Collegiate Learning Assessment (CLA), a test that claims to measure critical thinking and analytical reasoning.

In the years since the release of Academically Adrift, the limits of the CLA seem to have been sanded away. For example, subsequent studies using the same CLA data, sliced in different ways, came to different conclusions about how much students “learn.”

There are also numerous critiques of the limits of the Academically Adrift methodology, as well as alternative studies measuring what students learn in college that show different results.

This is not to say that we should dismiss Academically Adrift; we shouldn’t. It is part of an ongoing conversation about teaching and learning in higher education that is unlikely to ever be settled. Such is the nature of scholarly inquiry. But this nuance seems to disappear when there’s a story to be told. Or sold, maybe.

There is also the underlying problem of the CLA as a tool to measure what students learn in college.

At its best, the CLA (now the CLA+) is described as a “good start” at providing a standardized assessment that measures a particular kind of critical and analytical thinking.

But the limits and criticisms of the test are numerous. For example, there is no incentive attached to the exam. In the immortal words of Bill Murray in Meatballs, “It just doesn’t matter.” Put a two-hour standardized exam in front of a college senior, tell them it doesn’t matter, and just imagine the effect on scores.

More importantly, the CLA+ “Performance Task,” which involves interacting with a number of “documents” in order to develop analytical responses to specific questions, is an example of what I call “snow globe” critical thinking, where the test creates a world that resembles ours, but must be hermetically sealed off from it lest the findings be tainted by prior or outside knowledge.

For example, the available sample exam posits a hypothetical political scenario regarding policing/drug enforcement, where the documents are supposed to help determine the appropriate stance towards the issue. 

To do well, test takers must ignore anything they know about the real world of politics or drug enforcement policy. The exam does require a kind of analytical thinking akin to the logic section of the LSAT, but this is not the be-all and end-all of critical thinking. It certainly is not a test that could capture anything a student has learned in college unless the college is focused on teaching techniques and strategies to do well on the CLA+.

To think critically we must be able to compare some new bit of information to our existing body of knowledge. To set aside what we know and have learned makes little sense. Meaningful critical thinking must also take into account and make room for our individual value systems.

When a test actually punishes existing knowledge, we have a limited test. That it is used as such a broad brush to indict college in general is simply dishonest.

If performance on the CLA+ is evidence that college is broken, the Citadel, which scores in the bottom 2% on the “value added” metric, is just about the most broken institution of them all, something no one seems to believe. 

At state flagship UT-Austin, seniors scored below freshmen.

The City College of New York, which has the second highest “mobility rate” in the country (moving graduates from the bottom 40% in income to the top 40% in income), has a negative value-added score on the CLA.

So does Ohio State.

How much faith should we be putting in this one test? How much faith do we put in one study that rests on this one test?

In the future, every time you read something like, “A seminal study in 2011 that resulted in the book Academically Adrift found that one-third of college students made no gains in their writing, complex reasoning, or critical-thinking skills after four years of college,” add the following disclaimer:

…according to a single analysis of data from the Collegiate Learning Assessment, a test whose reliability, and whose applicability to measuring student learning, are both highly contested.

I’m not holding my breath that our leading education journalists will start practicing such rigor, but one can hope.


[1] I am more of the “college has some significant problems, pretty much like anything else, but if we pay attention to the right things we can at least make those problems a little less problematic” school of thought. It’s much harder to put this philosophy in a snappy headline, which perhaps explains why Selingo and Carey are frequently published in major newspapers and I am not.

[2] I’m tempted to spend some time on how Selingo combines these two bits in a way that distorts their original meaning, but there are other fish to fry.
