Assessing a Hot Assessment Tool

November 10, 2008

The Collegiate Learning Assessment appears to be passing the largest test to date of whether it can measure growth in student learning. But the study of the CLA also found that many minority students, and those who arrive poorly prepared for college, show smaller gains on the CLA -- potentially reinforcing the concerns some have about how the test may be used.

The CLA has emerged in the last year or two as a key response for colleges facing demands that they demonstrate the "learning outcomes" and "value added" that take place on their campuses. Students who take the CLA are asked to complete a series of exercises to measure critical thinking, analytic thinking and written communication. The test is offered to small, representative groups of students as freshmen and to other groups later in their college careers, in an attempt to measure growth in learning.

The theory behind CLA and similar assessment tools is that colleges need to get away from measuring their excellence only by "input" measures (students' incoming SAT scores, for example) or prestige and pay more attention to what actually takes place during college. So the CLA might find that Harvard University students have great skills upon arrival, but don't grow much, while students at other colleges see much more learning. When the Voluntary System of Accountability was announced a year ago by two groups of public universities, the CLA was designated as one of the tests that could be used to measure student learning in a comparable way.

That, of course, raises the question of whether the CLA can in fact measure growth in student learning, and the new research released Saturday suggests that it can. The study was based on tracking 2,300 students at 24 four-year colleges and universities, which were not named but included a broad range of institutions by standards of mission, competitiveness and demographics. The analysis was conducted by the Social Science Research Council. While the Council for Aid to Education, which runs the CLA, cooperated with the project, it had no control over the study or the release of its findings. For the students tracked, CLA scores and transcripts were analyzed at the beginning of the freshman year and at the end of the sophomore year. Additional studies are now planned as the students are tracked through the rest of their college careers and, perhaps, beyond.

Here are some of the findings to date:

  • Changes in learning can be measured and tracked, even after various educational and socioeconomic factors are taken into account, documenting that learning takes place at different rates at different institutions. That significant variation can be tracked among institutions is key: without such variation, the underlying "value added" premise of the CLA and similar tests wouldn't be valid.
  • When students perceive high expectations from faculty members, there is more growth on the CLA.
  • Students who concentrate their college coursework in traditional liberal arts fields such as mathematics, science, social sciences and humanities show greater gains in reasoning and communication skills than do students in education, human services or business.
  • Students who arrive in college with better preparation (as measured by high school grades and Advanced Placement scores) show greater gains than do other students.
  • Non-white students -- including Asian students -- show lower CLA scores upon arrival in college. Except for Latino students, non-white students show smaller gains on the CLA than do white students in the period of time studied (the first half of a college career).

Richard Arum, a professor of sociology and education at New York University and program director for education at the Social Science Research Council, said that the study was significant for "moving basic social science research, where you can look at value added, from K-12 education to higher education." Although this approach has become common in elementary and secondary schools, he said, it is "overdue" in higher education, and this research suggests that it can be done.

Similarly, Roger Benjamin, president of the Council for Aid to Education, said that this research helps "to push this new testing paradigm, which gets us beyond the multiple choice test and speaks to actual cognitive outcomes."

Some educators have worried that the CLA and similar tests would end up -- like traditional measures of educational excellence -- saying that flagship universities or elite liberal arts colleges do a better job than institutions that admit and work with students who have not been well prepared. The new research could well add to such fears, since it finds the greatest gains at institutions with a well-prepared student body in a traditional curriculum.

"That reality does give me concern," said Benjamin. But he added that the CLA also demonstrated that colleges do not perform equally well at reaching minority students or students without a solid high school education. Institutions that serve such students benefit "if we really identify and describe the obstacles" and then focus on why some colleges perform better, he said.

The purpose of the CLA, he said, is "to understand how a school understands where it stacks up, so that then they can improve skills in the classroom." The idea isn't to reduce colleges and their work to a number.

Of course, the concern of colleges and some testing critics is that, however sophisticated an analysis the CLA's creators envision, many politicians will look only for a number, and may not credit a college making arduous but important gains with disadvantaged students.

Robert Schaeffer, public education director of the National Center for Fair and Open Testing, said that his group's concerns about "value added" tests like the CLA "are less with the quality of those instruments than in how some proponents want the results to be used. Any attempt to impose one-size-fits-none measures on colleges and universities is sure to create even more problems than No Child Left Behind did in K-12 education."
