CLA as 'Catalyst for Change'

November 14, 2011

A seven-year project in which dozens of small private colleges used a standardized measure of student learning to gauge and try to improve their performance accomplished many of its goals, increasing the colleges' focus on student learning outcomes and stimulating changes in practices on many campuses, says a report on the effort released today. 

"Over all, the initiative has involved a steady expansion -- one is tempted to say 'mushrooming' -- of efforts, changes, experimentation and conversation about ways to use outcomes assessment effectively to improve teaching and learning," says the report on the Council of Independent Colleges' experiment with the Collegiate Learning Assessment.

But the report, "Catalyst for Change: The CIC/CLA Consortium," cites few examples in which the changes clearly produced measurable gains in student learning, and some of the participating campuses have stopped using the exam because of lingering concerns about its validity.

The CLA, created by the Council for Aid to Education, aims to measure the extent to which institutions help their students improve their skills in critical thinking, analytic reasoning, problem solving and written communication, through a series of prompts and performance tasks. It has been heralded (critics might say hyped) as a new breed of standardized assessment measure that more authentically captures how graduates will be expected to work, and that can provide a "value added" score for institutions. (It is typically given to 100 freshmen and 100 seniors at an institution, small numbers that make it cost-effective but also raise questions about validity.)

The Council of Independent Colleges, which represents 600 mostly small private institutions, established its consortium in 2004 with grants from the Teagle Foundation to encourage its members to administer the performance-based CLA and to use the results, individually and collectively, to improve their teaching and learning practices. The consortium began with 14 members and grew to 47 by 2010, when the project formally ended.

The group's officials are quick to note that they undertook the project, voluntarily, before the national commission appointed by former Education Secretary Margaret Spellings pressed colleges to measure student learning outcomes more aggressively, and championed the CLA as a tool. (It also unfolded years before the publication early in 2011 of Academically Adrift, which examined CLA results and other data to argue that many students do not appear to be making learning gains in their first two years of college.)

"We didn't need mandates or government pressure" to explore ways to improve teaching and learning on our campuses, said Richard Ekman, the CIC's president. "Our institutions are interested in getting it right, and they have embraced the use of data to diagnose what is and isn't working and then changing their practices."  

Through many anecdotes from individual institutions, the report presents a generally (though by no means uniformly) positive view of the colleges' experiences using the CLA. Most importantly, CIC officials say, the institutions' use of the CLA focused the attention of the colleges -- and their various constituents, including faculty members who are often skeptical of standardized student measures -- on the potential value of measuring student learning outcomes. "It's a test that for faculty, particularly, was seen as having some validity to what they do in the classroom," said Harold V. (Hal) Hartley III, senior vice president at CIC, who oversaw the CLA project.

At North Carolina's Barton College, for example, faculty members had had a "general, unscientific impression that students were not giving us writing that was as articulate as it needed to be" to represent college-level work, said Kevin Pennington, an associate professor of biology and chair of the department of science and mathematics there.

The results of Barton students on the CLA's writing prompts "confirmed the stories we had shared with real hard data," he said, which helped give professors confidence in the test's validity. That experience led Barton to make improving student writing the focus of its Quality Enhancement Plan in its accreditation by the Southern Association of Colleges and Schools; using a locally created version of the CLA's writing exercise, Barton's professors have established a common rubric for evaluating student writing across the institution.

Professors at William Woods University, in Missouri, initially felt as if the administration had decided to use the CLA without faculty buy-in, and viewed the test with deep skepticism at first, said Sherry McCarthy, the provost there. But when the institution gave instructors incentives to embed the CLA's "performance tasks" in their course curriculums, they "came to see that that's what we would like our students to be able to do," she said. "Most of our faculty understand what the test is, what it measures, and they agree that it's good."

The embrace of the CLA was far from universal. Some institutions were troubled by concerns that have dogged the CLA from the start about how difficult it is to get students to take the test seriously, resulting in questions about whether the small samples of students who take the exam are representative enough to produce valid results. Faculty members at Drake University "doubted the validity of measuring something as fluid as critical thinking skills via a standardized instrument," and wide swings in year-to-year outcomes "make it difficult for many at [Texas Lutheran University] to place high confidence in CLA scores," the report says.

Now that the CLA project is over (and the financial support that institutions received to participate has ended), Juniata College has decided to stop giving the assessment to students every year, said the provost, James Lakso (though it will probably work the exam into a three-year cycle it uses for various learning measures). Professors there have two reservations, he said: "they don't really understand the methodology," and a testing expert there remains "unconvinced that he can tell the difference between the CLA prompt for writing and the prompt for critical thinking."

"Until people understand what's in the black box, you're not going to be able to get buy-in," said Lakso.

In addition to the innovations and changes that the use of the Collegiate Learning Assessment spurred on individual campuses, officials of the independent college group applauded the impact that the CLA project had on getting professors and administrators from different colleges to work together. "Common measures and common issues across institutions give the assessment process some measure of credibility," says the report on the experiment. "Having other institutions provide advice on everything from logistical challenges to the broadest ideas about curriculum and program creates a community of professional practice that makes it easier to improve an institution’s work. The work of the CIC/CLA Consortium provides a model of how undergraduate education can become more professionalized through shared understandings, measures, and practices."

Trudy W. Banta, senior adviser to the chancellor for academic planning and evaluation at Indiana University-Purdue University Indianapolis and co-author of Designing Effective Assessment: Principles and Profiles of Good Practice, said that she was heartened to see a group of institutions taking advantage of significant funding over an extended period of time to encourage each other to focus on student learning. "When faculty get together and focus on a measure or theories or good practice, good things can happen," she said in an interview.

But Banta, who has critiqued the use of standardized tests to compare colleges, noted that many other types of measures could have produced similar momentum on campuses and within a consortium like the CIC group.

"There are many ways to achieve that focus: using other instruments with national norms such as [the National Survey of Student Engagement or the Cooperative Institutional Research Program's freshman survey]; reading about and implementing on a pilot basis various approaches to improving writing or critical thinking across the curriculum; or taking a concept such as service learning and deciding how you would know if it improves students’ ability to apply their learning, their oral communication skills, or their effectiveness in working as a team member," she said via e-mail.

Banta also said she was struck, though, by how little evidence there was in the report that the participating institutions had seen any actual improvement over the term of the CLA experiment -- exemplifying how long it takes to develop proof that initiatives work in education.
