Measuring the Pulse of Liberal Education

Colleges need better tools to figure out if they are making a difference, study finds
November 7, 2005

How are American students doing at achieving the key goals of a liberal education?

That depends on whom you ask. According to the students, they’re doing pretty well. According to standardized tests, they’re coming up short. And the bottom line is that, based on available assessments, nobody really knows.

The Association of American Colleges and Universities on Friday released “Liberal Education Outcomes: A Preliminary Report on Student Achievement in College.” Clearer than any of the achievement data the report provides is its case for better methods of assessment. The report is part of a 10-year plan AAC&U launched in January to make sure students are getting the skills they need to thrive in a world constantly in flux.

The report identified broad areas that are “valued both by the academy and business leaders,” said Carol Geary Schneider, president of AAC&U: from long-standing pillars, like basic scientific knowledge; to burgeoning needs, like teamwork and information literacy; to virtues of the citizen, such as ethical reasoning and civic engagement.

The report used data from the 2004 National Survey of Student Engagement to assess whether students felt they had made progress during college toward the goals of a liberal education. Asked how they were doing, students collectively answered something like “fine.” Eighty-six percent reported learning “very much” or “quite a bit” in basic areas of knowledge. Seventy-seven percent said the same for “written skills,” and 87 percent for critical and analytical thinking. (The 2005 version of the report, being released today, has similar results.)

Data from 2003-4 from the Educational Testing Service and the ACT Collegiate Assessment of Academic Proficiency, however, did not necessarily agree with the students. The ACT CAAP math scores showed a decline from freshman year, when the students first took the standardized test, to senior year, when they took the same test again. Only 11 percent of seniors, according to the ETS data, were proficient at a level of writing that includes skills like recognizing the most effective revision of a sentence. The ACT data showed less than one standard deviation of gain in critical thinking through college, and, according to ETS, only 6 percent of seniors were “proficient” in critical thinking, a statistic that Ross Miller, director of programs for AAC&U, called “tough to take.”

Still, Miller noted that the achievement levels dictated by standardized tests are somewhat arbitrary, and students might actually be learning a lot. The problem, he said, is that there is a dearth of data that can be used to make a rigorous assessment. In seven of the key categories of achievement identified in the report, including quantitative literacy, information literacy, and intercultural knowledge, the report indicated: “no national data found.” Schneider said, “We know far too little about how we are doing as a nation.”

While few systems for national assessment exist, Miller noted that some institutions are doing an excellent job with internal assessments, and that in instances when such assessments change teaching practices, “the improvements in achievement are dramatic.”

At Carleton College, sophomores have to put together a writing portfolio with work from a variety of disciplines that is then assessed by faculty members from different disciplines. Miller said the portfolio not only helps students figure out where they need help, but breeds grading consistency because younger faculty members enter into dialogue with senior faculty members about assessing papers.

David J. Sill, associate provost at Southern Illinois University at Edwardsville, said the institution tried something resembling a writing portfolio, but that, with about 13,500 students and the disciplinary “silos” that come with institutions of that size, the project was not practical. Instead, the university has the Senior Assignment, in which students take on a project that develops a slew of skills. Sill said that one of the best assignments involves 25 art and design students taking a trip to Tlaxiaco in Oaxaca, Mexico, where the first language is Mixtec. The students learn traditional art forms from local craft workers. “They work in a team, and learn art in a new culture, with a language they can’t understand,” Sill said. “When they get there, they’re angry. By the third week, it’s a transformative experience every time.” After the trip, the students put on a public exhibit that is assessed by faculty members from multiple disciplines. Sill said that some of the students who went on the first trip 10 years ago are still doing art influenced by it.

Miller lauded the State Council of Higher Education for Virginia for its requirement that public institutions in that state put their own assessment tools online. Linda Cabe Halpern, dean of general education at James Madison University, said that humanities professors often resist assessment tools, because they think they can “reduce the depth of the educational experience.” But, she added, posting goals and assessment rubrics online has opened a conversation among James Madison faculty members, as well as with feeder community colleges as to how they can send along prepared students.

Some categories of achievement will likely present years of challenges for a true assessment. Miller noted that, according to student-engagement data, only 23 percent of seniors reported voting often in local, state, or national elections. “There’s no standardized test for civic engagement,” he said. “That’s best assessed over time.”

