Data as a Tool to Improve Community Colleges

As major efforts to measure the effectiveness of two-year institutions mature, some institutions report significant, data-driven policy shifts to improve student learning.
May 27, 2008

One of the biggest fears many colleges have about efforts to assess what students learn or whether they complete their degrees is that the data won't actually produce meaningful change. The fear is that assessment efforts will just generate busywork and databases -- costing time and money without actually helping anyone.

Two major efforts to use data to improve community colleges have now been around long enough that some institutions are able to report on what they did with all of the data they produced. And at several sessions Monday at the National Institute for Staff and Organizational Development, college leaders presented results. NISOD, part of the Community College Leadership Program of the University of Texas at Austin, is one of the largest gatherings of community college leaders -- both those already at the senior levels and up-and-comers. In discussions of large-scale assessment projects, the tone seemed to shift at this year's NISOD meeting from "why data collection is important" to "what we are doing on campus with the numbers."

Officials of Zane State College, a two-year institution in the Appalachian part of southeastern Ohio, spoke about how they used the framework of Achieving the Dream to tackle retention issues, especially in the first year that students are enrolled. Achieving the Dream is a multi-year program in which selected community colleges receive support to use data to identify weaknesses and fix them.

Paul R. Brown, president of the college, said that when he arrived there a little more than four years ago, he found that many people looked at Zane State's student population -- low-income, with many students arriving in need of remedial education -- and assumed that "Appalachians don't appreciate higher education." In fact, he said, "most of those assumptions were just inaccurate."

Around the time the college joined Achieving the Dream, it hired its first institutional researcher and started digging into records in more detail. Overall, the graduation rates weren't viewed as embarrassing, but when college officials examined them closely, they found that retention was poor in students' first year and decent after that. So instead of viewing retention as OK across the board, they viewed it as weak in one crucial year and good thereafter -- a finding that prompted more data examination.

Among the discoveries: Students would take remedial English over and over again until they passed, but would give up quickly on remedial math -- and leave the college. About 20 percent of students never took placement tests. And the college was losing students with grade point averages of about 3.0.

Chad M. Brown, dean of health and public service programs at the college (and no relation to the president), characterized these as "tough questions" that made some people at the college uncomfortable.

Among the new efforts started at Zane in the last few years: a required one-credit course in which first-year students focus on the transition to college; consistent standards for the tests students take to pass out of remedial programs; standardized exit exams for English and math courses, to encourage consistent standards there; professional development programs, offered three times a year, for faculty members who teach first-year students; and an annual meeting at which those who teach remedial courses and those who teach college-level courses align expectations and standards.

While these and other efforts are still too young to have produced definitive results, Zane officials are encouraged by several developments. While many experts fear that remedial programs can be a dead end, the college is seeing evidence that some of its remedial students are finishing credit programs, and excelling. Of those who graduated in 2007, 47 percent took at least one remedial course, and of those who took remedial courses, 21 percent graduated with honors.

In another session, the topic of discussion was the Community College Survey of Student Engagement, which has just completed its fifth year. Like the National Survey of Student Engagement (for four-year colleges), CCSSE asks students a range of questions about their academic and non-academic experiences in college and gives colleges ways to compare their results with those of similar institutions.

CCSSE's senior associate, Christine McLean, led the audience through exercises to show how the project works. Participants broke into small groups to answer questions such as whether students come to class unprepared, or whether students report working with other students in or out of class. The faculty members and administrators were asked to identify the percentages they believed their colleges would receive, to indicate what they hoped the ratings would be, and to consider the significance of the gaps.

Using data to measure the difference between hopes and realities is a major theme of CCSSE. McLean reviewed real data with the audience showing that students regularly rank services such as academic advising and financial aid advising as crucial to their success -- and report using such services minimally. What does that mean, she asked? One possibility, she suggested, is that the data point to problems in providing the services in a timely way, especially for the part-time student population found at most community colleges. "When people have sat in line for two hours and then are told to go someplace else, they aren't going to sit in another line," she said.

This data-driven approach to program improvement is attracting more and more community colleges. In the five years that the survey has been given, the number of colleges participating has grown to 573 from 93.

Colleges that were early adopters cited a range of specific policies embraced as a result of reviewing survey results. Ali Esmaeili, an associate dean at South Texas College, said rules such as mandatory orientation for first-year students and limits on late registration grew out of analyzing questions related to retention. Further, data suggested that faculty members played a key role in academic advising, so the college has invested in training them as advisers.

Shirley Gilbert, special assistant to the president at El Paso Community College, said that a range of programs in local high schools were prompted by analysis of CCSSE and other data. College officials saw that the longer students spent in remedial programs, the greater the chance of losing them. This raised the question: What if they can avoid some or all of the remediation?

So the college started doing more of its remedial placement testing in high schools, briefing students on how they could avoid remediation later. Similar programs were created for older students thinking of enrolling -- those who might need only a little brushing up on math, which they could handle on their own, to avoid a semester of remedial math.

The idea of CCSSE, Gilbert said, is that "it forces you to use data."

