The Higher Learning Commission placed MacMurray College on probation and extended the probation of Urbana University at the agency's June meeting, the accreditor announced. The president of MacMurray criticized the accreditor's action as "unjust," saying the agency failed to give the college sufficient time to fix problems created under a previous administration.
The commission also granted accreditation to the reborn Antioch College, in Ohio.
A national outcry regarding the cost of education and the poor performance of institutions in graduating their students has raised questions about the extent to which accreditors are fulfilling their mission of quality assurance. Politicians have expressed outrage, for instance, at the fact that accreditors are not shutting down institutions with graduation rates in the single digits.
At the same time, accreditors and others have noted that the graduation data available from the National Center for Education Statistics’ Integrated Postsecondary Education Data System, familiarly known as IPEDS, include only first-time, full-time student cohorts and, as such, are too limited to serve as the yardstick of institutional success -- or the basis on which accreditation is judged. But simply noting this problem does nothing to solve it. The imperative and challenge of getting reliable data on student success must be more broadly acknowledged and acted upon. The WASC Senior College and University Commission (WSCUC) has taken important steps to do just that.
As is well known, IPEDS graduation rates include only those students who enrolled as first-time, full-time students at an institution. Of the approximately 900,000 undergraduate students enrolled at institutions accredited by WSCUC, about 40 percent, or 360,000, fit this category. That means approximately 540,000 students in this region, including all transfer and part-time students, are unaccounted for by IPEDS graduation rate data.
The National Student Clearinghouse provides more helpful data regarding student success: it covers not only full-time student cohorts but also part-time students and students who combine the two modes, and its data include information on students who are still enrolled, have transferred and are continuing their studies elsewhere, or have graduated elsewhere. Six-year student outcomes, however, are still the norm.
Since 2013, WSCUC has worked with a tool developed by one of us -- John Etchemendy, provost at Stanford University and a WSCUC commissioner -- that allows an institution and our commission to get a fuller and more inclusive picture of student completion. That tool, the graduation rate dashboard, takes into account all students who receive an undergraduate degree from an institution, regardless of how they matriculate (first time or transfer) or enroll (full time or part time). It is a rich source of information, enabling institutions to identify enrollment, retention and graduation patterns of all undergraduate students and to see how those patterns are interrelated -- potentially leading to identifying and resolving issues that may be impeding student success.
Here’s how it works.
WSCUC collects six data points from institutions via our annual report, the baseline data tracked for all accredited, candidate and eligible institutions and referenced by WSCUC staff, peer evaluators and the commission during every accreditation review. On the basis of those data points, we calculate two completion measures: the unit redemption rate and the absolute graduation rate. The unit redemption rate is the proportion of units granted by an institution that are eventually “redeemed” for a degree from that institution. The absolute graduation rate is the proportion of students entering an institution who eventually -- a key word -- graduate from that institution.
The idea of the unit redemption rate is easy to understand. Ideally, every unit granted by an institution ultimately results in a degree (or certificate). Of course, no institution actually achieves this ideal, since students who drop out never “redeem” the units they take while enrolled, resulting in a URR below 100 percent. So the URR is an alternative way to measure completion, somewhat different from the graduation rate, since it counts units rather than students. But most important, it counts units that all students -- full time and part time, first time and transfer -- take and redeem.
Interestingly, using one additional data point (the average number of units taken by students who drop out), we can convert the URR into a graduation measure, the absolute graduation rate, which estimates the proportion of students entering a college or university (whether first time or transfer) who eventually graduate. Given the relationship between annual enrollment, numbers of units taken in a given year and the length of time it takes students to complete their degrees -- all of which vary -- the absolute graduation rate is presented as an average over eight years. While not an exact measure, it can be a useful one, especially when used alongside IPEDS data to get a more nuanced and complete picture of student success at an institution.
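The arithmetic behind this conversion can be sketched in a few lines. In the sketch below, the variable names and the closed-form conversion are our reconstruction from the description above, not WSCUC's published formulas, and the enrollment figures are hypothetical:

```python
# Illustrative sketch of the unit redemption rate (URR) and absolute
# graduation rate (AGR) arithmetic. Names, numbers and the closed-form
# conversion are a reconstruction for illustration, not WSCUC's official
# dashboard formulas.

def unit_redemption_rate(units_redeemed: float, units_granted: float) -> float:
    """Share of all units granted that were eventually redeemed for a degree."""
    return units_redeemed / units_granted

def absolute_graduation_rate(urr: float,
                             avg_units_per_graduate: float,
                             avg_units_per_dropout: float) -> float:
    """Estimate the share of entering students who eventually graduate.

    If G graduates average u_g units and D dropouts average u_d units, then
        URR = G*u_g / (G*u_g + D*u_d),
    and solving for G / (G + D) yields the expression below.
    """
    u_g, u_d = avg_units_per_graduate, avg_units_per_dropout
    return (u_d * urr) / (u_d * urr + u_g * (1 - urr))

# Hypothetical institution: 1,000 graduates averaging 120 units each,
# 600 dropouts averaging 30 units each before leaving.
granted = 1_000 * 120 + 600 * 30   # 138,000 units granted in total
redeemed = 1_000 * 120             # 120,000 units redeemed for degrees
urr = unit_redemption_rate(redeemed, granted)
agr = absolute_graduation_rate(urr, avg_units_per_graduate=120,
                               avg_units_per_dropout=30)
print(f"URR: {urr:.1%}, AGR: {agr:.1%}")  # → URR: 87.0%, AGR: 62.5%
```

Note that the estimated AGR of 62.5 percent matches the direct head count (1,000 graduates out of 1,600 entrants), which is the point of the conversion: the dashboard can recover a student-level completion rate from unit-level data.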
What is the advantage to using this tool? For an institution like Stanford -- where enrollments are relatively steady and the overwhelming majority of students enter as first-time, full-time students and then graduate in four years -- there is little advantage. In fact, IPEDS data and dashboard data look very similar for that type of institution: students enter, take roughly 180 quarter credits for an undergraduate degree and redeem all or nearly all of them for a degree in four years. For an institution serving a large transfer and/or part-time population, however, the dashboard can provide a fuller picture than ever before of student success. One of our region’s large public universities has a 2015 IPEDS six-year graduation rate of 30 percent, for example, while its absolute graduation rate for the year was 61 percent.
What accounts for such large discrepancies? For many WSCUC institutions, the IPEDS graduation rate takes into account fewer than 20 percent of the students who actually graduate. The California State University system, for example, enrolls large numbers of students who transfer from community colleges and other institutions. Those students are counted in the absolute graduation rate, but not in the IPEDS six-year rate.
Because the dashboard includes IPEDS graduation rate data as well as the percentage of students in the first-time, full-time cohort, it offers a better picture of an institution’s student population and of how reliable IPEDS data are as indicators of student success at that institution.
Here’s an example: between 2006 and 2013, at California State University Dominguez Hills, the IPEDS six-year graduation rate ranged from 24 percent to 35 percent. Those numbers, however, reflect only a small percentage of the university’s student population. The low of 24 percent in 2011 reflected only 7 percent of its students; the high of 35 percent in 2009 reflected just 14 percent. The eight-year IPEDS total over those years, reflecting 10 percent of the student population, was 30 percent.
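A multiyear IPEDS total like the 30 percent figure above is a cohort-size-weighted average of yearly rates, and can sit anywhere between the yearly extremes. A small sketch shows the arithmetic; the cohort sizes and rates below are hypothetical, chosen only for illustration, not Dominguez Hills' actual figures:

```python
# Cohort-size-weighted aggregate graduation rate across several years.
# The cohort sizes and rates are hypothetical, for illustration only.

cohorts = [
    # (first-time, full-time cohort size, six-year graduation rate)
    (900, 0.24),
    (1_100, 0.35),
    (1_000, 0.30),
]

graduates = sum(size * rate for size, rate in cohorts)
students = sum(size for size, _ in cohorts)
weighted_rate = graduates / students
print(f"Weighted multiyear graduation rate: {weighted_rate:.1%}")  # → 30.0%
```

Even so, this aggregate still describes only the first-time, full-time slice of the student body, which is why the dashboard's all-student measure can diverge from it so sharply.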
In contrast, looking at undergraduate student completion using the dashboard, we see an absolute graduation rate of 61 percent -- double the IPEDS calculation. Clearly, the dashboard gives us a significantly different picture of student completion at that institution.
And there’s more. To complement our work with the dashboard, WSCUC staff members have begun triangulating dashboard data with data from the National Student Clearinghouse and IPEDS to look at student success from various angles. We recognize that all three of these tools have limitations and drawbacks as well as advantages: we’ve already noted the limitations of the IPEDS and National Student Clearinghouse data, as well as the latter’s benefit of including transfer students and students still enrolled after the six-year period. In addition, the data from both IPEDS and the clearinghouse can be disaggregated by student subpopulations of gender and ethnicity, as well as by institution type, which can be very beneficial in evaluating institutional effectiveness in supporting student success.
Pilot work has been done to plot an institution’s IPEDS and dashboard data against the clearinghouse data, displayed as a box-and-whisker graph showing the regional distribution of graduation rates by quartile, which gives an indication of an institution’s success in graduating its students relative to peer institutions in the region. While care must be taken in interpreting these data, we believe that bringing them together in this way can be a powerful source of self-analysis, one that can lead to institutional initiatives to improve student completion.
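The quartile comparison described above can be approximated with the Python standard library. The peer graduation rates below are made-up values standing in for a regional distribution, and the quartile logic is a generic sketch, not WSCUC's pilot implementation:

```python
import statistics

# Hypothetical regional peer graduation rates (as proportions).
peer_rates = [0.28, 0.35, 0.41, 0.47, 0.52, 0.58, 0.61, 0.66, 0.73, 0.81]

# Quartile cut points: the three values behind a box-and-whisker plot's box.
q1, median, q3 = statistics.quantiles(peer_rates, n=4)
summary = {
    "min": min(peer_rates),
    "Q1": q1,
    "median": median,
    "Q3": q3,
    "max": max(peer_rates),
}

# Position one institution's rate (e.g., its absolute graduation rate)
# within the regional distribution: quartile 1 (bottom) through 4 (top).
institution_rate = 0.61
quartile = 1 + sum(institution_rate > cut for cut in (q1, median, q3))
print(summary)
print(f"Institution falls in quartile {quartile} of its regional peers")
```

A full dashboard would draw the actual box plot (for example with matplotlib), but the five-number summary above is all the information such a graph encodes.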
As noted, WSCUC has been working with the dashboard since 2013. While we are excited and encouraged by the tool’s ability to provide a more complete and nuanced picture of student success, we also recognize that we have a great deal of work ahead of us to make it as useful as we believe it can be. After two pilot projects involving a limited number of WSCUC-accredited institutions, the required collection of data from all WSCUC colleges and universities in 2015 revealed that many institutions had difficulty submitting the correct data. The dashboard can be somewhat difficult to understand, especially for institutions with large shifts in enrollment patterns. And unlike National Student Clearinghouse data, dashboard data, at least at this point, cannot be disaggregated to reveal patterns of completion for various student subpopulations.
Such issues notwithstanding, we are encouraged by the value of the dashboard that we have seen to date and are committed to continuing to refine this tool. WSCUC staff members have given presentations both regionally and nationally on the dashboard, including one to IPEDS trainers to show them the possibilities of this tool to extend the data available nationally regarding student completion.
We are hopeful that other accreditors and possibly the NCES will find the dashboard a useful tool and, if so, adopt it as an additional completion measure for institutions across the country. In any case, we will continue to do this work regionally so as not merely to complain about the available data but to contribute to their improvement and usefulness.
Mary Ellen Petrisko is president of the WASC Senior College and University Commission. John Etchemendy is provost of Stanford University.
The Association of Public and Land-grant Universities on Wednesday said it backed an approach to accreditation in which agencies focus more time and energy on colleges that have problems than on those that don't. The U.S. Department of Education also recently said it supports risk-based accreditation.
“Institutions with weaker outcomes often need a more careful review than schools with stronger track records that end up almost always being reaccredited to no one’s surprise,” said Peter McPherson, the group's president, in a written statement. “A risk-based accreditation process would ensure all schools are subject to a review, but that schools with a strong track record do not need to go through the same extensive and expensive review process as those with weak programs and poor student outcomes.”
An analysis of key actions 10 institutional accrediting agencies took over five years found a "highly uneven and inconsistent system of sanctions." The report from the Center for American Progress, which has previously chided accreditors for their oversight of poor-performing colleges, found that national accreditors are more likely to sanction their member colleges, but that regional agencies keep institutions on sanction for longer periods of time. The group recommended "clearer, common rules of the road about sanction terminology, definitions and use."
Growing up in a low-income family, David Machado knew he would have to find creative ways to pay for college.
After graduating from high school in Florida in 2004, he joined the U.S. Navy for the Post-9/11 GI Bill benefits and a chance to gain medical experience as a hospital corpsman. And when he went into the reserves in 2010 to have more time to focus on his education, he enrolled in community college, first in North Carolina and then in Connecticut.
Though he had been planning to transfer to a state school or the University of Connecticut, an English teacher convinced him Wesleyan University in Middletown, Conn., would be a good fit, allowing him to pursue his passions for poetry and painting and his childhood goal of becoming a doctor.
“I fell in love with writing and what he taught, and he’d talk about Wesleyan,” said Machado, now 29.
But his road to transfer wasn’t always smooth. He didn’t find out about a program for automatic transfer to UConn until he had too many credits to qualify. His community college adviser didn’t answer his emails, so he had to drop into his office to get help. Eventually he gave up on the adviser, relying instead on the advice of professors and others, who led him to other opportunities like a summer medical education program at Yale.
Still, he didn’t always take the right classes in his two years in community college.
“I didn’t understand the transferability of classes at the time, so I was just taking classes that would be of interest and would satisfy the pre-med requirements,” Machado said. Because many of his classes only transferred as electives, and some as three credits instead of four, Machado entered Wesleyan as a sophomore.
Though as many as 80 percent of community college students want to transfer, a study by the Community College Research Center, the Aspen Institute and the National Student Clearinghouse Research Center released in January found that only 14 percent of degree-seeking students earned a bachelor’s within six years. And research has found many pitfalls in the process of transferring from a community college to a four-year school.
Frequently, students at community colleges are advised to take courses that end up not being accepted by the local four-year campus. When courses transfer, many are accepted only as electives and do not count toward the students’ majors. In other instances, the prerequisite courses students need to transfer with junior standing aren’t offered in a given term, and so students either lose time waiting to take the courses or have to transfer and take them at the higher university cost. Research conducted by Public Agenda on the student experience of transfer found that a number of recurring themes are embedded in the stories of students like the one above:
Well-meaning but overwhelmed and underprepared general advisers at community colleges who lack the time and resources to provide students with correct and up-to-date information about degree pathways;
Faculty advisers who are critically important but dangerously siloed;
Diffuse and scattered information resources on transfer that students have difficulty accessing or effectively navigating;
A lack of clear programs of study that carry through the community college into the four-year institution and through graduation;
Insufficient or dysfunctional channels of communication between faculty and staff within and across two-year and four-year institutions, fueled by institutions’ cultural histories of suspicion and competition.
For first-generation and lower-income students, unconfident learners and students who lack clear goals, the stakes of these challenges are particularly high. Public Agenda research found that community college students often blame themselves for the barriers they face in seeking to transfer. Students not only lose time and money as they attempt to navigate broken systems; they also lose hope in their ability to make a better life through education.
In focus groups conducted by the Center for Community College Student Engagement at the University of Texas at Austin, students shared some of their frustrations with the transfer process.
I’d rather look for myself than ask for somebody to answer the questions, because I’ve had cases where those questions weren’t answered correctly, and since they’re not answered correctly it’s a big, big mistake. … If you miss a deadline because somebody answered your question wrong, you start getting skeptical about the advice you’re getting.
A quote from Public Agenda’s research captures the hope deficit that is created through the problems community college transfer students face.
I’m getting tired of school. I had a plan and thought I was doing everything right, and everyone I talked to [at the school] seemed so sure they were giving me the right information, so I never questioned it because I had no idea what I was doing. But here I am and I’ve probably lost two whole semesters taking classes I didn’t need or that ended up not transferring or counting toward my major. I don’t even want to think about the money I lost, because I couldn’t afford to lose it … at this point, honestly, I don’t know if I’m ever going to finish. I’m just getting tired.
The stories of transfer students show the dogged persistence needed to make it.
Jordan Kratz came out of high school in 2012 planning to be a veterinary technician. She chose SUNY Canton in northern New York for its specialized curriculum. But by the spring of her second year, Kratz, from Ballston Spa, N.Y., decided she didn’t want to work with animals full time and applied to transfer to Ithaca College.
“I actually did a total flip,” she said in a recent interview. “I’m in communications management and design.”
Kratz, now 21, dived into research on four-year colleges with the help of her parents and advice from friends. She didn’t turn to her adviser, who was a veterinarian experienced in helping students going to veterinary school.
“I didn’t know if he would have the advice for me that I was looking for,” she said.
The Ithaca admissions office was helpful, answering questions and offering tours, but it wasn’t until she enrolled that she got the full story on how her Canton credits would apply to requirements at Ithaca. Because Ithaca has a very specific core curriculum, many of Kratz’s credits only transferred for general credit.
“On my transcript it just says, ‘transfer elective,’” she said. “It doesn’t even say what the course was.”
In order to catch up, she has to take a series of courses in humanities, creative arts, social sciences and diversity on top of the upper-division courses in her major. But because she has senior standing, the registration system locks her out of the core classes designated for freshmen and sophomores.
“I’m actually having a hard time getting into them as a transfer student,” she said. By the time she files the override paperwork and, if that fails, appeals to the dean, the classes are full.
“You would think when they know you’re a transfer student they would override you into those classes,” Kratz said.
With four more core classes to go, in addition to other requirements, she’s hoping to graduate in the spring of 2017. By then she will have many more credits than she needs to graduate, even after having taken a semester off as she transferred.
“If I did the typical four years in college I should graduate this May,” she said.
Creating the conditions for more students to successfully transfer with junior standing in their majors is the collective work of institutions, systems and policy makers. Students share in the responsibility, but systems need to work better for the majority of students who come to community college with fewer supports and less confidence than Kratz.
As institutional leaders and policy makers seek to diagnose and address a tremendous host of challenges facing transfer students, elevating the voices and perspectives of students themselves is an essential piece of the work to be done.
Alison Kadlec is senior vice president and director of higher education and workforce programs at Public Agenda. Elizabeth Ganga is a communications specialist at the Community College Research Center at Columbia University's Teachers College.