Among bachelor's degree recipients in 2007-8, those who began their postsecondary educations at a community college took almost 20 percent longer to complete their degrees than did those who started out at a four-year institution. Those who began at four-year private colleges finished faster than did those at four-year public and for-profit institutions. And those who delayed entry into college by more than a year after high school took almost 60 percent longer to complete their degrees than did those who went directly to college.
America's smaller colleges and universities are rarely given much chance of victory in the NCAA basketball tournament. But they call it March Madness for a reason, in large part because of the upsets, when an underdog takes on a heavy favorite and wins. Bucknell has 3,500 students, but last year we enjoyed the thrill of taking on a far larger school and succeeding when we upset Kansas in the first round.
Since we earned a second straight bid to the NCAA men's basketball tournament last week, the media have praised our players' talent and tenacity. The Los Angeles Times described Bucknell as the "Duke of the Susquehanna." Added the Arkansas Democrat-Gazette: "Think (of a) bigger, stronger, more talented and more athletic Princeton."
As Bucknell's president, I can tell you we mean what we say at Bucknell -- on the court and, more important, in the classroom. Our basketball success has demonstrated that impressive academic and athletic achievements are not mutually exclusive. In fact, athletics directors and academic administrators at colleges and universities around the nation need to understand the new realities of Division I basketball competition.
Earlier this week, the Institute for Diversity and Ethics in Sport at the University of Central Florida released yet another study showing a disturbing disparity between basketball players' and all students' graduation rates. Of the 65 teams competing in the Big Dance, only Bucknell could boast a 100 percent graduation rate (one team, Penn, does not report such data). I want to suggest, however, that rather than remaining an anomaly in Division I athletics, Bucknell's program -- and many in the Patriot League -- be taken as a model for the next big thing in college sports.
That is, our Bison have confirmed that fielding a team of smart players can create a competitive advantage.
There are enough bright students with basketball skills who want to play major college basketball to permit schools like Bucknell, with a driving focus on quality education, to succeed in the NCAA tournament.
Consider Kevin Bettencourt, who scored a game-high 23 points in last week's Patriot League tournament final. He's an American history major who chose Bucknell for its "great academic reputation along with Division I athletics." Chris McNaughton, the 6-11 center whose graceful hook shot sent Kansas home early last March, traveled all the way from Germany to avail himself of Bucknell's nationally celebrated electrical engineering program. Kevin, Chris, and Patriot League Player of the Year Charles Lee each earned a GPA of 3.4 or better this fall.
When we recruit students like these, they choose us because they receive a great education and a great basketball opportunity, yet they always know that academics will come first. On Friday, Arkansas will meet a Bucknell team populated with future scientists, engineers, writers, and businessmen who, happily, also love to play basketball.
In the future, being a "big time" sports school is going to provide less competitive advantage than it used to, in part owing to the NCAA's academic reform plan. Bucknell supports reform efforts because we want all students, not just ours, to graduate with a solid education that prepares them for life.
Also, the schools that traditionally have dominated television coverage now have competition for viewers. The championship games of all the conferences -- not just the ACC, SEC, Big East, or Big Ten -- are being broadcast. And the trend is accelerating with broadband coverage and new stations such as ESPNU and CSTV.
As the lesser-known basketball programs enjoy greater exposure, quality players will increasingly opt to attend institutions like Bucknell, knowing they will have a reasonable shot at two hours (or more) of fame every March. But more important, they will join the ranks of those alumni who are CEOs, COOs, university professors, doctors, and lawyers, ensuring far more than two hours in the spotlight.
Just ask Les Moonves, a 1971 Bucknell graduate. He runs CBS, and CBS runs all the Big Dance games.
Brian C. Mitchell
Brian C. Mitchell is the president of Bucknell University in Lewisburg, Pa.
Of all the ideas to come out of Margaret Spellings's Commission on the Future of Higher Education, the most contentious inside the DC Beltway is the final report's proposal for a unit-records database. There are plenty of other controversial ideas floated in the commission's hearings, briefing papers, and report drafts, but the one bureaucratic detail that most vexed private colleges and student associations over the past year is the idea that the federal government would keep track of every student enrolled in every college and university in the country. Given reports this year about the Pentagon hiring a marketing firm to collect data on teens and college students, the possibility that Big Brother would know every student's grades and financial aid package has worried privacy advocates.
Fortunately, privacy and accountability do not need to be at odds.
The proposal for a unit-records database was floated in a 2005 report commissioned by the U.S. Department of Education. Advocates have argued that the current system of reporting graduation data through the Integrated Postsecondary Education Data System (IPEDS) captures only the experiences of first-time, full-time students who stay in a single college or university for their undergraduate education. How do we capture the experiences of those who transfer, or those who accumulate credits from more than one institution? Theoretically, we could trace such educational paths by tracking individuals, using Social Security Numbers or other identifiers to link records.
Charles Miller, who led the Spellings commission, was one of the unit-records database advocates and pushed it through the commission's deliberations. Community-college organizations liked the idea, because it would allow them to gain credit for the degrees earned by their alumni. But the National Association of Independent Colleges and Universities, the U.S. Student Association, and other organizations opposed the unit-records database, and in its current form the proposal is certainly dead on arrival as far as Congress is concerned.
There are three problems with a unit-records database. The first is privacy. I just don't believe that the federal government would keep my children's college student records secure. An October report by the House Committee on Government Reform documents data losses by 19 agencies, including financial aid records for which the U.S. Department of Education is responsible. Who trusts that the federal Department of Education could keep records safe?
The second problem is accuracy. I have worked with the individual-level records of Florida, which has had a student-level database in elementary and secondary education since the early 1990s. If any state could have worked the kinks out, Florida should have. But the database is not perfectly accurate. I have seen records of first graders who are in their 30s (or 40s) and records of other students whose birthdays (as recorded in the database) are in 2008 and 2010. The problem is not that the shepherds of the database are incompetent but that the management task is overwhelming, and there are insufficient resources to maintain the system. Poorly paid data-entry clerks spend their time entering students into the rolls and recording grades, withdrawals, and dozens of other small bits of information. We could probably have a nearly perfect unit-records system if we were willing to spend billions of dollars on maintenance, editing, and auditing. In all likelihood, a unit-records system for all of higher education in the U.S. would push most of the costs onto colleges and universities, with insufficient resources to ensure complete accuracy.
The third problem with such a database is that the structure and size would be unwieldy. Florida and some other states have extensive experience with unit records, and very few researchers use the data that exist in such states. The structures of the data sets are complicated, and beyond the fact that using the data taxes the resources of even the fastest computers, the expertise needed to understand and work with the structures is specialized. Such experts live in Florida's universities and produce reports because they are the experts. But few others are. There would be no huge bonanza of research that would come from a national unit-records database.
A Solution: Anonymous Diploma Registration
Most of the problems with the unit-records database proposal can be solved if we follow the advice of statistician Steven Banks (from The Bristol Observatory) and change the fundamental orientation away from the question, Who graduated? and toward the question, How many graduated? The first question requires an invasion of privacy, expensive efforts to build and maintain a database, and a complex structure for data that few will use. But the second question -- how many graduated? -- is the one to answer for accountability purposes. It's the question that community colleges want answered for their alumni. And it does not require keeping track of enrollment, course-taking, or financial aid every semester for every student in the country.
All that we need is the post-graduation reporting of diploma recipients by institutions, with birthdates, sex, and some other information but without personal identifiers that would allow easy record linkage. Such a diploma registration system would fit with the process colleges and universities already go through in processing graduations. An anonymous diploma registration system could also identify prior institutions -- the high schools from which students graduated and the other colleges where they earned credits that transferred and counted toward graduation. Such an additional part of the system could be phased in, so that colleges and universities record the information when they evaluate the transcripts of transfer students and other applicants. The recording of prior institutions would address the need of community colleges to find out where their alumni went and how many graduated with baccalaureate degrees.
Under such a system, any college or university could calculate how many students graduated and the average time to degree (as my institution in Florida already can). Any college or university could also count how many of its students who transferred to other institutions eventually graduated. High schools would be able to identify how many of their own graduates finished college, whether at in-state or out-of-state institutions. Institutions could figure out what types of programs helped students graduate, and the public would have information that is more accurate and fairer than the current IPEDS graduation statistics. All of these benefits would come without having to identify a single student in a new database.
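To make the bookkeeping concrete, here is a minimal Python sketch of how an anonymous diploma registry might be queried. The record layout (institution, entry year, graduation year, prior institutions) is an illustrative assumption, not a proposed federal schema; the point is that nothing in it identifies an individual student, yet the "how many graduated" questions are all answerable.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical record layout for an anonymous diploma registry; the
# fields are assumptions for illustration, not a proposed schema.
# Nothing here identifies an individual student.
@dataclass
class DiplomaRecord:
    institution: str                # degree-granting institution
    entry_year: int                 # year the student first enrolled there
    grad_year: int                  # year the degree was awarded
    prior_institutions: tuple = ()  # high school, transfer colleges, etc.

def completion_summary(records, institution):
    """How many graduated, and average time to degree, for one school."""
    grads = [r for r in records if r.institution == institution]
    times = [r.grad_year - r.entry_year for r in grads]
    return {"graduates": len(grads),
            "avg_years_to_degree": mean(times) if times else None}

def transfer_completions(records, sending_institution):
    """How many alumni of a sending institution (say, a community
    college) later earned a degree elsewhere."""
    return sum(1 for r in records
               if sending_institution in r.prior_institutions)
```

Because each record is created once, at graduation, there is no per-semester tracking burden, and a community college can learn how many of its alumni finished baccalaureate degrees by counting records that list it as a prior institution.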
A short column is not the place to describe the complete structure for such a system or to address the inevitable questions. I am presenting the idea in more depth this afternoon at the Minnesota Population Center, and I have established an online tutorial describing the idea of anonymous diploma registration in more detail. But I am convinced that the unit-records database idea is wasteful, dangerous, and unnecessary. Anonymous diploma registration is sufficient to address the most critical questions of how many graduate from institutions, and it does not threaten privacy.
Now, there are some strings. As the Tucson Citizen notes, "It doesn't hold if students change majors midway through college or drop or flunk several courses. A few majors, such as engineering, are excluded because some students need to take pre-college math courses that can extend graduation beyond four years." So, do it right, make no changes, make no mistakes, and you can move efficiently through the university.
As someone who has to report to my university’s provost about what we will do to get our students to graduate in four years, I am sensitive to this newest fad. It affects how our institutions will be ranked and how parents will select the perfect place for their children to study. Yet, as a five-year undergrad myself, I am not sure why this is even a good goal. Yes, our federal loan money, and our state subsidies, will go to more students if we can push them through, but that is exactly what we would be doing ... pushing. And is that what we are here to do? For that matter, is efficiency a worthwhile measure of a college? Of a student?
When I attend events to recruit new students, I rejoice in those who don't know what they want to do. They come to the experience open for adventure, exploration, excitement, and challenge. I tell them that they will probably do better than those who have their future planned out. Why? Because most students change their majors. And, at a public university like mine, students are even more likely to change their majors than their private college counterparts.
Why do students change their majors? I think it is because students have little idea about (a) what jobs exist, (b) what majors correspond with what jobs, (c) what they are good at, and (d) what course of study would best use their abilities.
Hell, when I attend college major recruitment fairs, almost all the students and their parents line up for business, pre-med, and pre-law. (Working class folks tend to go for health sciences and business, because they hear there are jobs there.) I am tempted to just hand out fliers that say, "Business majors have to take accounting and advanced math. Pre-med (and health sciences) folks have to take a LOT of science courses... with labs! When you find you don't like those courses, or you fail a few of them because you actually have no special ability in advanced math or science, come check us out!"
That is how we get our majors, for the most part: the students realize that they picked a major for some bogus reason -- they knew someone who had X job and made a lot of money -- and, as they take more classes in that area, they realize that it is not what they originally thought or that it does not suit them. Then they look for something that actually suits their interests and talents. So the parents who pushed them into their original major gnash their teeth and complain when their children have to take additional courses to meet our requirements, which differ from those of the original major, and their time is extended. Yet, while this can be more costly, it is a bargain in the long term. Better to make the change in undergrad than to figure out, after earning the degree, that you are ill-suited for the professions for which you were prepared.
So, among those who don't finish in four, we first have the confused. Add to this number the students who party too much, who attend a college that doesn't suit them (that was my error), who have adjustment issues transitioning to undergraduate life, whose mental illness expresses itself during college, who have personal traumas in their lives (also my issue), whose families face financial downturns, who face discrimination or harassment, and/or who just bomb a class or two. Suddenly, our numbers look terrible! See how few students we graduate in four years!?! (And we aren't even counting the transfer students -- the year-to-degree numbers only count students who entered as freshmen. If we included those folks in our numbers, we would see how few students really graduate in four years.)
If we still have a perverse need to measure time-to-degree rates, we should extend the bar to six years of full-time study, as we do for athletes and for some federal reporting requirements. (Athletes are not the only ones balancing academics with other interests!) We should exclude students who move to part-time status from our count. But I would hope that we would not use these data to rate institutions.
Finish in four sends the wrong message. It says that college is simply utilitarian, a means to a financial end. We should recognize that college is not high school. It is about self-discovery, the investigation of different majors and fields, and intellectual exploration and development. Let's reject this fad and focus on the long-term goals: producing graduates who can write, read, and think critically, and who can contribute to our society.
Lesboprof is the pseudonym of a faculty member and administrator at a public university in the Midwest where the official line is that four years and out is a good thing.
Education Secretary Margaret Spellings recently wrote a letter to the editor of The Detroit News in defense of her higher education commission's proposal for a national "student unit record" system to track all college entrants to produce a more accurate picture of degree completion. "Currently," she said, "we can tell you anything about first-time, full-time college students who have never transferred -- about half of the nation's undergraduates." It took a long time to bring Education Department officials to a public acknowledgment of what its staff always knew: that the so-called "Congressional Methodology" of our national college graduation rate survey doesn't pass the laugh test. If the Secretary's Commission on the Future of Higher Education made one truly compelling recommendation, it was for a fuller and better accounting through student unit records.
But it was well known that the establishment of a national student unit record system was a non-starter in Congress due to false worries about privacy and data security. So one wonders why the department hasn't simply proposed a serious revision of the process and formula for determining graduation rates. Having edited and analyzed most of the department's postsecondary data sets, may I offer an honest and doable formula?
There are four bins of graduates in this formula, and they account for just about everyone the Secretary justly wants us to count. They count your daughter’s friends who start out as part-time students -- who are not counted now. They count your 31-year-old brother-in-law who starts in the winter term -- who is not counted now. They count active duty military whose first college courses are delivered by the University of Maryland’s University College at overseas locations -- who are not counted now. They count your nephew who transferred from Oklahoma State University to the University of Rhode Island when he became interested in marine biology -- and who is not counted now. And so forth. How do you do it, dear Congress, when you reauthorize the Higher Education Amendments this year?
First, define an "academic calendar year" as July 1 through the following June 30, and use this as the reference period instead of the fall term only. Second, define the tracking cohort as all who enter a school (college, community college, or trade school) as first-time students at any point during that period and who enroll for 6 or more semester-equivalent credits in their first term (thus excluding incidental students).
Automatically, institutions would be tracking students who enter in winter and spring terms and those who enter part-time. Your brother-in-law, along with other non-traditional students, is now in the denominator along with your daughter. Ask our colleges to divide this group between dependent traditional age beginners (under age 24) and independent student beginners (age 24 and up), and to report their graduation rates separately. After all, your daughter and your brother-in-law live on different planets, in case you haven’t noticed. You now have two bins.
Third, establish another bin for all students who enter a school as formal transfers. The criteria for entering that bin are (a) a transcript from the sending institution and (b) a signed statement of transfer by the student (both of which are usually part of the application protocol). These criteria exclude the nomads who are just passing through town.
At the present moment, community colleges get credit for students who transfer, but the four-year colleges to which they transfer get no credit when these transfer students earn a bachelor’s degree, as 60 percent of traditional-age community college transfers do. At the present moment, 20 percent of the bachelor’s degree recipients who start in a four-year school earn the degree from a different four-year school. That we aren’t counting any of these transfers-in now is a travesty -- and makes it appear that the U.S. has a much lower attainment rate than, in fact, we do. All this hand-wringing about international comparisons that puts us on the short end of the stick just might take a different tone.
Fourth, ask our postsecondary institutions to report all students in each of the three bins who graduate, at two intervals: for associate-degree-granting institutions, at 4 years and 6 years; for bachelor's-degree-granting institutions, at 6 years and 9 years. For institutions awarding credentials below the associate degree, a single two-year graduation rate will suffice. Transfers-in are more difficult, because they enter an institution with different amounts of credit, but we can put them all on the same reporting schedule as community colleges, i.e., 4 and 6 years.
These intervals will account for non-traditional students (including both active duty military and veterans) who move through the system more slowly due to part-time terms and stop-out periods, but ultimately give due credit to the students for persisting. These intervals will also present a more accurate picture of what institutions enrolling large numbers of non-traditional students, e.g. the University of Texas at Brownsville, DePaul University in Chicago, and hundreds of community colleges, actually do for a living.
Colleges, community colleges, and trade schools have all the information necessary to produce this more complete account of graduation rates now. They have no excuse not to provide it. With June 30 census dates for both establishing the tracking cohort and counting degrees awarded, the algorithms are easy to write, and data systems can produce the core reports within a maximum of two months. It's important to note that the tracking cohort report does not replace the standard fall term enrollment report, the purposes of which are very different.
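As a rough illustration of how easy those algorithms are to write, here is a hedged Python sketch of the cohort and bin logic described above. The field names are assumptions about what a registrar's data system would carry, not an official reporting schema, and the degree-counting census date is one plausible reading of the proposal.

```python
from datetime import date

# Hedged sketch of the proposed formula. Field names ("entry_date",
# "credits_first_term", "is_transfer", "age_at_entry", "grad_date")
# are illustrative assumptions, not an official reporting schema.

def cohort_bin(student, year):
    """Assign a student to a reporting bin for the academic calendar
    year running July 1 through the following June 30, or None if the
    student falls outside the tracking cohort."""
    start, end = date(year, 7, 1), date(year + 1, 6, 30)
    if not (start <= student["entry_date"] <= end):
        return None
    if student["credits_first_term"] < 6:  # incidental student, excluded
        return None
    if student["is_transfer"]:             # bin 3: formal transfers-in
        return "transfer-in"
    # bins 1 and 2: traditional-age vs. independent beginners
    return "traditional" if student["age_at_entry"] < 24 else "independent"

def graduation_rates(students, year, intervals=(6, 9)):
    """Graduation rate per bin at each reporting interval (bachelor's
    default: 6 and 9 years). Degrees are counted through a June 30
    census k years after the cohort window closes -- one plausible
    reading of the proposal, not the only one."""
    rates = {}
    for bin_name in ("traditional", "independent", "transfer-in"):
        cohort = [s for s in students if cohort_bin(s, year) == bin_name]
        for k in intervals:
            census = date(year + 1 + k, 6, 30)
            grads = sum(1 for s in cohort
                        if s["grad_date"] is not None
                        and s["grad_date"] <= census)
            rates[(bin_name, k)] = grads / len(cohort) if cohort else None
    return rates
```

The fourth bin (students found at other institutions through the National Student Clearinghouse and similar sources) would be reported separately, since those students are counted toward progress rather than credentials.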
But there is one more step necessary to judge institutions' contribution to the academic attainment of the students who start out with them.
So, in rewriting the graduation rate formula in the coming reauthorization of the Higher Education Amendments, Congress should also ask all institutions to make a good faith effort to find the students who left their school and enrolled elsewhere to determine whether these students, too, graduated. The National Student Clearinghouse will help in many of these cases, the Consortium for Student Retention Data Exchange will help in others, state higher education system offices will help in still others, and we might even get the interstate compacts (e.g. the Western Interstate Commission on Higher Education) into the act. Require our postsecondary institutions to report the students they find in a fourth bin. They will not be taking credit for credentials, but will be acknowledged as contributing to student progress.
No, this is not as full an account as we would get under a student unit record system, but it would be darned close -- and all it takes is a rewriting of a bad formula.
After 27 years of research for the U.S. Department of Education, Clifford Adelman recently left to be a senior associate at the Institute for Higher Education Policy. His last monograph for the department was The Toolbox Revisited: Paths to Degree Completion from High School Through College (2006).
I am a faculty member, and so began my career with an almost inborn distaste for assessment, which seemed like the advanced jargon of administrators with a quixotic envy for corporate processes. The only model for assessment that I could think of was legislatively or decanally mandated, and therefore it smacked of make-work. Over the past two years, though, I've come around quite a bit, and now see assessment as both politically inevitable and pedagogically useful -- if done correctly. That it is politically inevitable doesn't mean it's wrong -- higher education should become more transparent to interested parties. Would you rather a legislator, donor, or prospective student base decisions on incomplete data, hearsay, and idiosyncratic assumptions? Of course not.
This essay is about a number, the kind of number that made me take an interest in assessment's possibilities. While John Lombardi is rightly skeptical about the National Survey of Student Engagement, which measures student satisfaction, there is a wealth of data in its surveys that, when appropriately framed, can help us think creatively about our work with students.
Like many regional comprehensive universities, the institution where I teach worries about its six-year graduation rates. Our mission of providing access to first-generation and other precarious aspirants to higher education is imperiled if we cannot help these students graduate. Our numbers haven't always been great, but a series of initiatives over the past few years may have started nudging the percentages in the right direction.
Many faculty members respond -- I have responded -- to attention to graduation rates in a couple of different ways: first, to blame others (the students!), and second, to assume that we will be asked to make the curriculum less rigorous. It sounds like an attack: How can you be doing your job if so few students finish?
But at a recent meeting about assessment, I learned the following tantalizing datum: Sixty-three percent of our full-time students who complete their first semester with a 3.0 or better grade-point average graduate within six years. When full-time students finish the first semester with a GPA below 2.0, only 9 percent graduate within six years.
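The computation behind such a datum is simple. Here is a hedged Python sketch, with invented field names, of how an institutional-research office might group full-time students by first-semester GPA band and compute six-year graduation rates; a real analysis would of course draw on registrar data rather than these illustrative dictionaries.

```python
# Hedged sketch with invented field names ("full_time", "first_sem_gpa",
# "graduated_in_6"); a real analysis would draw on registrar data.

def grad_rate(students, gpa_test):
    """Six-year graduation rate among full-time students whose
    first-semester GPA satisfies gpa_test; None if no one qualifies."""
    group = [s for s in students
             if s["full_time"] and gpa_test(s["first_sem_gpa"])]
    if not group:
        return None
    return sum(1 for s in group if s["graduated_in_6"]) / len(group)

def report(students):
    """The two bands the essay mentions: 3.0 or better, and below 2.0."""
    return {"3.0 or better": grad_rate(students, lambda g: g >= 3.0),
            "below 2.0": grad_rate(students, lambda g: g < 2.0)}
```

The output is exactly the kind of 63/9 split discussed below: two conditional graduation rates, one per first-semester GPA band.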
This sort of tracking, conceived and performed by experts in assessment and statistical analysis, ought to spur professors to think about their mission, about their individual courses, and about their institutions' political status in a state or system. What are we teaching our students? How can we convey to first-year students the seriousness of creditable habits? How can we discuss seriously with outside stakeholders the challenges posed by teaching adults?
For some time now, the great fetish of assessment gurus has been so-called "value-added" assessment: You can't just test what students know at the end of a semester or a program of study, because such a test can't discriminate between knowledge gained during the course and outside of it. Many professors and institutions use a combination of pre- and post-assessment as a kludge: "Here's what the students know at the start of the semester" and "Here's what they know at the end." This is a start, but it's still somewhat indirect, since improvement on such metrics doesn't always capture causal relationships.
The 63/9 percent statistic might call into question the value of pre- and post-assessments that aren't specifically about bodies of knowledge, since it suggests that differences in student performance arise from factors external to the particular class or course of study. The student with a 3.5 in her first semester doesn't need to be taught critical thinking; she is already an adept critical thinker, and will simply be refining that skill and adding to her base of knowledge. The student, by contrast, who struggles to achieve a 1.4 could very well improve -- we all know students who have done so, and perhaps some of us have even been that student. It's also possible that the student might have performed better on a different measure than grades. But it might also be the case that that student needs to pull away from college for a while. Perhaps she needs to try again in a semester when her childcare is more stable, or after she's saved up money, or after her father has weathered his major surgery. Or maybe she needs to come back after some time away, having reflected on what makes college success possible. (Again, some of us might have been this student.) Perhaps she needs to rethink whether college is, at present, as necessary to her career path as she believes. Is it right to aspire to keep all such students on campus at all costs? Could a low graduation or retention rate mean that the institution helps students make good long-term decisions, even if sometimes the decision is to put off higher education?
To put all of this slightly more directly: The consistency of outcomes from first semester to sixth-year graduation suggests that we need to take a deep breath and think about what we're doing. Blaming K-12 educators for delivering us poor students isn't very credible when, to a surprising extent, we simply validate their outcomes.
Surveys of student engagement repeatedly indicate that first-year students put in nothing like the mythical two to three hours of out-of-class preparation for each hour in class. Indeed, many students spend fewer hours studying outside of class than they spend in class during the week. The 63/9 split is relevant here: Do you pitch your course to those students who will do the work outside of class? ("Teaching to the six," as Michael Bérubé once called it.) Or do you try to make the course manageable by more students?
The split suggests that the latter strategy is a good example of the fallacy of good intentions. You can craft an intro course such that more students pass it, but such strategies smack of social promotion -- students not adept at managing college work in the first semester are going to continue to struggle. What's necessary instead is a pedagogy that bootstraps students into desired study habits. Technology can help: required posts to a class discussion board or blog, the use of social bookmarking tools to create a community of inquiry, the capacity of course management software to grade simple quizzes for you -- all of these things can help students learn how to prepare without necessarily sucking up vast quantities of time.
We can decry a generation brought up believing in the myth of multi-tasking (and that myth has done our students real harm), but unless we systematically design courses to inculcate sustained attention -- and then reward that attention by making class time intellectually meaningful -- we're not really contributing much beyond gripes and moans.
Assessment in college is different from assessment in elementary and secondary education, since college isn't mandatory. We control much less about our students than did the parents and teachers who have taught them (or not) over the previous 18 or more years. The choices of young adults drive their success far more than anything we offer.
It's true that legislators, tuition-payers, and future employers of our graduates have the right to demand effective teaching. But we can't teach students who are forced to work 35 hours while they're in college. We can't teach students who don't have access to affordable, reliable daycare. We can't teach students who have significant health concerns. The rhetoric of assessment is all too frequently pitched at whipping those tenured layabouts -- or, worse, tenured radicals -- into compliance. But turning any college into a legislators' paradise -- 5/5 teaching loads taught by contingent faculty -- won't have demonstrable results on student success. Effective assessment of colleges and universities needs to be thought of as promoting learning, not as disciplining the unruly faculty.
Many faculty are suspicious about assessment, whether for ideological reasons or because they perceive it as an unfunded administrative mandate. And faculty hear numbers, especially subpar numbers, as an indictment of their expertise or their empathy for students. I have reacted this way myself. Now, however, I try to remember that numbers are an opening salvo, not the final word: We've got a measurement -- how do we improve it? That number looks bad -- but what are its causes? Is the instrument measuring the right thing? Are we administering it in the best way? Are we making sure there's a tight fit between assessment measures and intended learning outcomes? Until we begin to think clearly -- both within departments and across schools, and even across peer institutions -- about what our students are up to, our own cultural position will continue to seem in crisis.
Jason B. Jones
Jason B. Jones is an associate professor of English at Central Connecticut State University. His book, Lost Causes: Historical Consciousness in Victorian Literature, was published in 2006 by Ohio State University Press. Online, he maintains a blog at The Salt-Box and contributes regularly to PopMatters and Bookslut.
At a time when postsecondary education is a requirement for an increasing number of U.S. jobs, community colleges provide broad access to higher education, enrolling nearly half of the nation’s undergraduates. But is access enough? Fewer than half of degree-seeking community college students achieve their goals. Do we want merely to get students to attend college, or are we committed to seeing them through to graduation?
One might think that states, in order to reap the economic benefits of a more educated workforce, would offer incentives for more students to complete their education. But most states link their support of community colleges to enrollment levels, not to student progress or success. Public funding rewards getting students into college, independent of whether any given student is achieving his or her educational goal or is on the road to dropping out.
Over the years, a number of states have experimented with financial incentives based on performance measures like graduation rates; but a newly approved program in Washington state takes a bold and different approach. The State Board for Community and Technical Colleges decided that institutions might be more motivated to improve performance by rewards for student progress past key “momentum points,” as well as for completion. Under the new plan, Washington will reward community and technical colleges for every student who achieves particular research-based benchmarks leading up to and including graduation.
Washington's community and technical colleges will receive extra money for students who earn their first 15 and first 30 college credits, earn their first 5 credits of college-level math, pass a pre-college writing or math course, make significant gains on certain basic-skills tests, or earn a degree or complete a certificate. Colleges also will be rewarded for students who earn a GED through their programs. All of these benchmarks are important accomplishments that help propel students forward on the road of higher education.
Washington State’s Student Achievement Initiative rewards its colleges for helping students continue moving forward regardless of where they start or how far they may be from attaining their educational goals. Successful students take many intermediate steps between enrollment and graduation, each accomplishment building a foundation for future success. Washington state’s plan recognizes the importance of supporting students as they achieve these intermediate milestones and rewards colleges for doing so. A student who is unable to pass a pre-college math course, for example, cannot continue on to college-level work, much less earn a degree.
We know there are key points along students’ educational journeys where they may be more likely to discontinue or postpone their studies. Students who are underprepared for college-level work are less likely to graduate than their peers who move directly into college classes, for example. However, an analysis of data from Achieving the Dream: Community Colleges Count, a national initiative to help more community college students succeed, shows that students who successfully completed any developmental course in their first semester were actually more likely than their peers to persist and succeed. Washington’s plan seeks to focus colleges’ attention on some of these key educational turning points and improve the odds of success at each step.
Knowing that the success of the Student Achievement Initiative depends upon buy-in at the institutional level, from CEOs down to classroom faculty, the State Board pursued an inclusive design process and is reaching out to every college in the state. During the design phase, presidents, trustees, business and civic leaders, faculty representatives and others -- both supportive and skeptical -- were consulted. In the current year, when the new system will be tested before full implementation, video conferences have been held with faculty members, administrators, and other staff at every college.
This incentive program is a good fit in Washington, which is among 15 states across the country participating in the Achieving the Dream initiative. Participating colleges make five specific commitments, which align well with Washington’s new benchmarks. The colleges pledge to increase the percentage of students who complete developmental courses, complete introductory college courses, complete any courses they take with a “C” or better, re-enroll from one academic term to the next, and earn certificates and degrees. For each commitment, colleges analyze data to measure their progress with support and guidance from the initiative.
Currently, six of Washington’s 34 community and technical colleges participate in Achieving the Dream and can serve as a learning laboratory for the entire system. The state’s incentive plan gives colleges the freedom to figure out how best to improve their students’ success rates, and being able to learn from peers who have already analyzed the effectiveness of various strategies will help them make more informed decisions.
Washington isn’t the only state where such an incentive system can work. With more than 80 participating colleges, Achieving the Dream provides an existing support network for efforts to improve student success rates. And offering student success incentives need not be confined to Achieving the Dream states. More states should implement similar programs, altering incentives in ways that will compel colleges to action. With so many students in community colleges and so many of today’s jobs requiring higher-level skills, it just makes sense.
George R. Boggs and Marlene B. Seltzer
George R. Boggs is president and CEO of the American Association of Community Colleges. Marlene B. Seltzer is president and CEO of Jobs for the Future. Both of their groups are among nine national organizations working together as part of Achieving the Dream: Community Colleges Count.