Assessment

CLA experiment focused private colleges' attention on assessment

Report finds that use of standardized measure of student learning drove curricular innovations at consortium of private colleges -- but produced little (so far) in measurable improvement.

Student veterans do better than peers when given support services

Student veterans, when given support services at their colleges, earn better grades and show higher retention than their peers, a study finds.

Essay: Focus student success efforts on what happens in the classroom

Over the past 20 years, if not more, colleges and universities, states and private foundations have invested considerable resources in the development and implementation of a range of programs to increase college completion. Though several of these have achieved some degree of success, most have not made a significant impact on college completion rates.

This is the case because most efforts to improve college completion, such as learning centers and first-year seminars, sit at the margins of the classroom and do not substantially improve students' classroom experience. Lest we forget, many students, certainly those in community college, commute to college and work and/or attend part-time. For them, if not for most students, the classroom is one, and perhaps the only, place where they meet with faculty and other students and engage in learning activities. Their success in college is built upon classroom success, one class and one course at a time. If our efforts do not reach into the classroom and enhance student classroom success, they are unlikely to substantially impact college success.

How, then, should colleges proceed? First and foremost, they must direct their actions to the classroom, especially for students in the first year, and build classrooms whose attributes enhance the likelihood that students will succeed academically.

Attributes of Effective Classrooms

What are the attributes of such classrooms? Generally speaking, they can be described by the terms expectations, support, assessment and feedback, and involvement. Unlike the attributes of students, these are within the grasp of institutions to modify if they are serious about enhancing student success.

Expectations 

Student classroom performance is driven, in part, by the expectations that faculty have for their students and that students have for themselves. Student success is directly influenced not only by the clarity and consistency of expectations, but also by their level. High expectations are a condition for student success; low expectations a harbinger of failure. Simply put, no one rises to low expectations. A faculty member’s expectations are communicated to students, sometimes implicitly, through syllabuses, assignments, grading metrics, course management sites, and conversations. Students quickly pick up what is expected of them in the classroom and adjust their behaviors accordingly. In this regard it is telling that evidence from the National Survey of Student Engagement indicates that beginning college students' expectations for the amount of work required for classroom success decline over the course of the first year.

Support

It is one thing to hold high expectations; it is another to provide the support students need to achieve them. At no time is support, in particular academic support, more important than during the critical first year of college, when student success is still so much in question and still malleable to institutional intervention. A key feature of such support is its being aligned, or contextualized, to the demands of the classroom, thereby enabling students to more easily translate the support they receive into success in the classroom. As applied to basic skills, for instance, contextualization creates explicit connections between the teaching of reading, writing, or mathematics, on one hand, and instruction in a subject area, on the other, as might occur when writing skills are taught with direct reference to material taught in a sociology class.

Assessment and Feedback

Students are more likely to succeed in classrooms that assess their performance and provide frequent feedback about their performance in ways that enable everyone -- students, faculty, and staff -- to adjust their behaviors to better promote student success in the classroom. Classroom assessment of student performance is particularly effective when it occurs early and is used to trigger the provision of academic support to those whose performance indicates the need for it. This is especially true during the first year, when students are trying to adjust their behaviors to the new academic and social demands of college life.

Involvement

A fourth, and perhaps the most important, attribute of effective classrooms is involvement, or what is now commonly referred to as engagement. Simply put, the more students are academically and socially engaged with faculty, staff, and peers, especially in classroom activities, the more likely they are to succeed in the classroom. Such engagements lead not only to social affiliations and the social and emotional support they provide, but also to greater involvement in learning activities and the learning they produce. Both lead to success in the classroom. As with assessment and feedback, involvement is particularly important early in the semester, as it helps to establish a pattern of student behaviors that further enhances student effort throughout the semester.

Efforts to Enhance Classroom Effectiveness

Though they are still limited in scope, there are now a number of efforts to reshape the classroom by altering the way academic support is provided, improving the usability of assessment and feedback techniques, and restructuring patterns of student engagement in the curriculum and classroom. Several of these deserve special attention, not only because of evidence that supports their effectiveness, but also because of their capacity to reshape the nature of classroom learning, and in turn enhance classroom success -- in particular, but not only, for those who enter college academically underprepared.

Contextualized Academic Support

Contextualized support can be achieved in a variety of ways. Perhaps the most common occurs when study groups are directly connected to a specific course, as they are in supplemental instruction.

In this case, leaders of the study groups work closely with the course instructor to ensure that the work of the group is closely aligned to the demands of the course. The result is that courses to which such groups are linked typically have higher average grades, if only because there are many fewer low grades. For some students who are just below college-level work, accelerated learning programs that link a college-level course to a study or basic skills course yield similar results.

These programs, such as the one at the Community College of Baltimore County, challenge the conventional assumption that basic skill instruction should precede the beginning of college-level work.

For other students who require additional academic skills, learning communities, such as those at the City University of New York's LaGuardia Community College, are being used to connect one or more basic skills or developmental courses, such as writing, to other content courses, such as history, in which the students are also registered. In other cases, they may include a student success or counseling course. In this and other ways, learning communities provide a structure that enables the institution to align its academic and social support for basic skills students in ways that allow students to obtain needed support, acquire basic skills, and learn content at the same time.

Contextualization can also occur through the integration of academic support within the classroom. The Washington State Board of Community and Technical Colleges developed the Integrated Basic Education and Skills Training (I-BEST) initiative, which enables students in technical and vocational courses to get academic support from basic skills instructors while earning credit toward a certificate or degree. This is achieved through the collaboration of basic skills instructors and faculty who jointly design and teach college-level technical and vocational courses. As a result, students learn basic skills and program content at the same time from a team of faculty. The result is that I-BEST students fare better on a variety of outcomes (e.g., credits earned, completion of workforce training) when compared with traditional students at the same proficiency level.

Automating Classroom Assessment, Feedback, and Early Warning

There are a variety of assessment techniques that can be used to assess student learning and trigger academic intervention when necessary. Classroom assessment techniques like the “one-minute” paper and the “muddiest point” described by Angelo and Cross have been in practice for decades. So have early warning systems that employ information on student performance to trigger intervention.

What is new is the availability of technologies that allow faculty to easily capture and analyze more and different data in ways that can provide a clearer view into student learning and automate previously time-consuming tasks whose effort often stymied efforts at wide adoption. The Signals project at Purdue University, for instance, employs predictive modeling and data mining of student performance on mini-exams and patterns of utilization of course materials on a web-based platform to identify students who are “at risk” of doing poorly in a course. Once these students are identified, the system sends alerts to faculty and then emails the students urging them to seek help via available resources such as office hours, study materials, and various academic support services. Though employed throughout the university, it has proven most effective for students in their first two years of coursework.
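The mechanics of such an early-warning system can be illustrated with a minimal sketch. The field names, thresholds, and scoring rule below are hypothetical illustrations, not Purdue's actual Signals model, which relies on proprietary predictive analytics over richer data:

```python
# Minimal early-warning sketch: flag students whose quiz average or
# course-site activity falls below illustrative cutoffs, then collect
# the names that should receive an alert. All field names and
# thresholds here are invented for illustration.

def at_risk(student, quiz_cutoff=70.0, min_logins=5):
    """Return True if a student should receive an early-warning alert."""
    quiz_avg = sum(student["quiz_scores"]) / len(student["quiz_scores"])
    return quiz_avg < quiz_cutoff or student["weekly_logins"] < min_logins

roster = [
    {"name": "A", "quiz_scores": [85, 90], "weekly_logins": 7},
    {"name": "B", "quiz_scores": [55, 62], "weekly_logins": 2},
]
alerts = [s["name"] for s in roster if at_risk(s)]
print(alerts)  # only student B is flagged
```

In a production system the flag would trigger notifications to the instructor and an email to the student pointing to office hours and support services, as the Purdue project does.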

Promoting Classroom Engagement

Faculty are moving not only to change the manner in which students experience the curriculum, as they do in learning communities, but also the way they experience learning. They do so by employing pedagogies of engagement, such as cooperative learning and problem- or project-based learning, that require students to work together in some form of collaborative group and to become active in, indeed responsible for, the learning of the group and their classroom peers. In this way, students share not only the experience of the curriculum, but also of learning within the curriculum.

By asking students to construct knowledge together, as they do at the University of Delaware and North Essex Community College, such pedagogies involve students both socially and intellectually in ways that promote cognitive and social development as well as an appreciation for the many ways in which one's own knowing is enhanced when other voices are part of that learning experience. As importantly, they enhance student effort and the heightened learning that follows.

Building Effective Classrooms: Enhancing Faculty Skills 

These strategies, especially those that employ pedagogies of engagement to enhance student classroom success, ultimately depend on the skills of the faculty to implement them effectively in class. Yet the faculty who teach those classes, unlike those who teach in primary and secondary schools, are often not trained to teach their students. This is not to say that there are not many talented college faculty who bring considerable skills to the task of teaching students. There are. Rather, college faculty are not, generally speaking, trained in pedagogy, curriculum, and assessment in ways that would enable them to be more effective in promoting the success of their students in the classrooms they teach, in particular but not only those who are academically underprepared.

Of course, colleges are not blind to the issue of faculty skills. For years they have invested in faculty development programs. Yet for all that investment, little change is apparent, if only because most programs are not well-conceived, are voluntary in nature, and/or attract a small segment of the faculty. Fortunately, this is beginning to change at a limited but growing number of colleges, such as Chandler-Gilbert Community College, Richland College, and Foothill College.

These institutions have established faculty development programs that require all new faculty to be part of a two-year or longer series of activities in which faculty, working together in what amounts to a faculty learning community, acquire pedagogical, curricular, and assessment skills appropriate to the needs of community college students, in particular those who require basic skills instruction. 

Closing Comment

Efforts to increase student success in college are not new. But most have not penetrated the classroom. Even when successful, they have been isolated, sometimes idiosyncratic, and often of limited duration. If we are serious in our efforts to enhance college success, much must change. Our students deserve no less. Our nation requires no less. It is time to take the classroom seriously.

Author: Vincent Tinto
Author's email: vtinto@syr.edu

Vincent Tinto is Distinguished University Professor at Syracuse University. His forthcoming book, Completing College: Rethinking Institutional Action (University of Chicago Press), discusses these issues.

U.S. panel's ideas for revamping higher ed accreditation

In draft report, U.S. accreditation panel stops short of recommending ending link between agencies' judgments and access to federal student aid, but suggests how such a system could work.

Ohio chancellor wants to end remedial education at public universities

Chancellor of Ohio's higher ed system wants to end remedial education at four-year universities. Critics say the policy could hurt minority and low-income students.

Moneycollege

If you’ve seen “Moneyball,” the new baseball film about the unlikely success of the Oakland A’s and their out-of-the-box-thinking general manager, Billy Beane, you may have already drawn parallels to the current state of higher education. If not, we’re pleased to do it for you.

Early in "Moneyball" there’s a funny scene of Billy sitting around a table with his scouts, wise old men of America’s pastime. The scouts jaw on about players’ arms, legs and bodies and their potential. One scout insists that an ugly girlfriend means that a player doesn’t have confidence. The scouts are entranced by the obvious. And when it comes to metrics, the scouts focus on what’s easy to measure. The scouts love high school pitchers: “High school pitchers had brand-new arms, and brand-new arms were able to generate the one asset scouts could measure: a fastball’s velocity,” Michael Lewis writes in the book on which the movie was based.

But Billy isn’t fooled. He decides to bring data to the table in the form of Peter Brand, a Yalie with an economics degree and a statistics-spewing laptop ready at hand.

It turns out that high school pitchers are much less likely to go on to successful major league careers than are comparable pitchers who have attended college. And when you try to correlate a range of statistics to runs scored, batting average is a poor indicator, whereas on-base percentage (OBP) is highly correlated. So Billy and the A’s eschew high school pitchers and focus on OBP; the A’s begin to value and acquire players with a knack for getting on base any way they can, especially by taking walks.

The result, chronicled in the entertaining film based on Lewis's book, is an unlikely group of major leaguers who, during the 2002 season, win 20 games in a row -- still a record -- and make the playoffs.

“My only question is if he’s that good a hitter, why doesn’t he hit better?”

-- Billy Beane

Like baseball 10 years ago, higher education is focused on what’s easy to measure. For baseball it may have been body parts, batting averages and the numbers on the radar gun. For higher education, it’s the 3Rs: research, rankings and real estate. Each of these areas is easily quantified or judged: research citations or number of publications in Nature and Science; U.S. News ranking (or colleges choose from a plethora of new entrants to the ranking game, including the international ranking by Shanghai Jiao Tong University); and in terms of real estate, how much has been spent on a new building and how stately, innovative and generally impressive it appears.

Unfortunately, the 3Rs correlate about as closely to student learning and student outcomes as batting average or fastball velocity, which is to say, not at all. Buildings are the “ugly girlfriend” of higher education.

Universities that continue to focus on the 3Rs in the wake of the seismic shifts currently roiling higher education (state budget cuts, increased sticker shock, technology-based learning) are either not serious about improving student learning and student outcomes, or they’re like the baseball fan who has lost her car keys in the stadium parking lot at night. Where does she look for them? Not where she lost them, but under the light because that’s where she can see.

“A young player is not what he looks like, or what he might become, but what he has done.”

-- Billy Beane

Similarly, a university is not what its buildings look like, or what its reputation or rankings say, but what it has done. And by done, we don’t mean research. The link between research and instructional efficacy is unproven at best. We define instruction of students to mean producing measurable outcomes in terms of student learning and employment.

The first step will be to get the data; before we find the Billy Beane of higher education, we first need to find Bill James. With his famous Baseball Abstract, Bill James revolutionized how data was tracked, and which metrics were most important to the success of teams and individual players. James jump-started a movement, called sabermetrics, that collected data that had never before been systematically collected: the pitch count at the end of at-bats, pitch types and locations, the direction and distance of batted balls.

A report issued last month by Complete College America, an organization funded by the Bill & Melinda Gates Foundation and the Lumina Foundation for Education, demonstrates just how ripe higher education is for sabermetrics. While the report was sobering in the data it did present (e.g., of every 100 students who enroll in a public college in Texas, 79 enroll in a community college -- of these 79, only seven have completed a program in four years’ time), more fundamental are the huge holes in the data – larger than the holes in the Houston Astros infield! According to Stan Jones, president of Complete College America, the data are incomplete because students who enroll part-time or who transfer are not tracked: “We know they enroll, but we don’t know what happens to them,” he said. “We shouldn’t make policy based on the image of students going straight from high school to college, living on campus, and graduating four years later, when the majority of college students don’t do that.”

“The great thing about college players: they had meaningful stats. They played a lot more games, against stiffer competition, than high school players. The sample size of their relevant statistics was larger, and therefore a more accurate reflection of some underlying reality. You could project college players with greater certainty than you could project high school players.”

-- Michael Lewis, Moneyball

How ironic that we may be doing a better job gathering baseball statistics at colleges than we are at gathering education statistics. It is essential that we begin to track persistence data on part-time and transfer students on a systematic basis. The Department of Education should lead this initiative. Failing that, Gates, Lumina and others undoubtedly will pick up the slack.

Just as the Moneyball approach has narrowed the gap between teams with $40 million payrolls and teams with payrolls three times higher (see, e.g., Tampa Bay Rays storming back in the month of September and taking the American League wild card berth away from Boston with a payroll of $41 million, 25 percent of the Red Sox payroll), finding and tracking the OBP of higher education will do the same for data-driven institutions of all stripes, including those that do not receive state subsidies, and those that pay taxes.

With the right data, dozens of would-be Billy Beanes will spring up across the country arguing what the on-base percentage equivalent for higher education is, coalescing on persistence and completion metrics that are meaningful for all students (i.e., traditional/adult, full-time/part-time, on-ground/online) and helping their institutions reform and restructure to increase “wins.”

Completion Rates in Context

Much attention has been directed at college completion rates in the past two years, since President Obama announced his goal that the United States will again lead the world with the highest proportion of college graduates by 2020. The most recent contribution to this dialogue was last month’s release of "Time Is the Enemy" by Complete College America.

Much in the introduction to this report is welcome. Expanding completion rate reporting to include part-time students, recognizing that more students are juggling employment and family responsibilities with college, acknowledging that many come to college unprepared for college-level work -- such awareness should inform our policy choices. All in higher education share the desire expressed by Complete College America that more students complete their programs, and do so in less time.

The graduation rates for two-year institutions included in "Time Is the Enemy" show, however, just how inadequate our current measures are for assessing community college student degree progress -- a shortfall also acknowledged by the appointment of the federal Committee on Measures of Student Success, which is charged with making recommendations to the U.S. education secretary by April. Our current national completion measures for community colleges underestimate the true progress of students, presenting a misleading picture of the performance of these open-admissions institutions.

The following suggestions might inform a new set of national metrics for assessing student performance at two-year institutions.

Completion Rates for Community Colleges Should Include Transfers to Baccalaureate Institutions. Although community colleges usually advise students aiming for a bachelor’s degree to complete their associate degree before transferring, to reap the benefits of additional tuition savings and attain a credential, transferring before attaining the associate degree is, for many students, a rational decision. Accepting admission and assimilating into competitive baccalaureate programs and institutions, establishing mentorships with professors in the intended baccalaureate major, or embracing the residential college experience may all lead students to transfer before completing the associate degree. In addition, for a variety of reasons, universities may delay admission of incoming freshmen to the spring semester and advise them to start in the fall at a community college. These students are not seeking degrees at the community college, and will transfer after one semester. Thus, for two-year institutions, preparing students for transfer to a four-year institution should be considered an outcome as favorable as a student earning an associate degree.

The appropriate completion measure for community colleges is a combined graduation-transfer rate. The preferred metric is the percentage of students in the initial cohort who have graduated and/or transferred to a four-year institution. It is important to include transfers to out-of-state institutions in these calculations. In Maryland, a fourth of the community college transfers to baccalaureate institutions enroll in colleges and universities outside of Maryland. Reliance on state reporting systems that do not utilize national databases such as the National Student Clearinghouse to report this metric results in serious underestimates of student success. The need to track transfers across state lines is a major reason for the so-far-unsuccessful push for a national unit record system.
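The combined graduation-transfer rate described above can be sketched in a few lines. The records and field names are invented for illustration; a real calculation would draw on a national database such as the National Student Clearinghouse to capture out-of-state transfers:

```python
# Combined graduation-transfer rate: the percentage of students in the
# initial cohort who graduated and/or transferred to a four-year
# institution. A student who both graduates and transfers is counted
# once. Records below are illustrative.

def grad_transfer_rate(cohort):
    """Return the combined graduation-transfer rate as a percentage."""
    successes = sum(1 for s in cohort if s["graduated"] or s["transferred"])
    return 100.0 * successes / len(cohort)

cohort = [
    {"graduated": True,  "transferred": False},
    {"graduated": False, "transferred": True},   # transfer counts as success
    {"graduated": False, "transferred": False},
    {"graduated": True,  "transferred": True},   # counted once, not twice
]
print(grad_transfer_rate(cohort))  # 75.0
```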

Comparisons of completion rates at community colleges and four-year institutions, where transfer is not included in the community college measure, are inappropriate. Reports such as "Time Is the Enemy" that report graduation rates for community colleges, with table labels such as “Associate Degree-seeking Students,” are misleading in that these calculations include many students who are pursuing baccalaureate transfer programs with no intention of earning the associate.

Completion Rate Calculations Should Exclude Students Not Seeking Degrees. Community colleges serve many students not seeking a college degree, and these students should be excluded from the calculation of completion rates. A student’s stated intent at entry is not adequate to identify degree-seekers, since students may be uncertain about their goals and goals may change. Enrollment in a degree program is not adequate, since students without a degree goal must declare a program in order to be eligible for financial aid, and many colleges force students to choose a major in order to gauge student interest for advising purposes.

A better way to define degree-seeking status is based on student behavior. Have students demonstrated pursuit of a degree by enrolling in more than two or three classes? A minimum number of attempted hours is the preferred way of defining the cohort to study. In Maryland, to be included in the denominator of graduation-transfer rates, a student must attempt at least 18 hours within two years of entry. Hours in developmental or remedial courses are included. This way of defining the cohort has several benefits. It does not exclude students beginning as part-time students, as IPEDS does. It eliminates transient students with short-term job skill enhancement or personal enrichment motives. By using attempted hours as the threshold, rather than earned credits as in some other states, this definition does not bias the sample toward success. Students who fail all their courses and earn zero credits will still be in the cohort if they have attempted 18 hours. And finally, it seems reasonable that students show some evidence of effort to persist if institutions are to be held accountable for their degree attainment.
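The Maryland-style, behavior-based cohort definition described above amounts to a simple filter: a student enters the denominator only after attempting at least 18 hours within two years of entry, regardless of credits earned. A minimal sketch, with invented records:

```python
# Behavior-based cohort definition (Maryland-style): include a student
# in the graduation-transfer denominator only if she attempted at least
# 18 hours within two years of entry. Attempted hours -- not earned
# credits -- define the cohort, so a student who fails every course
# still counts, and the sample is not biased toward success.
# Records below are illustrative.

THRESHOLD_HOURS = 18

def in_cohort(student):
    """Return True if the student belongs in the completion-rate cohort."""
    return student["hours_attempted_first_two_years"] >= THRESHOLD_HOURS

students = [
    {"id": 1, "hours_attempted_first_two_years": 24, "credits_earned": 0},   # in: failed all courses, still counted
    {"id": 2, "hours_attempted_first_two_years": 6,  "credits_earned": 6},   # out: transient, short-term enrollment
    {"id": 3, "hours_attempted_first_two_years": 18, "credits_earned": 15},  # in: part-time starter, not excluded
]
cohort_ids = [s["id"] for s in students if in_cohort(s)]
print(cohort_ids)  # [1, 3]
```

Note that, unlike IPEDS, this definition does not exclude students who begin part-time; it only screens out transient enrollees.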

Recognize that Community College Students Who Start Full-time Typically Do Not Remain Full-time. A number of studies suggest that the majority of community college students initially enrolling full-time switch to part-time attendance. This contrasts with students at most four-year institutions, who start and remain full-time. For example, 52 percent of students at community colleges that participate in the Achieving the Dream project began as full-time students. Yet only 31 percent attended full-time for the entire first year. Studies of Florida’s community colleges find similar results. Most students end up with a combination of full-time and part-time attendance, regardless of their initial status. Among students enrolled at least three additional semesters, only 30 percent of Florida’s “full-time” community college students enrolled full-time every semester. As a Florida College System report concludes, “Expecting a ‘full-time’ student to complete an associate degree in two years or even three assumes that the student remains full-time and this is most often not the case. As a result, students will progress at rates slower than assumed by models that consider initial full-time students to be full-time throughout their time in college.” Thus, comparisons of completion rates at two-year and four-year institutions, even controlling for full-time status in the first semester, are misleading. Studies at my college suggest that completion rates of community college students who start full-time and continuously attend full-time without interruption are comparable to completion rates attained at many four-year institutions.

Extend the Time for Assessing Completion to at least Six Years. “Normal Time” to completion excludes most associate degree completers. Due to part-time attendance, interrupted studies, and the need to complete remedial education, most associate degree graduates take more than three years to complete. Completion rates calculated at the end of three or four years will undercount true completion. It is not uncommon for a third of associate degree completers to take more than four years to complete their degree. At my institution, fully 5 percent of our associate degree recipients take 10 or more years to complete their “two-year” degree. These students are not failures; they are heroes. Yes, we would all like students to complete their degrees more quickly. But if life circumstances dictate a slower pace, let us support these students in their remarkable persistence. And, in our accountability reporting, recognize that our completion rate statistics are time-bound and fail to account for all who will eventually succeed in their degree pursuit.

When Comparing Completion Rates, Compare Institutions with Similar Students. Differences in completion rates among institutions largely reflect differences in student populations. Community college students who are similar to students at four-year institutions in academic preparation, and in their ability to consistently attend full-time, achieve completion rates comparable to those at many four-year institutions. In Maryland, if you include transfer as a community college completion, community colleges have four-year completion rates equal to or higher than the eight-year bachelor’s degree graduation rates at a majority of the state’s four-year institutions with open or low-selectivity admissions. And the completion rate of college-ready community college students -- those not needing developmental education -- is similar to all but the most selective four-year schools. At my college, 88 percent of the students in our honors program have graduated with an associate degree in two years. This graduation rate is comparable with that of Johns Hopkins and above that of the flagship University of Maryland at College Park.

Students at four-year institutions who are similar in profile to the typical community college student have completion rates similar to those attained at community colleges. This is not a new finding. A March 1996 report, "Beginning Postsecondary Students: Five Years Later," identified the following “risk factors” affecting bachelor’s degree completion: delayed enrollment in higher education, being a GED recipient, being financially independent, having children, being a single parent, attending part-time, and working full-time while enrolled. Fifty-four percent of the students who had none of these risk factors earned the bachelor’s degree within five years. The graduation rate for students with just one of these risk factors fell to 42 percent. For students with two risk factors the bachelor’s degree graduation rate was 21 percent, and for those with three or more the graduation rate was 13 percent.

Readers of this essay who work at community colleges are probably smiling to themselves. For most community colleges, the majority, if not the overwhelming majority, of students are coping with several of these risk factors. And this list does not account for the need of most community college students for developmental or remedial education. The comparability of completion rates at two- and four-year institutions, when student characteristics are controlled for, should not be a surprising finding.

If we must compare completion rates, it is incumbent upon analysts to account for differences in the academic preparation and life circumstances of student populations. This can be done by sophisticated statistical analysis, or in the selection of peer groups of institutions with similar admissions policies and student body demographics.

Support Hopeful Signs at the Federal Level. The work to date of the Committee on Measures of Student Success authorized by the Higher Education Act of 2008 is encouraging. The committee is to make recommendations to the Secretary of Education by April 2012 regarding the accurate reporting of completion rates for community colleges.

A number of the recommendations in the committee’s draft report issued September 2, 2011 would greatly improve reporting of completion statistics for community colleges:

  • Defining the degree-seeking cohort for calculating completion rates by looking at student behavior, such as a threshold number of hours attempted.
  • Recognizing that “preparing students for transfer to a four-year institution is an equally positive outcome as a student earning an associate’s degree.”
  • Reporting a combined graduation-transfer rate as the primary outcome measure for degree-seeking students.
  • Creating an interim, persistence measure combining lateral transfer with retention at the initial institution.

These recommendations show an understanding of the student populations served by community colleges. Inclusion of these definitions and measures in federal IPEDS reporting would provide more meaningful peer, state, and national benchmarks for all community colleges.

Author: Craig A. Clagett
Author's email: newsroom@insidehighered.com

Assuring Civility or Curbing Criticism?

Higher ed research group calls off panel that would have focused on controversial issue of its journal that featured articles questioning student engagement surveys.

Too Many Rules

A federal panel asking whether Higher Education Act regulations are burdensome got an earful.
