The New Diagnostics
About a week into any class at Rio Salado College, officials can make a pretty good guess as to which students will succeed and which ones will not.
The Arizona community college, where more than half of the 64,000 students pursue their degrees online, has devised a system of predictive modeling that officials believe can forecast, with 70 percent accuracy, how likely it is that a student will achieve a “C” grade or higher (the threshold for transferable credits) in a given course. The tool -- one of several of its kind -- is intended to help instructors identify at-risk students early enough to intervene.
“We’re trying to really understand the true behavior of the student based on reality,” says Adam Lange, the programmer analyst at Rio Salado who designed the system, “and then use that information to be able to make informed, data-driven decisions about how we can help students.”
At a time when higher education is increasingly taking place online (even when students are in a traditional classroom), colleges have more data on student engagement than ever before. Learning management systems are widely used, in online and classroom-based courses alike, as places where students interact with their professors, their course materials, and each other. But unlike traditional classrooms, these environments can keep a detailed log of everything that happens there, providing information for these new diagnostic tools.
“We’re dealing with a virtual mountain of data,” says Lange. “And a lot of these data are behavioral data… real-time data that comes from our LMS. This is really valuable information. It tells us a lot about the students.”
Such as when, and how frequently, students log into the course home page. Rio Salado uses more than two dozen metrics from that first week to predict how well a student stands to fare over the entire course, but some of the most effective are the most basic: Has the student logged into the course home page during the first week? Did she log in prior to the first day of class? Other predictive metrics, such as whether a student is taking other classes at the same time, whether she has been successful in previous courses, and whether she is retaking the course, are culled from the college's student information system.
The predictive modeling system uses these metrics to separate students into three color-coded categories: high-risk (red) students, medium-risk (yellow) students, and low-risk (green) students. After the first week, the instructors of each class are notified about the “yellow” students in their classes, so they can reach out to those students and try to get them on track. The college says it does not currently intervene in the cases of “red” students, citing limited resources (although officials there say they are working on developing a system to address the needs of those students).
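The article does not describe Rio Salado's actual model, but the pipeline it sketches -- first-week engagement metrics in, a color-coded risk tier out -- can be illustrated with a minimal logistic scoring function. The metric names, weights, and thresholds below are invented for illustration only:

```python
import math

def risk_score(logged_in_week_one, logged_in_before_start,
               logins_week_one, points_earned_ratio):
    """Estimate a student's probability of earning a C or better.

    A simple logistic model: a weighted sum of hypothetical first-week
    engagement metrics, squashed to (0, 1). The weights are illustrative
    stand-ins, not Rio Salado's actual coefficients.
    """
    z = (-2.0                               # baseline (intercept)
         + 1.5 * logged_in_week_one         # 1 if any week-one login, else 0
         + 0.8 * logged_in_before_start     # 1 if logged in before day one
         + 0.2 * logins_week_one            # count of week-one logins
         + 2.0 * points_earned_ratio)       # share of available points earned
    return 1.0 / (1.0 + math.exp(-z))

def risk_band(p_success):
    """Map a success probability to the article's red/yellow/green tiers."""
    if p_success < 0.4:
        return "red"      # high risk
    if p_success < 0.7:
        return "yellow"   # medium risk
    return "green"        # low risk
```

For example, a student with no activity at all lands in the red tier, a lightly engaged student in yellow, and an active student who has earned most available points in green. A production system would learn the weights from historical course outcomes rather than hand-pick them.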
Intervention methods vary, but faculty can make themselves available for questions and extra help, encourage students to check the course page more frequently, point them to tutoring and other support services, and even contact them by telephone, according to Shannon Corona, chair of the physical science faculty at Rio Salado.
What the professors don’t do is tell the students whether they have been labeled at-risk. "If we alert students directly, they may not know intuitively what they need to do to improve within our online learning environment," said Lange. "On the other hand, faculty can lead at-risk students down the right path and find the best strategies for each student."
Rio Salado differs in that respect from Purdue University, which has run a similar predictive modeling program since 2006 and does keep students in the loop. At an "actionable analytics" symposium last month, John Campbell, the associate vice president of Purdue’s advanced computing center, said “at-risk” students generally either took that information as a motivational kick in the rear or were prompted to quickly drop the class -- and were grateful either way. A double-blind study conducted during the first two years of Purdue's program, called Signals, revealed that 67 percent of students who learned they were in the middle- or high-risk categories were able to improve their grades.
On Thursday, SunGard Higher Education announced it is partnering with Purdue to market the Signals system to colleges everywhere.
Like Rio Salado, Capella University, a for-profit online university that has used a comparable system for the past three years, does not tell students about their risk status. Kim Pearce, the director of assessment and institutional research at Capella, says low national graduation rates suggest that students might not be vigilant enough to redirect themselves on their own. “I think the general national dissatisfaction with our graduation rates … is partially based on the idea that [students] are exclusively in charge of their own learning experience.” However, Pearce does predict that Capella will eventually start informing students when its computers forecast a bad outcome. Lange says Rio Salado will likely do the same. Neither institution has yet gathered enough data to quantify the effect of its interventions on student success.
Still, "We're confident enough in the modeling and the interventions that we're going to continue," Pearce says.
Rio Salado, Purdue, and Capella appear to be at the front end of what campus computing expert Kenneth C. Green this week called the “third phase” of e-learning: the point at which colleges and technology companies shift their attention toward finding ways to mine and utilize all the data created by interactions between professors and students on virtual learning platforms. That technology, Lange says, is changing the practice of predicting student success from instinct and generalization to genuine science.
“The knowledge of predictive modeling and of data-driven approaches just wasn’t out there, and now it’s just sort of creeping its way into higher ed, especially in distance learning,” he says.
“Online is a data-rich environment,” says Pearce. “It’s just a matter of time before everybody starts using the data that are available to them.”