Knowing how often college students log onto learning management software is one of the best ways to predict whether they will stick with their studies or drop out.
That finding, which comes from a trove of data collected by Civitas, an education technology company that does predictive data analytics, might seem like common sense: students who don’t do their course work are less likely to graduate.
But engagement data from learning management systems (LMS), said officials at colleges that are Civitas clients, can be sliced and diced to predict much more precisely which students are likely to struggle, giving colleges information they can act on.
Strayer University hired Civitas three years ago. Joe Schaefer, the chief technology and innovation officer for the large for-profit chain, said the university had previously relied on the standard metrics most colleges use to predict student success, such as grade point averages, scores on standardized tests, demographics, academic standing and whether students attend college full time or part time.
“Student engagement trumps everything, by far,” said Schaefer.
Over all, Civitas said that across a sample of 600,000 students at 23 institutions, engagement data accounted for two of the top 10 predictors of first-year retention. Sometimes it was the No. 1 predictor.
At one research university, which Civitas did not identify, about 88 percent of students remained enrolled after their first year, Civitas said. But the persistence rate dropped to 76 percent for students who interacted with the learning management system on fewer than five days during the first two weeks of the term, versus 92 percent for students with five or more days of activity during that period. The rate fell to 48 percent, meaning more than half would drop out, for students who used the software on no more than one day.
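In rough terms, that kind of breakdown can be reproduced from a simple LMS event log. The Python sketch below is purely illustrative: the column names, the two-week window and the day-count buckets are assumptions chosen to mirror the figures above, not Civitas's actual model.

```python
import pandas as pd

def persistence_by_early_activity(events: pd.DataFrame, outcomes: pd.DataFrame,
                                  term_start: str) -> pd.Series:
    """events: one row per LMS interaction (student_id, timestamp).
    outcomes: one row per student (student_id, persisted), where persisted is
    True if the student was still enrolled the following year.
    All column names here are assumptions for this example."""
    start = pd.Timestamp(term_start)
    window = events[(events["timestamp"] >= start) &
                    (events["timestamp"] < start + pd.Timedelta(days=14))]
    # Count distinct active days per student in the first two weeks.
    active_days = (window.assign(day=window["timestamp"].dt.date)
                   .groupby("student_id")["day"].nunique()
                   .rename("active_days").reset_index())
    merged = outcomes.merge(active_days, on="student_id", how="left")
    merged["active_days"] = merged["active_days"].fillna(0)
    # Buckets mirroring the thresholds described above: 0-1, 2-4 and 5+ active days.
    buckets = pd.cut(merged["active_days"], bins=[-1, 1, 4, 14],
                     labels=["0-1 days", "2-4 days", "5+ days"])
    # Share of students in each bucket who persisted to the next year.
    return merged.groupby(buckets, observed=True)["persisted"].mean()
```

Counting distinct active days, rather than raw clicks, keeps the signal comparable across students who use the LMS very differently.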
Colleges can use that level of specificity about which students are falling behind to reach out and offer support, such as a meeting with an academic adviser, said Laura Malcolm, vice president of product management at Civitas. And the behavioral data tend to be more telling than static predictors like a student’s background or GPA.
The data tell more than whether or not students log on to the LMS, Malcolm said, offering details such as whether they check the syllabus or participate in a discussion board. They also show how a student varies from his or her peers.
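One plausible way to turn those signals into something a model can use is to count each kind of event per student and express it relative to the cohort. The sketch below is an illustration only; the event-type labels are invented for the example and do not reflect Civitas's schema.

```python
import pandas as pd

def engagement_features(events: pd.DataFrame) -> pd.DataFrame:
    """events: (student_id, event_type, timestamp), with event_type values
    such as 'login', 'syllabus_view' or 'discussion_post' (invented labels)."""
    # Raw counts of each event type per student.
    counts = (events.groupby(["student_id", "event_type"])
              .size().unstack(fill_value=0))
    # Standardize against the cohort so the features capture how a student
    # varies from his or her peers, not just raw totals.
    std = counts.std(ddof=0).replace(0, 1)  # avoid dividing by zero on constant columns
    zscores = (counts - counts.mean()) / std
    return counts.join(zscores.add_suffix("_z"))
```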
Even so, Malcolm said she was surprised by the consistency of the findings, which applied to both online and on-ground programs (not surprisingly, LMS engagement is a somewhat stronger retention predictor online).
GPAs in particular appear to be a weaker predictor than engagement data, according to Civitas, which found that almost half of the students who dropped out, across a sample of two million students, had a GPA of 3.0 or higher.
Marie Cini, the provost and vice president for academic affairs at the University of Maryland University College, said the finely tuned data from Civitas on LMS engagement help to ensure students succeed. And UMUC starts tracking student behavior even before the first day of a term.
The university checks whether students are logging onto course material before the term begins, she said, as an early sign that they are starting to prepare.
“We can tell you on day zero, before classes start, which students are likely to succeed,” said Cini.
Schaefer said Strayer also pays particularly close attention to students at the beginning of a term.
“You need to catch them early,” he said. “We look at engagement of students relative to other students.”
The university has experimented with asking faculty members and coaches to reach out to students with low levels of engagement at various intervals as courses progress. The goal, Schaefer said, was to “constantly monitor the relative engagement, particularly the ones who change” during the term.
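A crude version of “engagement relative to other students,” checked at intervals, might look like the sketch below: rank each student's weekly activity within the cohort and flag anyone whose standing falls sharply. The week-by-week table, the drop threshold and the column layout are assumptions for illustration, not Strayer's or Civitas's actual method.

```python
import pandas as pd

def flag_declining_students(weekly_activity: pd.DataFrame,
                            drop_threshold: float = 0.25) -> list:
    """weekly_activity: rows = student_id, columns = week numbers,
    values = count of active days (or events) in that week."""
    # Percentile rank of every student within the cohort, week by week.
    percentiles = weekly_activity.rank(pct=True)
    # Compare the most recent week with each student's own earlier average.
    baseline = percentiles.iloc[:, :-1].mean(axis=1)
    latest = percentiles.iloc[:, -1]
    declining = latest < (baseline - drop_threshold)
    return weekly_activity.index[declining].tolist()
```

In a setup like this, the flagged list would be what gets handed to the faculty members and coaches doing the outreach.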
One Strayer project, which featured faculty interventions with students the university knew were falling behind based on Civitas-provided data, resulted in a 5 percent increase in class attendance, a 12 percent bump in students who passed the course and an 8 percent decrease in those who dropped the course.
The most successful approach, Schaefer said, was when faculty members reached out via phone, email or even video and sought to have “real and meaningful human conversations” with students. Asking, “Are you OK?” and “How can I help you?” seemed to make a difference, he said.
The LMS engagement information from Civitas doesn’t explain why a student is disengaged. It's just a signal, said Malcolm, but a valuable one.
“It’s a key early signal that they can use with students,” she said. “The more you focus on behavior, the more predictive it becomes.”