Essay looks at how early warning systems can better boost retention
Tech Alone Won't Cut It
The news that Purdue University likely overstated the impact of its early warning system, Course Signals, has cast doubt on the efficacy of a host of technology products intended to improve student retention and completion. In a commentary published in Inside Higher Ed, Mark Milliron responded by arguing that “next-generation” early warning systems use more robust analytics and are likely to get better results.
We contend that even with extremely robust and appropriate analytics, programs like Course Signals may still fall short if their adoption ignores the most pressing piece of electronic advising systems: their use on the front end, by advisers, faculty and students. Until more attention is paid to the messy, human side of educational technology, Course Signals and programs like it will continue to show anemic impacts on student retention and graduation.
Over the past year, we have worked with colleges in the process of implementing Integrated Planning and Advising Systems (which include early warning systems like Course Signals). The adoption of early warning systems requires advisers, faculty and students to approach college success differently and should, in theory, refocus attention on how they engage with advising and support services. In practice, however, we have found that colleges consistently underestimate the challenge of ensuring that such systems are adopted effectively by end-users.
The concept of an early alert is far from new. In interviews, instructors and advisers have consistently reminded us that for years, students have received “early alert” feedback in the form of grades and midterm reports. Early warning systems may streamline this process and deliver the feedback in a new format (a red light instead of a warning note, for example), but the warning itself isn’t fundamentally different.
What is potentially different about products like Course Signals is their ability to connect these course-level warnings to the broader student support services offered by the college. If early warning signals are shared across college personnel, and if those warnings serve to trigger new behaviors on their part, then we are likely to see changed student behavior and success. In other words, sending up a red light isn’t likely to influence retention. But if that red light leads to advisers or tutors reaching out to students and providing targeted support, we might see bigger impacts on student outcomes.
Milliron says, for example, that with predictive analytics, “student[s] might be advised away from a combination of courses that could be toxic for him or her.” But such advising doesn’t happen spontaneously: it requires advisers to be more proactive in preparing for and conducting each advising session. They must examine a student’s early warning profile, program plan and case file prior to the session; they must reframe how they present course choices to students; and they have to rethink what the best course combinations are for students with varying educational and career goals, as well as learning styles and abilities. Finally, they may have to link students to additional resources on campus, such as tutoring, and colleges need to ensure these services exist and are of high quality.
For this process to occur, advisers need to be well-versed in how to use the analytics, and be encouraged to move beyond registering students for the most common set of courses and toward courses that make sense for the individual. But because most colleges remain uncertain about the process changes that should occur when they adopt early warning systems, they are unable to provide the training that would help faculty and advisers make potentially transformative adjustments in their practice.
Even if colleges do adequately prepare faculty and advisers for this transition, there is much we still don’t know about how students will perceive and use the data and messages they receive from early warning systems. These unknowns may influence the extent to which the systems impact student outcomes.
For example, if students perceive early warnings as a reprimand rather than an opportunity to get help, they may ignore the signals or avoid efforts of college personnel to contact them. To anticipate and mitigate these kinds of potentially negative responses, it is important to understand how all students, not just those who use and enjoy early alert systems, experience and react to such signals. As Milliron notes, we need to figure out how to send the right message to the right people in the right way.
Early warning systems are only tools, and colleges will have to pay closer attention to changing end-user culture in order to maximize their effectiveness. Currently, colleges are skipping this step. At the end of the day, even the best system and the best data depend on people to translate them into actions and behaviors that can influence student retention and completion.
Melinda Mechur Karp is a senior research associate at the Community College Research Center at Columbia University's Teachers College. Also contributing to the essay were Jeff Fletcher, a senior research assistant, Hoori Santikian Kalamkarian, a research associate, and Serena Klempin, a research associate.