The news that Purdue University likely overstated the impact of its early warning system, Course Signals, has cast doubt on the efficacy of a host of technology products intended to improve student retention and completion. In a commentary published in Inside Higher Ed, Mark Milliron responded by arguing that “next-generation” early warning systems use more robust analytics and are likely to get better results.
We contend that even with extremely robust and appropriate analytics, programs like Course Signals may still fall short if their adoption ignores the most pressing piece of electronic advising systems — their use on the front end, by advisers, faculty and students. Until more attention is paid to the messy, human side of educational technology, Course Signals — and other programs like it — will continue to show anemic impacts on student retention and graduation.
Over the past year, we have worked with colleges in the process of implementing Integrated Planning and Advising Systems (which include early warning systems like Course Signals). The adoption of early warning systems requires advisers, faculty and students to approach college success differently and should, in theory, refocus attention on how they engage with advising and support services. In practice, however, we have found that colleges consistently underestimate the challenge of ensuring that such systems are adopted effectively by end-users.
The concept of an early alert is far from new. In interviews, instructors and advisers have consistently reminded us that for years, students have received “early alert” feedback in the form of grades and midterm reports. Early warning systems may streamline this process, and provide the reports in a new format (a red light instead of a warning note, for example), but the warning itself isn’t terribly different.
What is potentially different about products like Course Signals is their ability to connect these course-level warnings to the broader student support services offered by the college. If early warning signals are shared across college personnel, and if those warnings serve to trigger new behaviors on their part, then we are likely to see changed student behavior and success. In other words, sending up a red light isn’t likely to influence retention. But if that red light leads to advisers or tutors reaching out to students and providing targeted support, we might see bigger impacts on student outcomes.
Milliron says, for example, that with predictive analytics, “student[s] might be advised away from a combination of courses that could be toxic for him or her.” But such advising doesn’t happen spontaneously: it requires advisers to be more proactive in preparing for and conducting each advising session. They must examine a student’s early warning profile, program plan and case file prior to the session; they must reframe how they present course choices to students; and they have to rethink what the best course combinations are for students with varying educational and career goals, as well as learning styles and abilities. Finally, they may have to link students to additional resources on campus — such as tutoring — and colleges need to ensure these services exist and are of high quality.
For this process to occur, advisers need to be well-versed in how to use the analytics, and be encouraged to move past registering students for the most common set of courses to courses that make sense for the individual. But because most colleges remain uncertain about the process changes that should occur when they adopt early warning systems, they are unable to provide the training that would help faculty and advisers make potentially transformative adjustments in their practice.
Even if colleges do adequately prepare faculty and advisers for this transition, there is much we still don’t know about how students will perceive and use the data and messages they receive from early warning systems. These unknowns may influence the extent to which the systems impact student outcomes.
For example, if students perceive early warnings as a reprimand rather than an opportunity to get help, they may ignore the signals or avoid efforts of college personnel to contact them. To anticipate and mitigate these kinds of potentially negative responses, it is important to understand how all students, not just those who use and enjoy early alert systems, experience and react to such signals. As Milliron notes, we need to figure out how to send the right message to the right people in the right way.
Early warning systems are only tools, and colleges will have to pay closer attention to changing end-user culture in order to maximize their effectiveness. Currently, colleges are skipping this step. At the end of the day, even the best system and the best data depend on people to translate them into actions and behaviors that can influence student retention and completion.
Melinda Mechur Karp is a senior research associate at the Community College Research Center at Columbia University's Teachers College. Also contributing to the essay were Jeff Fletcher, a senior research assistant, Hoori Santikian Kalamkarian, a research associate, and Serena Klempin, a research associate.
That college you have your eye on for your teenager? It may be going out of business. Your alma mater, too.
Here’s why: we keep seeing reports that the financial model undergirding much of higher education is weak and getting weaker. The way colleges are financed is out of step with the demands of a much larger student population. Few people outside higher education are aware of this, but college and university leaders are deeply concerned.
As director of the Postsecondary Success Strategy at the Bill & Melinda Gates Foundation, I have spent the last year talking with chancellors, provosts, faculty, policy makers, and education technologists. Pretty much all of them recognize that higher education is at a tipping point, and that it will soon look nothing like it does today, except perhaps at a few ivy-covered, well-endowed institutions.
This is not hyperbole.
Bain & Co. looked at hundreds of colleges and universities and found that about one in three is on an unsustainable financial track. As the report put it: “A growing percentage of our colleges and universities are in real financial trouble. And if the current trends continue, we will see a higher education system that will no longer be able to meet the diverse needs of the U.S. student population in 20 years.”
The report found that, at a time when college revenues and cash reserves are down, too many institutions face bigger debt service bills and ever-increasing expenses. Colleges were once able to make ends meet with annual tuition hikes, new fees and by securing more government support. Those days, though, are gone. Too many students now must borrow heavily just to keep pace with tuition increases, and government coffers are bare.
Last summer Inside Higher Ed and Gallup surveyed campus chief financial officers on their thoughts on the sustainability of their higher education institutions. Only 27 percent of them expressed strong confidence in their institution's financial model over the next five years. When asked to consider a 10-year window, the number expressing strong confidence in the financial health of their institutions dropped to 13 percent.
Improvement is needed on the academic side, as well. Data show that our higher education system currently serves only about a third of students well, and most of those come from generally well-off families. Institutions of all types — two-year, four-year, public, private and online — need to adapt to the realities of today’s students even as they grapple with shrinking resources and increasing demand.
Only one student in four graduates from high school ready to succeed in a postsecondary program. Too many of the rest end up stuck in remedial programs that drain their resources and don’t prepare them to successfully complete postsecondary coursework.
Many of these students are from low-income families, or they are older, juggling life, jobs, and family as they pursue their educations. They are often first-generation college-goers who lack the support and guidance crucial to navigating the thicket that is higher education. As a result, too many students end up leaving college with a lot of debt but no degree.
We used to call these students “nontraditional.” Now they are the “new majority.” And their struggles were highlighted recently in data released by the Organization for Economic Cooperation and Development that showed U.S. adults have below average literacy, math and problem solving skills when compared to their peers in the world’s richest countries. We have to make the system better for these students — but how?
Technology is often looked to as an answer. Yet it has to be more than bolting new technology onto an antiquated platform. Technology-driven innovation has the potential to help colleges and universities address some of these challenges by enabling faculty to offer students more personalized instruction and academic support. Done thoughtfully and well, technology can help faculty provide a more personalized learning experience for their students and ease some of the financial pressure on colleges and universities.
Today’s students need highly personalized coaching, mentoring, and other supports tailored to their individual needs and goals. Technology holds huge promise for making this kind of personalization possible by enabling colleges to effectively target the most costly and most important aspect of any education – the interactions with instructors and advisors.
Too often, we are debating the wrong things about technology and higher education. For example, we can’t just compare online classes with in-person ones. We need new business models that include technology and allow colleges and universities to put scarce dollars where they matter most. For today’s student, what can make a big, positive difference is access to an education tailored to their needs, their learning styles, and their goals, with appropriate coaching and advising.
Look at the State University of New York, which plans to add 100,000 new students over the next three years through its Open SUNY initiative. It will make online classes at each of its 64 campuses available to all of the system’s 468,000 students. Personalization will be an important part of the initiative, combining on-site and online academic support. Arizona State University, for its part, combines face-to-face learning, hybrid classes, and online instruction to increase enrollments, even as it faces severe physical space limits.
The cause is urgent. For higher education to fulfill its historic role as an engine of social mobility and economic growth, we must continue to seek big technology breakthroughs. This means thinking creatively about how to serve students as individuals, while also ensuring that many more students get the learning opportunities they deserve.
This might sound paradoxical, but investments in education technology will be increasingly crucial to humanizing and improving the student experience. And it might just keep your alma mater – or your child’s future alma mater – in business, and more purposeful and student-centered than ever.
Dan Greenstein is the director of postsecondary success at the Bill & Melinda Gates Foundation. Follow him on Twitter: @dan_greenstein.
Signals has had a rough few months. Blog posts, articles, and pointed posts on social media have recently taken both the creators and promoters of the tool to task for inflated retention claims and for falling into common statistical traps in making those claims. Some of the wounds are self-inflicted — not responding is rarely received well. Others, however, are misunderstandings of the founding goals and the exciting next phases of related work.
Signals is a technology application originally created by a team at Purdue University that uses a basic rules-based set of predictions and triggers — based on years of educational research and insight from the university's faculty — and combines them with the real-time activity of students in a given course. It then uses a “traffic light” interface that sends students a familiar kind of message:
Green light: you’re doing well and on the right track.
Yellow light: you’re a little off-track, you might want to think about x,y, or z or talk with someone.
Red light: you’re in trouble. You probably need to reach out to someone for help.
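The rules-based logic described above can be sketched as a simple classifier. This is a hypothetical illustration only: the inputs (`grade_pct`, `logins_per_week`, `assignments_missed`) and thresholds are assumptions, not Purdue's actual rules.

```python
# Hypothetical sketch of a rules-based "traffic light" early warning,
# in the spirit of Signals. All thresholds are illustrative assumptions,
# not the actual Purdue model.

def signal_color(grade_pct, logins_per_week, assignments_missed):
    """Combine simple course-activity rules into a risk score,
    then map the score onto a traffic-light color."""
    risk = 0
    if grade_pct < 70:
        risk += 2          # failing-range grade is the strongest signal
    elif grade_pct < 80:
        risk += 1
    if logins_per_week < 2:
        risk += 1          # low engagement with the course site
    if assignments_missed > 1:
        risk += 1
    if risk >= 3:
        return "red"       # in trouble: reach out for help
    if risk >= 1:
        return "yellow"    # a little off-track
    return "green"         # doing well, on the right track

print(signal_color(85, 4, 0))  # green
print(signal_color(75, 1, 0))  # yellow
print(signal_color(60, 1, 2))  # red
```

The point of such a rule set is not statistical sophistication but a tight, legible feedback loop; the critique later in the essay is precisely that these blunt, one-size-fits-all thresholds should give way to personalized predictive models.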
These same data are also shared with faculty and administrators so they can reach out to those who need specific support in overcoming an academic challenge or just extra encouragement. Signals is now part of the services offered by Ellucian, but just about all the major players in education technology offer some version of "early warning" applications. In our insight and analytics work, a number of colleges and universities are piloting related apps; however, the promise and problems of Signals are an important predicate as we move forward with that work.
The Signals app project began with a clear and compelling goal: to allow students, faculty, and advisers access to data that might help them navigate the learning journey. For too long, the key data work in education has been focused on reporting, accreditation, or research that leads to long reports that few people see and are all too often used to make excuses, brag, blame, or shame. More problematic, most of these uses happen after courses are over or worse, after students have already dropped out.
The Signals team was trying to turn that on its head by gathering some useful feedback data we know from research may help students navigate a given course, and to give more information to faculty and administrators dedicated to helping them in the process. The course-level outcomes were strong. More students earned As, fewer got Fs, and the qualitative comments made it clear that many students appreciated the feedback. A welcome “wake-up call,” many called it.
John Campbell, then the associate vice president of information technology at Purdue, was committed to this vision. In numerous presentations he argued that “Signals was an attempt to take what decades of educational research was saying was important — tighter feedback loops — and create a clean, simple way for students, faculty, and advisers to get feedback that would be useful.”
Signals was a vital, high-profile first step in the process of turning the power of educational data work toward getting clean, clear, and useable information to the front lines. It was a pioneer in this work and should be recognized as such. The trouble is the conflation of this work with large-scale retention and student success efforts. Claiming a 21 percent long-term retention lift, as some at Purdue have, is a significant stretch at best. However, Signals has shown itself to be a useful tool to help students navigate specific courses, and for faculty and staff striving to support them. And while that will likely be useful in long-term retention, there is still much work to be done to both bring Signals to the next level of utility in courses and to test its impact on larger student success initiatives.
First, as Campbell, now CIO at West Virginia University, notes, Signals has to truly leverage analytics. In our recent conversation he posited, “The only way to bring apps like Signals to their full potential, to bring them to scale, to make them sustainable is through analytics.” Front-line tools like Signals have to be powered by analyses that bring better and more personalized insight into individual students based on large-scale, consistently updated, student-level predictive models of pathways through a given institution. Put simply, basing the triggers and tools of these front-line systems on blunt, best-practice rules is not personalized, but generalized. It’s probably useful, but not optimal for that individual student. There needs to be a “next generation” of Signals, as Campbell notes, one that is more sophisticated and personalized.
For example, with a better understanding of the entire student pathway derived from analytics anchored on individual-level course completion, retention, and graduation predictions, a student who was highly likely to struggle in a given course from day one — e.g., a student having consistent difficulty with writing-intensive courses who is trying to take three simultaneously — might be advised away from a combination of courses that could be toxic for him or her. By better balancing the course selection, the problem — which would not necessarily be the challenge of a given course — could be solved before it begins. In addition, an institution may find that for a cluster of students standard “triggers” for intervention are meaningless. We’ve seen institutions that are serving military officers who have stellar completion and grade patterns over multiple semesters; however, because of the challenges of their day jobs, regular attendance patterns are not the norm. A generalized predictive model that pings instructors, advisers, or automated systems to intervene with these students may be simply annoying a highly capable student and/or wasting the time of faculty and advisers who are pushed to intervene.
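The military-officer example can be made concrete: a generalized trigger applies one attendance threshold to everyone, while a personalized trigger compares a student's current behavior to their own successful baseline. The sketch below is a hypothetical illustration; the inputs, the 0.5 drop-off factor, and the 3.0 GPA cutoff are all invented for the example.

```python
# Hypothetical sketch contrasting a generalized trigger with a
# personalized one. All thresholds are illustrative assumptions.

def generalized_flag(logins_per_week, threshold=3):
    """Blunt rule: the same attendance threshold for every student."""
    return logins_per_week < threshold

def personalized_flag(logins_per_week, past_logins, past_gpas):
    """Flag only when activity drops well below the student's own
    historical pattern, unless that pattern already predicted success."""
    baseline = sum(past_logins) / len(past_logins)
    succeeding = sum(past_gpas) / len(past_gpas) >= 3.0
    if succeeding and logins_per_week >= 0.5 * baseline:
        return False  # irregular attendance, but normal for this student
    return logins_per_week < 0.5 * baseline

# A high-performing officer who has always logged in sparsely:
officer = dict(logins_per_week=1,
               past_logins=[1, 2, 1, 2],
               past_gpas=[3.8, 3.9, 3.7])

print(generalized_flag(officer["logins_per_week"]))  # True: a needless ping
print(personalized_flag(**officer))                  # False: no intervention
```

The design point is that the personalized rule conditions on the individual's own history, so a stellar student with an unusual schedule stops generating interventions that waste both the student's and the adviser's time.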
Second, these tools have to be studied and tuned to better understand and maximize their positive impact on diverse student populations. With large-scale predictive flow models of student progression and propensity-score matching, for example, we can better understand how these tools contribute to long-term student success. Moreover, we can do tighter testing on the impact of user-interface design.
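Propensity-score matching, mentioned above, can be sketched in a few lines: pair each student who used the tool with the non-user whose estimated propensity to use it is closest, then compare outcomes across the matched pairs. The data below is synthetic and purely illustrative; in practice the propensity scores would come from a model (for example, a logistic regression) fitted on student covariates.

```python
# Illustrative sketch of nearest-neighbor propensity-score matching.
# Each record is (propensity, retained): the estimated probability of
# using the early-warning tool, and a 0/1 retention outcome. The
# numbers are synthetic, invented for this example.

def match_pairs(treated, control):
    """Match each treated student to the control student with the
    closest propensity score (nearest neighbor, with replacement)."""
    pairs = []
    for p_t, y_t in treated:
        p_c, y_c = min(control, key=lambda c: abs(c[0] - p_t))
        pairs.append((y_t, y_c))
    return pairs

treated = [(0.8, 1), (0.6, 1), (0.4, 0)]                # tool users
control = [(0.75, 0), (0.55, 1), (0.35, 0), (0.9, 1)]   # non-users

pairs = match_pairs(treated, control)
# Average treatment effect on the treated: mean outcome difference
# across matched pairs.
att = sum(y_t - y_c for y_t, y_c in pairs) / len(pairs)
print(round(att, 3))  # 0.333
```

Matching on propensity rather than comparing raw user and non-user averages is what guards against the selection-bias trap the essay alludes to: students who opt into a tool like Signals may differ systematically from those who do not.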
Indeed, we have a lot to learn about how we bring the right data to the right people – students, faculty, and advisers — in the right way. A red traffic light flashing in the face of a first-generation student that says, “You are likely to fail” might be a disaster. It might just reaffirm what he or she feared all along (e.g., “I don’t belong here”) and lead to dropping out. Is there a better way to display the data that would be motivating to that student?
The chief data scientist at our company, David Kil, comes from the world of health care, which has learned hard lessons about lifespan analysis and rapidly testing interventions. He points out the importance of knowing both when and how to intervene. Health care has also learned that sometimes data is best brought right to the patient in an app or even an SMS message; other times the message is better sent through nurses or peer coaches; and still other times a conversation with a physician is the game changer. Regardless, testing each intervention for interface and impact on unique patient types, and for its effect on long-term health, is a must.
The parallel in education is clear: Signals was an important first step to break the data wall and bring more focus to the front lines. However, as Campbell notes, if we want these kinds of tools to become more useful, we need to design them with triggers and tools grounded in truly predictive models and create a large-scale community of practice to test their impact and utility with students, faculty, and advisers – and their long-term contribution to retention and graduation. Moreover, as Mike Caulfield notes, technology-assisted interventions need to be put in the larger context of other interventions and strategies, many of which are deeply personal and/or driven by face-to-face work in instruction, advising, and coaching. Indeed, front-line apps at their best might make the human moments in education more frequent, informed, and meaningful. Because, let’s be clear about it, students don’t get choked up about apps that changed their lives at graduation.
Mark Milliron is Civitas Learning's chief learning officer and co-founder.