Chanit'a Holmes, pictured here, conducted two studies to investigate the best ways to deliver academic notifications, or nudges, to students. (Felicia Spencer / Virginia Tech)

Do academic alerts actually motivate students to step up their performance? Or do they just make them feel stressed out? New research by a Virginia Tech professor indicates the results of such “nudging” might be mixed.

The research, which has not yet been published, is based on two studies: one examined the effects of emailing feedback to students, and the other evaluated texting students about their assignments and deadlines. Both found that students who received some form of communication did better on assignments early in the semester than their classmates, but that the nudged students' performance fell in line with that of their un-nudged peers as the course progressed.

The studies come at a time when colleges are increasingly working to improve student engagement, which some professors say took a nosedive during the pandemic and has yet to recover. Although academic performance notifications existed prior to COVID-19, the pandemic accelerated their adoption as an engagement strategy, according to Ed Venit, managing director at the education consulting firm EAB, who has worked with colleges to implement these systems.

At the same time, various other types of nudges used by colleges, such as notifications encouraging students to apply for financial aid, have proven ineffective.

Chanit’a Holmes, an assistant professor of agricultural and applied economics at Virginia Tech and the researcher behind the studies, said a handful of factors may have led to this decline in student grades. In the initial experiment, Holmes emailed feedback to students in her microeconomics course after each assignment, including information about how each student’s score ranked against the rest of the class. She wondered whether the decline in grades was rooted in students’ disappointment that their ranking did not improve even when they put in extra effort.

“They maybe stop putting in effort or just say, ‘OK, this is where I’m at in the class and it is what it is,’” she said.

The second study was much larger: 1,500 students in courses taught by four different professors, including Holmes, participated. The students received either no academic notifications, text messages that they could not respond to, or text messages that they could respond to, sent by one of 35 volunteer peer coaches who had already taken the course.

While some students who participated in the larger study reacted positively and said the messages were motivating, others found them too repetitive, impersonal and even stress-inducing, saying the texts seemed to put additional pressure on them to do well in class. A small percentage of students, about five percent, dropped out of the study and stopped receiving notifications altogether.

Holmes speculated that these frustrations may have led students to stop reading or caring about the texts, which could have influenced the drop in grades mid-semester.

She believes those complaints can be addressed with future experimentation on the most effective frequency at which to message students.

Venit had another theory for why some students disliked the text system: the actual written content of the messages may not have been conducive to a “growth mindset.”

One example of the type of message that may have been sent to a student asked the student to come to Holmes’s office to receive additional feedback. Venit said some students might view that negatively and interpret it as a sign they were in trouble for their poor academic performance.

“The messaging really matters if you’re trying to get someone to respond positively,” he said.

He believes college administrators and professors should look at marketing strategies used in other fields, such as medicine, for ideas on how to make academic alerts or other higher ed-related nudges feel more encouraging to students.

“What are doctors saying to patients to get them to take care of themselves?” he asked. “In the end, we obviously want a student to both learn a lot and have a good feeling about their experience.”

Prior Research

This isn't the first time researchers have tried to better understand what makes an effective nudge.

Kim Manturuk, a former researcher at Duke University’s Teaching and Learning Innovation Lab who studied nudging, argued in a 2019 column, “Reclaiming the Nudge,” that the term is often misused by higher ed professionals who apply it to “virtually any intervention designed to create incremental behavior change.”

“In other words, a nudge isn’t something you realize you’re getting, and the result isn’t something you think too much about. A nudge happens in the background of your daily life, and it works best when the goal is to slightly increase a positive outcome,” she wrote.

Two researchers who took part in a study about nudges published in 2022 recommended best practices for nudges in a Brookings Institution article. They said the most effective nudges motivated students to complete serious and time-sensitive tasks, outlined next steps and were personalized to students’ specific needs.

Holmes said the benefits of all three approaches in her studies (emailed feedback, one-way text reminders and two-way text coaching) outweighed the drawbacks because, overall, students who received notifications earned higher grades than those who didn’t. That outcome, she argued, makes nudging strategies a valuable, low-cost student success tool.

“I was a first-generation student in college … and it was difficult for me to navigate,” she said. “So, I’m trying to figure out, ‘What are low-cost interventions that we as a university, or maybe policymakers, can implement that are not going to cost a lot, but be very helpful for students?’.”

Holmes hopes to take a deeper dive into the results of the second study and evaluate the demographics of the participating students to determine what other factors may have shaped how they responded to the notifications and how well they did in their classes.

“The next step is to determine, ‘Are there differences between gender? Are there differences between those who are first-generation students? Are there differences between underrepresented students?’” she said. “I think there's a lot more that we can determine from doing a deeper dive.”

Venit postulated that students who are already confident in their abilities might be more open to receiving notifications than those who are less confident. The latter group might be inclined to take feedback more harshly or feel more self-conscious about where their scores rank in comparison to their peers.

At the same time, considering the widely varied responses students gave in exit interviews, he wouldn’t be surprised if the data paints a picture he hadn’t previously considered.

“This is complex stuff,” he said.
