College students might sometimes feel they are getting mixed messages about laptops. Many receive them for free or at a discount from their colleges, only to have professors banish the machines from their classrooms, or at least complain about them.
For years, researchers have conducted studies in hopes of answering whether having laptops in class undermines student learning. In the avalanche of literature, one can find data pointing each way. A 2006 study of 83 undergraduate psychology students suggested that having laptops in class distracts both the students who use them and their classmates. Several law professors have written triumphant papers documenting their own experiments banning laptops; one of them complained that the machines had transformed his students from thoughtful, selective note-takers into “court reporters” reduced to mindlessly transcribing his lectures. And yet other papers have argued that laptop bans are reductive exercises that ignore the possibility that some students — maybe even a majority — might in fact benefit from being able to use computers in class, if only professors would provide a modicum of discipline and direction.
Still, there is one notable consistency that spans the literature on laptops in class: most researchers obtained their data by surveying students and professors.
The authors of two recent studies of laptops and classroom learning decided that relying on student and professor testimony would not do. They decided instead to spy on students.
In one study, a St. John's University law professor hired research assistants to peek over students’ shoulders from the back of the lecture hall. In the other, a pair of University of Vermont business professors used computer spyware to monitor their students’ browsing activities during lectures.
The authors of both papers acknowledged that their respective studies had plenty of flaws (including possibly understating the extent of non-class use). But they also suggested that neither sweeping bans nor unalloyed permissions reflect the nuances of how laptops affect student behavior in class. And by contrasting data collected through surveys with data obtained through more sophisticated means, the Vermont professors also show why professors should be skeptical of previous studies that rely on self-reporting from students — which is to say, most of them.
Virtual Monitors in Vermont
In the Vermont study, published last year in the Journal of Information Systems Education, business professors James Kraushaar and David Novak persuaded 45 students to install software on their laptops that logged information on the applications the students were running, including how long the applications were open and which were most frequently the primary “focus” of the laptop monitor. (Students were granted anonymity.)
The professors then developed a rubric to distinguish “productive” applications (Microsoft Office and course-related websites) from “distractive” ones (e-mail, instant messaging, and non-course-related websites), and generated two scores: one based on how frequently a student appeared to be using distracting applications, and another based on the duration of each detour. Then they compared each student’s distraction scores during each course unit to how well he or she performed on the evaluation meant to test mastery of what had been taught over that stretch — controlling for variables such as the cumulative grade point average and standardized test scores of each student.
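In rough outline, the two scores described above can be sketched as follows. This is an illustrative reconstruction only: the category lists, field names, and event format are assumptions for the sake of the example, not the actual rubric Kraushaar and Novak used.

```python
# Illustrative sketch of the two distraction scores described in the study:
# one based on how often distractive applications came into focus, and one
# based on how long they stayed there. Categories are assumed for illustration.

DISTRACTIVE = {"email", "instant_messaging", "web_noncourse"}
PRODUCTIVE = {"ms_office", "web_course"}

def distraction_scores(focus_log):
    """focus_log: list of (app_category, seconds_in_focus) events logged
    over one lecture. Returns (frequency_score, duration_share)."""
    # Frequency score: how many times a distractive window took focus.
    frequency = sum(1 for cat, _ in focus_log if cat in DISTRACTIVE)
    # Duration score: share of total focused time spent on distractive apps.
    total = sum(sec for _, sec in focus_log)
    distracted = sum(sec for cat, sec in focus_log if cat in DISTRACTIVE)
    share = distracted / total if total else 0.0
    return frequency, share

# Hypothetical log for one student during one lecture.
log = [("ms_office", 600), ("instant_messaging", 120),
       ("web_course", 300), ("email", 60)]
freq, share = distraction_scores(log)
```

In the study itself, per-unit scores like these were then correlated with performance on the corresponding assessment, controlling for GPA and standardized test scores.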
The average student in the Vermont study cycled through a whopping 65 new, active windows per lecture, nearly two-thirds of which were classified as “distractive.” (One student averaged 174 new windows per lecture.) But only one type of distractive application appeared to have any effect on how well students ended up doing on assessments: instant messaging.
Students who frequently checked e-mail and surfed non-course-related sites did not appear to sweat for their sins on homework, quizzes, tests, or the final exam. High rates of instant-messaging activity, however, showed significant correlations with poor performances on all but one test during the semester.
Not only was instant messaging the most harmful type of application; students also tended to be wildly inaccurate when reporting how frequently they used it. In addition to using computer spyware, Kraushaar and Novak surveyed students to test how reliably they could assess their own laptop activity. Forty percent of students whom the spyware caught using instant-messaging applications in class told the professors they had never done so.
“It is possible that student reported use may reflect social expectations rather than actual use,” the authors wrote. “If true, these reporting biases would seem to pose a major problem for technology usage studies that rely solely on student perception surveys.”
The Vermont professors note that their study was “exploratory,” and suffered from several design flaws. Perhaps most notably, the professors had to secure permission in order to install the spyware on students’ computers, thereby tipping their hand and possibly discouraging would-be Web-surfers. (Only 46 percent of the students they approached consented to being so monitored.) They suggest that further research focusing on “how” technology affects learning, rather than “whether” it does, would be helpful in developing “new technologies and learning strategies that minimize the negative impacts of software multitasking while maximizing the positive impacts.”
Sleuthing at St. John's
In a more recent study, a working draft of which was posted on the Social Science Research Network website last month, St. John's University law professor Jeff Sovern spied on students the old-fashioned way: by hiring people to peek over their shoulders. He had assistants do so in 60 class sessions across six different courses (two at St. John's, four elsewhere), involving a total of 1,072 laptop users.
Thus Sovern was able to avoid the giveaway of asking permission (he told students the assistants were there to observe the lecture, and did not elaborate further). However, it also meant that the observations were subject to human error and were less precise. For instance, the spies could not record the precise amount of time each student spent on each application or how many windows each student opened over the course of a single class session. In fact, Sovern asked them only to classify an application or website as class-related or non-class-related, and to make a note only if a student used it for five or more minutes.
Still, the data Sovern and his associates managed to collect proved useful in figuring out how to shape laptop policies that give at least a cursory nod to empirical data on the different ways students are using the technology.
Sovern’s spies found that more than half of second- and third-year law students who came to class with laptops used the computers for non-class purposes more than half the time, compared to a mere 4 percent of first-year students. For the most part, first-year students tended to be rapt when text was being read aloud or a rule was being discussed, and less attentive when classmates were asking questions; upper-level students tended to be distractible no matter what was going on.
From this, Sovern concluded that “student decisions on whether to pay attention are responses to the tension between incentives and temptation.” In other words, upper-level students — either because they were more confident, or more focused on finding jobs, or for some other reason — were more likely to abuse their laptop privileges than first-years. As a result of the study, Sovern has decided to ban laptops in his upper-year courses, while allowing them in his first-year courses — a broad policy, but one based on evidence collected on-site.
“The problem is a lot of students use laptops legitimately, so anytime you ban laptops, you’re cutting off the ability of students to do that,” Sovern said in an interview. “So it’s a decision that, to my mind, should be based on the data rather than ego.”
For the latest technology news from Inside Higher Ed, follow @IHEtech on Twitter.