When students take exams on the computer at home, there is no classmate a seat over to copy from. Then again, Google knows more than any fellow test-taker.
So the results of a new meta-study on cheating, published in this fall’s edition of the Journal of Distance Learning Administration, might come as no surprise: Online courses that rely heavily on unproctored, multiple-choice exams are at greater risk of being cheated on than similar face-to-face courses, the study concluded. And while there are mechanisms available to deter dishonesty in online exams, they can be costly and inconvenient, and may not be widely used.
The meta-study, conducted by researchers at the University of Connecticut and Union Graduate College, looked at three prior studies comparing cheating in online courses with cheating in face-to-face ones, and three studies comparing cheating on proctored exams with cheating on unproctored ones. “The six studies, considered as a group, imply cheating risk is less correlated with instructional format (online v. face-to-face), and more correlated with unproctored online assessments,” the authors write.
The problem, of course, is that online assessments can be hard to proctor. There are companies that offer proctors and testing centers where online students can take exams in the same controlled environment that traditional students customarily use, the authors note. But those centers and proctors come with fees. And since many online students choose distance learning because they need the flexibility of a program that is asynchronous and non-placebound, having to show up at a certain time and place to take exams tends to defeat the purpose. “Conventionally, the target market for online courses is thought of as underserved populations such as working students who manage conflicting practice, work, family, and academic commitments,” write the authors, citing a 2006 U.S. Education Department report that affirms as much.
The efforts of many online programs to enroll international students might also undermine the secure-site method. For an online student taking a course from some far-flung locale, showing up at a testing center could go beyond mere inconvenience.
Software companies provide some potential fixes for the problem of proctoring online exams. Starting at $2,000 for an institutional license, a company called Respondus offers a product, which can be downloaded remotely, that integrates with the institution’s learning-management system and locks down an online test-taker’s ability to browse the Internet while taking an exam.
Of course, this does nothing to prevent students from Googling answers on another computer or on their smartphones. That is why another company, Software Secure, Inc., offers similar anti-browsing software with its Securexam Remote Proctor, along with a $200 piece of hardware that takes periodic fingerprint readings as well as audio and 360-degree video recordings of the test-taking environment, to make sure test-takers are not being fed answers the old-fashioned way.
These two products count hundreds of higher ed clients, but there are indications that many online programs do not use even the most basic safeguards against cheating. In a 2009 Campus Computing Project survey of 182 online program administrators at nonprofit institutions, only about half said they consistently make an effort to “authenticate” their users. (That survey did not include for-profit institutions, which serve many fully online students.) Moreover, there are many more fully online degree programs than Software Secure has clients. And even among institutions that deploy rigorous monitoring technology, there is no guarantee that instructors will review the video and audio recordings of each student’s test-taking session.
“Institutions of higher education [that are] tone deaf to the issue of proctoring online multiple-choice assessments may understandably find other institutions reluctant to accept these courses for transfer credit,” write the authors of the new meta-study.
Where applicable, exams might become more cheat-proof if they involved more essays — which, the authors note, are harder to cheat on without being detected. Online instructors could also simply reduce the weight of online exams in students’ final grades, shifting to a more essay- and homework-centered scoring rubric, they write.
However, even online homework is not necessarily safe. A team of professors at the Massachusetts Institute of Technology earlier this year published an article in the open journal Physical Review Special Topics—Physics Education Research suggesting that cheating on problem sets administered online is more common than some professors might have hoped.
The good news is that the technology the authors used to detect the cheating — developed by lead author David E. Pritchard, a physics professor, for Pearson's "Mastering" series of online homework software — suggests that institutions may soon be capable of catching students who cheat on homework. As for essays, services such as Turnitin and Blackboard's SafeAssign can out plagiarists and essay-mill patrons.
As for exams, the “best solutions involve proctors,” says John Bourne, executive director of the Sloan Consortium, a nonprofit that studies online learning. But apart from that method, Bourne recommends not using multiple choice questions, and suggests creative solutions such as “interview[ing] students orally on Skype.”
“Solutions for cheating are out there,” Bourne wrote in an e-mail. “Whether [professors] use them or not is, of course, up to them.”