
Critics of computer-based testing often argue that it's difficult to tell whether students are doing their own work. Some professors are also unsure whether using the technology is worth their while. A new study argues that giving electronic tests can actually reduce cheating and save faculty time.

Anthony Catanach Jr. and Noah Barsky, both associate professors of accounting at the Villanova School of Business, came to that conclusion after speaking with faculty members and analyzing the responses of more than 100 students at Villanova and Philadelphia University. Both teach a course called Principles of Managerial Accounting that uses the WebCT Vista e-learning platform. The professors also surveyed undergraduates at Philadelphia who took tests electronically.

The Villanova course follows a pattern of Monday lecture, Wednesday case assignment, Friday assessment. The first two days require in-person attendance, while students can check in Friday from wherever they are.

"It never used to make sense to me why at business schools you have Friday classes," Catanach said. "As an instructor it's frustrating because 30 percent of the class won't show up, so you have to redo material. We said, how can we make that day not lose its effectiveness?"

The answer, he and Barsky determined, was to make all electronically submitted group work due on Fridays and to make Friday electronic quiz day. That's where academic integrity came into play: since the professors weren't requiring students to be present for the exams, they wanted to deter cheating. Catanach said programs like the one he uses blunt the effectiveness of looking up answers or consulting friends.

In electronic form, questions are given to students in random order, so copying is difficult. Professors can change variables within a problem to make each test unique while ensuring a uniform level of difficulty. Backtracking on questions generally is not permitted. The programs also measure how much time a student spends on each question, which could signal to an instructor that a student slowed down to consult outside resources. Catanach said he doesn't pay much attention to time spent on individual questions; since he gives his students a narrow time limit to finish their electronic quizzes, consulting outside sources would only leave them rushed by the end of the exam, he added.
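The study doesn't describe WebCT Vista's internals, but the general technique of randomized ordering and variable substitution is straightforward. Here is a minimal sketch in Python of how such a system might work; the question templates, value ranges, and function names are invented for illustration, not drawn from the platform:

```python
import random

# Hypothetical illustration of per-student quiz generation. Each template
# keeps the same problem structure and difficulty but swaps in different
# numbers, so every student's test is unique yet uniformly hard.
TEMPLATES = [
    {"prompt": "Units sold: {units}. Price per unit: ${price}. Compute total revenue.",
     "vars": {"units": (100, 500), "price": (2, 9)}},
    {"prompt": "Fixed costs: ${fixed}. Contribution margin: ${margin}/unit. Find break-even units.",
     "vars": {"fixed": (1000, 5000), "margin": (5, 20)}},
]

def build_quiz(student_id: str) -> list[str]:
    # Seeding on the student ID makes each student's quiz stable but distinct.
    rng = random.Random(student_id)
    questions = []
    for template in TEMPLATES:
        values = {name: rng.randint(lo, hi)
                  for name, (lo, hi) in template["vars"].items()}
        questions.append(template["prompt"].format(**values))
    # Randomize question order so neighbors can't copy answer by answer.
    rng.shuffle(questions)
    return questions

if __name__ == "__main__":
    for question in build_quiz("student-42"):
        print(question)
```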

Forty-five percent of students who took part in the study reported that the electronic testing system reduced the likelihood of their cheating during the course.

Stephen Satris, director of the Center for Academic Integrity at Clemson University, said he applauds the use of technology to deter academic dishonesty. Students who take these courses might think twice about copying or plagiarizing on other exams, he said.

"It's good to see this program working," Satris said. "It does an end run around cheating."

The report also makes the case that both faculty and students save time with e-testing. Catanach is up front about the initial time investment: For instructors to make best use of the testing programs, they need to create a "bank" of exam questions and code them by topic, learning objectives and level of difficulty. That way, the program knows how to distribute questions. (He said instructors should budget roughly 10 extra hours per week during the course for this task.)
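The report doesn't specify how the bank is stored, but the coding scheme Catanach describes maps naturally onto a tagged pool that the software draws from against a fixed blueprint. A hedged sketch, again with invented tags, items, and layout, of how a program might pull a balanced quiz from such a bank:

```python
import random
from collections import defaultdict

# Assumed question-bank layout: each item is tagged by topic, learning
# objective, and difficulty, as Catanach describes. The actual format
# used by WebCT Vista is not described in the study.
BANK = [
    {"id": 1, "topic": "budgeting", "objective": "LO1", "difficulty": "easy"},
    {"id": 2, "topic": "budgeting", "objective": "LO1", "difficulty": "hard"},
    {"id": 3, "topic": "costing",   "objective": "LO2", "difficulty": "easy"},
    {"id": 4, "topic": "costing",   "objective": "LO2", "difficulty": "hard"},
    # ... a real bank would hold many interchangeable items per category
]

# A blueprint fixes how many questions to draw per (topic, difficulty),
# so every student's quiz covers the same ground at the same level.
BLUEPRINT = {("budgeting", "easy"): 1, ("costing", "hard"): 1}

def draw_quiz(seed: str) -> list[dict]:
    rng = random.Random(seed)
    pools = defaultdict(list)
    for item in BANK:
        pools[(item["topic"], item["difficulty"])].append(item)
    quiz = []
    for cell, count in BLUEPRINT.items():
        quiz.extend(rng.sample(pools[cell], count))  # distinct items per cell
    rng.shuffle(quiz)
    return quiz
```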

The payoff, he said, comes later in the term. In the study, professors reported recouping an average of 80 hours by using the e-exams. Faculty don't have to hand-grade tests (the grading burden often deterring instructors from giving a Friday test, Catanach notes), and graduate students or administrative staff can help prepare the test banks, the report points out.

Since tests are taken from afar, class time can be used for other purposes. Students are less likely to ask about test results during sessions, the study says, because the computer program gives them immediate results and points to pages where they can find out why their answers were incorrect. Satris said this type of system likely dissuades students from grade groveling, because the explanations are all there on the computer. He said it also makes sense in other ways.

"I like that professors can truly say, 'I don't know what's going to be on the test. There's a question bank; it's out of my control,' " he said.

And then there's the common argument about administrative efficiency: An institution can keep a permanent electronic record of its students.

Survey results showed that Villanova students, who Catanach said were more likely to have their own laptop computers and be familiar with e-technology, responded better to the electronic testing system than did students at Philadelphia, who weren't as tech savvy. Both Catanach and Satris said the e-testing programs are not likely to excite English and philosophy professors, whose disciplines call for essay questions rather than computer-graded content.

From a testing perspective, Catanach said the programs can be most helpful for faculty with large classes who need to save time on grading. That's why the programs have proven popular at community colleges in some of the larger states, he said.

"It works for almost anyone who wants to have periodic assessment," he said. "How much does the midterm and final motivate students to keep up with material? It doesn't. It motivates cramming. This is a tool to help students keep up with the material."
