Twas the final before Christmas, and all through the hall,
It seemed that a gift had been given to all.
For one professor had shared, along with the test,
A partial list of which answers were best.
No, it’s not just a nightmarish twist on a holiday classic. It’s what happened recently at Ryerson University, in Toronto, when 190 engineering students in a required, intro-level chemistry course received final exams with part of an answer key on the back. An unnamed professor accidentally left a computer-generated key attached to one of the versions of the test, which was distributed to about one-fifth of students in the massive exam hall. The mistake meant those students had the first 20 answers to the 50-question exam.
Proctors realized the error within minutes, after a student quietly pointed it out. Immediately they asked all 1,000 students taking various versions of the exam to stop writing without explaining why. They scrambled to collect the keys by checking each exam book.
But the professors had to decide on the fly how to proceed. Should they cancel the exam, even though some 800 other students in the hall had different (answerless) versions of the exam? Should they dismiss only the students with the tainted tests, even though it was unclear how many had taken advantage of or even noticed the grading key in such a short period of time?
Ultimately, the proctoring professors decided to let the students proceed with all versions of the test, and analyze their responses later on to see if those who had received the keys scored unusually high marks on the first section of the test.
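The article doesn't say which statistical method the professors used, but the comparison they describe — checking whether the key cohort scored unusually high on the first section — could be sketched as a simple permutation test. Everything below is illustrative: the scores, group sizes, and distributions are made up for the example.

```python
import random
from statistics import mean

# Hypothetical section scores (out of 20) for illustration only --
# real data would be the 190 key-cohort students vs. the ~800 others.
random.seed(0)
key_cohort = [random.gauss(17, 2) for _ in range(190)]
others = [random.gauss(13, 3) for _ in range(800)]

def permutation_test(a, b, trials=5000):
    """Estimate how often a random regrouping of the pooled scores yields
    a mean gap at least as large as the observed one (one-sided p-value)."""
    observed = mean(a) - mean(b)
    pooled = a + b
    hits = 0
    for _ in range(trials):
        random.shuffle(pooled)
        if mean(pooled[:len(a)]) - mean(pooled[len(a):]) >= observed:
            hits += 1
    return hits / trials

gap = mean(key_cohort) - mean(others)
p = permutation_test(key_cohort, others)
print(f"observed gap: {gap:.2f} points, p = {p:.4f}")
```

A small p-value here would mean a gap that large is very unlikely to arise by chance, which is the kind of "strange numbers" result the professors later reported.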
Days later – and perhaps much to the chagrin of those who had peeked at the key – students in the course received an email from their professors saying that the numbers indeed had been strange.
“Unfortunately, our analysis indicated that several students benefited from the answer sheet,” the email says. “Because of that, we have had to find a way to resolve the issue that is as fair as possible, under the circumstances, to everyone in the course, and maintains academic integrity.”
The solution directly affects only those 190 students who received the key along with their exam. For them, only the last 30 answers of their exam will count. Those unhappy with that decision (or their mark) may opt to take another version of the exam in January. But in that case, the January grade is binding; there is no option to keep whichever score is higher.
“[Because] the test for this cohort was evaluated differently from the rest of the class, we feel it is fair to offer a chance to write a new final exam to those affected students who want it,” the email says. “Another option, to order a new exam for the entire cohort, would be unfair to the students who sat through the three-hour exam, did not benefit from the answer key, and do not want to be burdened with another exam in January.”
The email also notes that those who took answers from the keys will not be penalized.
“Nobody will be charged with academic misconduct for using the answer key,” it says. “The temptation to use an answer key supplied with the test is just too great, and so charges would be unfair. Those who were worried about that can put their minds at ease.”
Anne Johnson, associate professor and undergraduate chemistry program director, said in an interview that department professors deliberated over the results of their analysis but found their solution to be most fair. Some students who did not receive keys might find it unfair that those who did have the option to take the exam again – with extra time to study – but there’s inherent unfairness in any testing scenario, she said, given those students who were “so-called sick” on the day of the exam and will have to make it up also will have more time to study.
“It’s hard to come up with anything fairer,” she added, noting that the course fulfilled the single chemistry requirement for most of the engineering program students involved. “My guess is that they’ll look at their grade, and say, ‘Oh, good,’ and be done with it.”
So far, about 45 students have said they’ll take the test again.
Johnson said the original mistake was due to human error: a professor who used a computerized program to generate his test questions failed to change a default setting that printed the answer key along with the test.
The test took place earlier this month. With a recovery plan in place, the department isn’t exactly laughing about the error, Johnson said, “but we’re not losing any sleep over it.”
She continued: “There’s a whole catalog of things that can go wrong during an exam, and there are things that are more serious [than this] on that spectrum.”
Johnson said some of her colleagues are trying to make the best out of the situation, using the test results as research material for item response theory and test design.