Earlier this year, Dartmouth College’s medical school charged 17 students with cheating on remote online exams. Three of the students were expelled. The accused protested their innocence, claiming the medical school’s remote test administration, or RTA, system had falsely flagged their conduct. With their reputations and careers hanging in the balance, their fate came down to a dispute about the software. Eventually, a technical explanation emerged, showing how students’ logged-in cellphones and tablets might have been accessing course notes while the exam was being administered. Dartmouth’s administrators then dropped the charges against all of the accused students, reinstated those who had been expelled and apologized.

The incident demonstrated that while RTA systems are entrusted with enormous power, the basis for their automated judgments can sometimes seem mysterious. In tech parlance, systems whose inner workings are not open for easy inspection are often called black boxes. Given how pervasive RTA systems have become, the difficulty of deciphering how they work poses huge technical, social, ethical and, possibly, legal problems.

That said, teachers and administrators can hardly be faulted for moving to remote testing over the past year. The COVID pandemic required everyone to improvise. Suspending classes and bringing the entire global educational system to an indefinite halt was not an option. Moving to online learning environments, in tandem with remote test administration, was arguably the only viable choice.

From elementary schools to graduate and professional schools, teachers and administrators did their best to purchase and deploy remote testing software packages that featured accessible interfaces, efficiently processed test results and, yes, appropriately included tools to detect potential cheating. Integrity is essential in academe, and academic misconduct does occur.

The trend toward online learning that predated the pandemic, and the consequent growing reliance on RTA systems, highlight the urgent need for American (and likely global) educators to develop a comprehensive set of uniform principles to guide the development and use of remote testing software.

To that end, the U.S. Technology Policy Committee of the Association for Computing Machinery, the world’s largest association of computing professionals, released a Statement on Principles for Secure Remote Test Administration. The committee, which we’re privileged to represent here, believes that technologists, faculty and administrators, as well as the companies that develop these packages, need to more consciously consider issues of fairness, accuracy, data security and accessibility during development and implementation of RTA systems.

The question of how these systems detect academic misconduct -- including specifically how they monitor the gazes of test takers during exams, enable the test taker’s microphone to listen for voices and access test takers’ computers to flag potential cheating or test stealing -- has been among the most talked-about aspects of these technologies. The Technology Policy Committee’s principles focus heavily on whether these systems can actually detect misconduct and on how developers’ claims of accuracy might be verified.
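
To see why the committee insists on verification rather than trust, consider a deliberately simplified, hypothetical sketch of the kind of threshold-based flagging rule such a system might apply. None of the names, inputs or thresholds below comes from an actual product; they are invented to show how easily an automated flag can misfire.

```python
from dataclasses import dataclass


@dataclass
class ExamSession:
    off_screen_gaze_ratio: float   # fraction of sampled frames where gaze left the screen
    voice_detected_seconds: float  # seconds the microphone registered speech
    window_switches: int           # times focus left the exam window


def flag_session(session: ExamSession,
                 gaze_threshold: float = 0.25,
                 voice_threshold: float = 10.0,
                 switch_threshold: int = 3) -> bool:
    """Flag the session if any single signal crosses its (arbitrary) threshold."""
    return (session.off_screen_gaze_ratio > gaze_threshold
            or session.voice_detected_seconds > voice_threshold
            or session.window_switches > switch_threshold)


# A student who glances at scratch paper or reads a question aloud can trip
# rules like these without cheating -- which is why developers' accuracy
# claims need independent verification rather than trust in the black box.
print(flag_session(ExamSession(0.30, 2.0, 0)))  # True: flagged on gaze alone
```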

In addition, however, we urge the college and university administrators who are responsible for procuring these systems to ask more fundamental and wide-ranging threshold questions that go beyond mere functionality. For example, to ensure equity, they should make sure that the costs of using RTA systems are not passed along to students and that any students asked to take exams via a remote system have sufficiently robust access to the internet. Administrators should verify that the RTA software can be used by students with disabilities who may need to rely on accessibility software or other kinds of accommodations. The RTA data-retention policies should be clear and audited. And all these questions should be evaluated using uniform benchmarks and accepted certification procedures -- two things that do not exist today.

As with any technology that monitors people’s activities, especially on their personal computers, the statement also outlines principles to protect student privacy and data security.

Some people might argue that, as the pandemic wanes and students continue to return to the classroom, these concerns will fade as remote test administration systems are shelved. But the forecasted economic reality is that, given the cost and convenience advantages they offer educators and students, online learning platforms coupled with RTA systems will only become more pervasive in a post-COVID world. If such analyses are correct, the pandemic will simply have accelerated the digital trend in education that was already well underway.

That is not to say that classroom learning is going away anytime soon. After all, technology has its limits, and many studies have shown that in-person education yields better outcomes than digital alternatives. For the foreseeable future, students around the world will attend in-person classes for their instruction and take exams with proctors looking over their shoulders.

But given what we’ve learned about RTA systems in the past year, and recognizing their growing role in the education ecosystem, the window for establishing important principles about their use is closing. Such principles will help maximize their benefits to higher education institutions and students while minimizing their now all-too-evident and profound risks.
