The plagiarism detection service Turnitin on Wednesday made bold claims about its product’s effectiveness at weeding out “unoriginal writing,” but researchers in the field aren’t buying the results.
In a study detailed in the report released Wednesday morning, Turnitin tracked the decrease in “unoriginal writing” -- meaning writing that scored 50 percent or higher on the software’s Overall Similarity Index -- at 1,003 non-profit colleges and universities in the U.S. that had used Turnitin for five years.
Most institutions started experiencing drops in unoriginal writing by the third year of Turnitin use, and by year four, not a single type of institution reported an increase. In the fifth and final year of the study, every class of institution posted a double-digit decrease, ranging from 19.1 percent among four-year institutions with fewer than 1,000 students to 77.9 percent among two-year colleges with 3,000 to 5,000 students. Overall, unoriginal writing decreased by 39.1 percent.
The results were met with skepticism from people such as Thomas S. Dee, a professor in the Stanford Graduate School of Education.
“To be honest, I don’t find this study particularly convincing, and I say that as someone who is quite sympathetic to the idea that there are technological solutions to the plagiarism problem that are readily available and affordable and worth studying further,” said Dee, who is also a research associate for the National Bureau of Economic Research.
Dee described the report as containing “suggestive, descriptive evidence,” not “convincing causal evidence” that Turnitin was primarily responsible for the drop in unoriginal writing.
“It’s a kind of selection bias story,” Dee said. “They’re looking into the prevalence of unoriginal content over time -- but think about the professors who might be picking up Turnitin. They may be the ones who feel they have the biggest problem with plagiarism in their class, so you’re going to get a very high baseline of unoriginal content. The profs who were late adopters could be those who see only marginal benefits. That would also create the appearance of a treatment effect when none may exist.”
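Dee’s selection-bias story can be illustrated with a toy simulation (not drawn from the Turnitin study; the rates, noise level, and adoption threshold below are invented for illustration). If schools tend to adopt the software after an unusually bad-looking year, their later measurements will drift back toward their stable baseline -- regression to the mean -- producing an apparent decline even when nothing has actually changed:

```python
import random

random.seed(42)

N = 10_000
THRESHOLD = 0.25   # hypothetical: a school "adopts" after a bad-looking year
NOISE = 0.05       # year-to-year measurement noise

year1_rates, year5_rates = [], []
for _ in range(N):
    # Each school's true rate of unoriginal writing is stable over time,
    # i.e. there is NO real treatment effect in this simulation.
    true_rate = random.uniform(0.05, 0.30)
    year1 = max(0.0, true_rate + random.gauss(0, NOISE))
    if year1 > THRESHOLD:  # selection: only schools with a bad year 1 adopt
        # Year 5 is a fresh noisy draw around the same unchanged true rate.
        year5 = max(0.0, true_rate + random.gauss(0, NOISE))
        year1_rates.append(year1)
        year5_rates.append(year5)

mean1 = sum(year1_rates) / len(year1_rates)
mean5 = sum(year5_rates) / len(year5_rates)
print(f"year 1: {mean1:.3f}  year 5: {mean5:.3f}  "
      f"apparent drop: {(mean1 - mean5) / mean1:.0%}")
```

Because adopters were selected partly on upward noise in year one, the year-five average falls below the year-one average even though no school’s underlying behavior changed -- exactly the “appearance of a treatment effect when none may exist” that Dee describes.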
Dee also said the adoption of Turnitin may have coincided with new policies meant to crack down on plagiarism. “Are we really observing the effect of the website here or the additive -- possibly multiplicative -- effect with other institutional changes?” he said.
Dee, along with Brian A. Jacob, professor of education at the University of Michigan, conducted a study on academic integrity education in 2010. Instead of using scare tactics as a deterrent, Dee and Jacob found students were less likely to plagiarize if they clicked through a free, online tutorial on skills such as effective note taking and avoiding procrastination.
“What’s lacking in all of this is what we fielded in our randomized trial, which is something that grabbed students up front and said this is what academic integrity really means, and here are some effective strategies that you can implement right now to avoid plagiarism,” Dee said. “I worry about approaches that are more law and order and less about us taking up our primary duty of trying to educate these students.”
In response to the criticism, Turnitin’s vice president of marketing, Chris Harrick, said the study was crafted to avoid the pitfalls listed by Dee.
“We did this by ensuring that there was a critical mass of submissions in the account (at least 10 percent of the total of the most recent year’s submissions),” Harrick said in an email. “Schools also had to use the service for at least a year, which reduces the early-adopter argument.”
Since some students who wish to avoid getting caught may spend more time covering their tracks than doing the actual coursework, it would not be surprising if they found ways to circumvent Turnitin over the course of five years. Harrick said Turnitin is constantly being tweaked to prevent that from happening.
“Moreover, if students found a way around the system that would meaningfully impact the results across millions of submissions, we would find out,” he wrote. “Also instructors would not be shy in letting us know we were missing cases of plagiarism.”
Turnitin aligned the reported results based on years of use rather than when the institutions became customers; in general, the institutions used Turnitin between 2011 and 2013. While the company doesn’t have any historical data showing unoriginal writing trends, Harrick said the slight uptick seen from the first to second year of the study suggests unoriginal writing is on the rise. “That might be useful as a guide to general trend if you accept the assumption that students are employing preexisting behaviors in writing as they get acquainted with Turnitin,” he said.
Today’s News from Inside Higher Ed