SAN FRANCISCO -- Student plagiarism drives professors crazy. And even as some question the educational value of trying to detect and punish plagiarism, services that review papers for lack of originality are popular with many college administrators and professors. One area within academe where skepticism of plagiarism detection services has been high is among those who actually teach writing. Past meetings of the Conference on College Composition and Communication, which attract thousands of composition and rhetoric instructors, have featured sessions debating the uses of such services.

On Thursday, at this year's meeting, a team from Texas Tech University presented data that challenged the plagiarism detection services in a new way. The team found that services that theoretically detect the same sorts of problems actually find (or don't find) very different examples of possible plagiarism.

Generally, the study found that Turnitin was much more likely than competitor SafeAssign (which is part of Blackboard) to identify material as being potentially not original. But that finding shouldn't necessarily cheer Turnitin. The researchers reported that many of the instances of "non-originality" that Turnitin finds aren't plagiarism, but are just the use of jargon, course terms or the sort of lack of originality one might expect in a freshman paper. In other cases, the study found that Turnitin didn't necessarily identify the correct source of plagiarized materials.

This year's meeting also comes at a time when Turnitin is trying to encourage different kinds of presentations at the composition meeting. Turnitin is paying the travel costs of some of those who are speaking here. The Texas Tech professors are not among those in San Francisco on Turnitin's dime, and the company won't reveal who is receiving support.

But some of those on the program giving papers that suggest a more positive view of Turnitin confirmed that they have been promised money by the company. The board of the composition association, prompted by Turnitin's grants to selected speakers, has adopted a new rule encouraging speakers to disclose their financial support, but some speakers said they didn't know about it.

Finding False Positives

The Texas Tech research came about in the last year as the university started to consider whether to purchase an institutional license for a plagiarism-detection service. Because Texas Tech's writing program maintains a large online database of student work, professors there had access to papers of varying quality written for the same assignments, and the writing scholars developed various ways to test the programs. They took a batch of 200 papers from similar assignments and ran them through both Turnitin's and SafeAssign's systems. They then did an in-depth analysis of a smaller number of papers to determine what was being flagged by each service. They repeated this process with another set of 200 papers and another subset that was subject to closer review.

All of the members of the Texas Tech team said that they emerged from their study with serious reservations about using the services. (And these are not instructors who are laissez-faire about plagiarism; all regularly use Google and other search engines to identify copying, and all believe that inappropriate theft of ideas and writing should be challenged.)

"Everyone needs to understand the limitations" of these services, which have flaws even if they help instructors with some issues, said Susan M. Lang, director of first-year composition at Texas Tech.

Some of the issues raised by the study:

Consistency: By several measures, the study found Turnitin flagging more papers for review than SafeAssign. For example, of the 400 papers reviewed, Turnitin found that 46 had 26-50 percent unoriginal material, compared to 18 identified by SafeAssign. Turnitin flagged 152 papers as having between 11 and 25 percent unoriginal material, while SafeAssign found only 55 papers in this category. This means, the researchers said, that students engaged in the same kind of work (or questionable work) might be treated differently at different colleges, depending on which service is used, suggesting a lack of consensus about what counts as academic misconduct.

False positives: Many of the phrases or sentences flagged by both services -- but especially the greater number identified by Turnitin -- weren't plagiarism, but were cases in which certain phrases appeared for legitimate reasons in many student papers. For example, the researchers found high percentages of flagged material in the topic terms of papers (for example "global warming") or "topic phrases," which they defined as the paper topic with a few words added (for example "the prevalence of childhood obesity continues to rise").

Likewise, commonly used phrases generated frequent flagging, even though writing something like "there is not enough money to go around," while not original, wouldn't be considered plagiarism. When the Texas Tech researchers started asking professors about some of these issues, they discovered unusual work-arounds, such as a professor who tells his students to write their papers and then delete any topic sentences so that their papers won't be flagged in error.

Incorrect links to sources: Lang said that one use of detection services is that they supply the source that a student apparently copied, giving a faculty member or campus judicial board solid grounds for discussing the problem or seeking sanctions. To test this feature, the Texas Tech researchers engaged in what writing instructors jokingly call "paste-urizing" -- finding large blocks of content on Web sites and pasting them together to create a paper. The resulting papers were flagged as problematic, but the sources identified didn't match the Web sites used to create the paper. Lang imagined a scene in which she might confront an apparent plagiarizer with this sort of evidence. "If I'm talking to a student whose paper was tagged, and I say 'here's where the work came from' and the student says 'no it didn't,' the student may be right."

Missing the printed word: Generally, Lang and others said that the kind of plagiarism detection offered by companies assumes that students will copy material that is available online. While it's true that the services advertise their access to online databases, and that students do copy material from the Web, the researchers found that material copied from books that aren't online could get through undetected.

Kathleen T. Gillis, director of Texas Tech's Writing Center, said that the findings left her thinking that software designed to promote academic integrity is sending mixed messages to students and not teaching them anything. "This all runs very counter to the instruction we give people every day."

Sally D. Elliott, chief operating officer of iParadigms (the parent company of Turnitin), was in the audience at the session. She said she agreed with many of the findings and said that the Texas Tech study showed that faculty training is "quite a critical aspect of all of this." Any service like Turnitin needs to be used "with knowledge of what it can do, what it can't," she said. Elliott said that if colleges "have taken the time to properly prepare teachers and students, the value is there," but "if they don't, results can be misinterpreted."

Elliott also noted that Turnitin, when flagging material for examination, doesn't brand anyone a plagiarist, but identifies potentially non-original material for faculty members or others to review.

Paying Presenters' Travel Costs

The Texas Tech session was a critical look at plagiarism detection, but other sessions in the program had titles that sounded less critical. For example, Jim Lee of Texas A&M University at Corpus Christi was to speak on "Improving Writing and Analytical Skills Through Turnitin." And Diana Vecchio of Widener University spoke on "Turnitin Originality Report: Not Just for Plagiarism Anymore." They were scheduled to share the podium with Lanette Cadle of Missouri State University, speaking on "Fighting the Fear: Plagiarism as an Expression of Technophobia." Cadle is the only one of the three who wasn't awarded Turnitin funds. She said that she wondered whether her fellow panelists were receiving support from the company. Cadle said she was not offered money, and wouldn't have accepted it if offered, given that she was speaking about the industry at an academic conference.

Lee said via e-mail that he applied for and was promised Turnitin money, but that when he didn't receive the details he expected about the payment, he decided not to attend the meeting and so won't be giving the talk. "Having a company sponsor presentations of its service represents a conflict of interest, but I thought the company would have no influence on whatever I was going to say," said Lee.

Kent Williamson, executive director of the National Council of Teachers of English, of which the composition group is a part, said that the board decided -- after learning of the Turnitin grants -- to ask all speakers receiving financial support from an entity their papers discussed to reveal such support during their presentations. The idea, he said, was to ensure that people in the audience could make appropriate judgments of their own.

Williamson said that he didn't know which speakers on the program received support from Turnitin, and that the company had no influence over the selection process for speakers.

The issue of travel payments is particularly sensitive for a group like the composition conference because its members include many at community colleges and many adjuncts -- people who don't tend to have access to travel budgets (even in years that are better financially than this one).

Vecchio, of Widener, said that she was aware of the controversies over Turnitin and intellectual property and other issues, but that her talk about the company's services didn't relate to those issues. She spoke about how she uses Turnitin to teach first-year composition students how to paraphrase. By running their essays through Turnitin, she shows them how they are effectively copying material -- at least at the beginning of the course -- and can document their progress by the end. Turnitin is "a learning tool," she said.

When Turnitin first appeared, Vecchio said she was excited about the possibility of no longer having to hunt down the sources of papers that were likely plagiarized. But she said she doesn't use the service routinely for plagiarism detection and only does so when she has reason to suspect that a paper is not a student's work.

Vecchio said that she didn't inform her audience Thursday that she was receiving a travel grant from the company whose services she was discussing. "No one said anything" about the board's desire for speakers to make such statements, she said. Asked if she considered the issue before agreeing to the grant, she said, "I didn't even think about it."

Katie Povejsil, a spokeswoman for Turnitin, declined to say who was receiving the funds or even how many grants were awarded.

Asked why, she responded: "Our purpose is to continue to advance the writing conversation in these difficult economic times. We look forward to hearing about new insights and fresh approaches. The real story here is: How can we help students learn to write better for the 21st century?"
