It started with an announcement in February that a University of Georgia instructor would start a crowdsourcing project to find out more about working conditions and salaries of adjuncts. Last month, a graduate student at the University of North Carolina at Greensboro announced that she was attempting something similar for graduate student employees.
Both projects are attempts at gathering information -- on the salaries of adjuncts and graduate students -- where rigorously researched data is difficult to come by. Adjuncts, by their very nature, are difficult to track: they might not teach every semester, they might teach at more than one institution, and many institutions are inconsistent in how they pay them. And major surveys on such topics as faculty salaries -- such as the annual report by the American Association of University Professors -- focus on full-time faculty members and exclude most of those off the tenure track.
Both surveys involve crowdsourcing, which gives researchers the chance to reach a far wider pool than a traditional statistical method, in which researchers survey a random, representative sample and then extrapolate those results.
Anand Kulkarni, CEO of MobileWorks, a crowdsourcing platform, and a Ph.D. student at the University of California at Berkeley, said one of the hazards of crowdsourced surveys is that people responding to a self-reported survey might have a stake in the outcomes, whereas a more traditional method would try to reach as unbiased a sample as possible. “The nice thing about crowdsourcing, especially in a project that is on a larger scale, is that it is going to be a good representation of the communities,” he said.
The challenge in crowdsourcing surveys is to get the word out to as many people as possible. Kulkarni said the adjunct project was an effective use of crowdsourcing because it also attempts to give the academic community a sense of the plight of adjuncts. “There is a strong incentive to self-report and there is a very good likelihood of getting meaningful results,” he said.
One common practice in crowdsourcing is for survey participants to refer other participants, which is how the survey's reach and effectiveness grow.
Josh Boldt, who started the Adjunct Project, said his goal was to bring more transparency to adjunct salaries and remove the veil of secrecy that has surrounded the pay scale of such faculty members. “I think it is intentional on the part of the administrations,” said Boldt, who is an instructor in the University of Georgia's English department. Boldt, who has been an adjunct for a year, said he began the project because of the “horrible experiences” that he had heard about from others. “Once I saw the huge reaction to my project, I knew I would have to keep it going for the sake of everyone involved,” he said.
In the six weeks that his project has been online, it has garnered about 50,000 hits, and there are about 1,600 entries from adjunct faculty with information about their salaries. Only three of these entries have been challenged, according to Boldt. The challenged entries have the original data and the new numbers from the challenger. "The control is in the crowdsourcing itself,” he said. “In some sense, it is the ultimate form of peer review.”
He compared the checks to a “Wikipedia mentality,” meaning that the information on the website was being constantly peer-reviewed.
Inspired by Boldt’s project, Nancy Poole, a graduate student in the Department of Library and Information Studies at the University of North Carolina at Greensboro, has started a similar effort to collect data from graduate employees. A spreadsheet on her website collects information from graduate student employees on stipends, hours worked each week, benefits and other related information.
A more detailed adjunct compensation survey is the one that the Coalition on the Academic Workforce has been working on since late 2010. (The coalition includes disciplinary associations, unions and others.) The initial data for this survey of contingent faculty was gathered online from September to November 2010 and includes about 30,000 responses. “The survey is unique in the sense that we are asking for course-level data, we are trying to find out what they were paid for each course they were teaching,” said Craig Smith, higher education deputy director with the American Federation of Teachers.
The survey is expected to be published by the end of spring this year; those involved said it had been delayed because they must balance full-time jobs with time spent on the survey results. “We are tabulating the results and doing some analysis,” said John Curtis, the director of research and public policy for the American Association of University Professors.
Curtis said the large amount of data indicated the extent to which people wanted to share their experiences. “We are getting a dataset ready that academic researchers will be able to use. We are going to be careful with the data. We cannot say with statistical certainty that [it] is representative of the whole [adjunct] population,” he said.
As for Boldt’s project, the New Faculty Majority, a national group representing adjuncts and contingent faculty, will house the data and analyze it. “NFM Foundation's Fall 2011 'Back to School' survey results will be released sometime this spring, and the data that Josh collected will be integrated into the various reports we issue on the working conditions of adjunct and contingent faculty, particularly as these working conditions relate to student learning,” said Esther Merves, director of research and special programs for the New Faculty Majority Foundation.
And though crowdsourcing may have its critics, experts see it as a relatively inexpensive way of reaching a wide audience. "If you have five adjuncts from a university teaching English and they all have similar salary figures, then you can rule out an outlier from the same department who says that his or her salary is much more. The challenge arises if you have one person reporting from a given department; you have to take it at face value," said Daren Brabham, assistant professor at the University of North Carolina School of Journalism and Mass Communication.
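Brabham's sanity check could be sketched in a few lines of code. This is purely illustrative and not part of any project described here; the function name, the median-based rule, and the tolerance threshold are all assumptions, and the salary figures are invented:

```python
from statistics import median

def flag_outliers(reports, tolerance=0.5):
    """Flag reported salaries that sit far from a department's median.

    With fewer than three reports there is no cluster to compare against,
    so a lone figure must be taken at face value and nothing is flagged.
    """
    if len(reports) < 3:
        return []
    m = median(reports)
    # Flag anything more than `tolerance` (as a fraction of the median) away.
    return [r for r in reports if abs(r - m) > tolerance * m]

# Five adjuncts report similar per-course pay; one figure stands apart.
print(flag_outliers([2400, 2500, 2450, 2500, 2600, 7000]))  # -> [7000]
print(flag_outliers([5000]))                                # -> []
```

The 50 percent tolerance is arbitrary; a real cleaning pass on crowdsourced entries would need a threshold tuned to the data.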
“The brilliance of it is the ability to get more and more people involved," he said. Brabham, who has researched crowdsourcing, said it is this ability to get more and more people involved that chips away at any weakness that the system might have. “Eventually, the outliers will go away on their own,” he said.