Too seldom do we ask graduate students in science or engineering about their experiences in completing doctoral degree requirements. We go to administrators, faculty, and sponsors, but we don't ask students -- the main educational client -- what they make of what is happening to them. In particular, we are remiss with minority graduate students.
The need to communicate is self-evident. In 2004, fewer than 500 African American citizens and permanent residents earned Ph.D.'s in science and engineering fields, not even 1 percent of the total awarded. The numbers in some disciplines are so tiny as to defy belief: 17 in computer and information science, 13 in physics, 10 in mathematics, zero in astronomy. Today the science and engineering workforce -- like medicine, law, and business -- barely resembles the rest of America. The pattern for African Americans, observed for over half a century, is particularly bleak.
Last summer, I asked 40 minority doctoral candidates about their experiences in a "talk back" session at the annual meeting of the Graduate Scholars Program of the David and Lucile Packard Foundation. Since 1992, Packard Scholars have been selected from among the premier graduates of historically black colleges and universities.
The discussion confirmed that -- for these scholars at least -- those who do enter graduate programs in the sciences often face pressures not experienced by their non-minority colleagues. "It's not fun being a trailblazer in 2005," said one scholar, "because there are certain things we should not have to deal with. When you already have the responsibility and expectation of class work, nobody wants to carry the burden of the entire race and deal with issues that should have been resolved a long time ago."
Often, minority doctoral students in the sciences become PR spokespeople: "We are called upon to do a lot on diversity for the university. To sit on panels every time a black student is invited to the school ... to attend conferences, to take pictures for publications that show the diversity of the university. While we are doing these things, our counterparts are in the lab doing research and producing publications.... When a first-year student comes in, I want them to see another black face. But how do I maintain that research direction and focus? I have an extra burden not carried by my majority colleagues."
And while many students are supportive of diversity efforts, they cannot help but feel conflicted about the competitive realities facing science grads. "Yeah, I wanted to be a trailblazer," summarized one student, "but I also want the Nobel Prize in physics. I don't want to trail blaze in race relations at the university. I want to focus on my research and come up with a new laser treatment for cancer, that's my focus. I don't want to have to deal with the other stuff. Let me be me, let me shine, get your foot off of my neck, let me do my work."
The experiences voiced by the Packard Scholars are not unique. The AAAS Center for Advancing Science & Engineering Capacity was created to assist universities and colleges committed to improving the success of all students and faculty, especially those of color. The Packard Scholars reinforced much of what we've learned from our site visits, focus groups, and data reviews (for the center's approach, see this article). Their insights are noted here, many in the scholars' own voices.
Outreach must penetrate the academic reward system. As a faculty activity, outreach ranks a distant third behind research/entrepreneurship and teaching. Neither the faculty effort nor the outcome will change without institutional policies that restructure rewards. As one scholar put it, "Diversity will not be an issue until you start diving into their pockets, their budgets, because they'll do anything to get and keep their grants. But, if a university ... has all the money they need and new buildings, but they have never graduated an African American person, it's too easy to say, 'Oh, we don't know what to do, or we don't have the resources.' That's bull, because if you want the resources, you can get them."
And another remarked, "The program I selected had four African American graduates in the last 10 years. Two more are there now and another came in with me.... That makes a huge difference. Establish a great relationship with one student; make one happy and others will hear.... That is the easiest way to recruit because if they went to a black school, there are other students in their department who are looking for a good graduate program."
Gender and racial bias is a reality. Get over it -- with or without mentoring. The Packard Scholars report discrimination is alive and well in university programs: It ranges from negative comments in the lab about ability or preparation to the faculty's assumption that the only two black students in the department are going to work together. Some universities have developed mentoring or other support programs to mitigate the effects, while others let the problems go unattended.
Many students recommended that universities conduct diversity sensitivity training for the faculty. "That stops a lot of the comments and issues in the labs and in the classroom."
Still others found mentoring programs to be effective interventions. "I'm in medical school now [as an M.D./Ph.D. student], and there are institutionalized mechanisms designed with the philosophy that if we bring you to the school, it looks bad if we can't bring you to completion. Some of these or similar mechanisms, like 'big sib, little sib' mentoring situations can be implemented early. If you start to intervene after the first warning signs, these are still very much preventable problems. I think we would see a much improved attrition rate if we didn't wait until the problem is full blown -- a classic ounce of prevention is worth a pound of cure."
In situations lacking a formal infrastructure for dealing with discrimination, students devise their own. "Coming from [a historically black college/university] where the learning environment was more constructive, I was overlooked here several times because I was the only black in the class. I came up with strategies to cope. My best friend and I would intentionally split up ... so that we weren't in the same group.... We were able to survive because he would bring the information back to me and vice versa."
The student must focus on completing doctoral requirements. This form of accountability is a "performance contract" between student and major professor (if not one's dissertation committee). It reveals to the student the delicate balance of his/her endeavor: "When I started graduate school, the faculty taught us to work together, yet how to be competitive.... If I asked my advisor how to do something, he would guide me, but say 'You are different people, and I'm going to approach you at your level, so I may not ask you to do something that I ask your cohort to do because you are at a different place. But the results should be the same, because you are all here to get the Ph.D.'"
All kinds of institutions can be "minority serving." If we examine the baccalaureate origins of African American Ph.D.'s and of Latino Ph.D.'s, historically black colleges and Hispanic-serving institutions, respectively, are the largest producers. But Massachusetts Institute of Technology, Stanford University, and the University of California at Berkeley, among others, have distinguished records as producers of minority bachelor's graduates who go on to earn a doctorate in science or engineering. In addition, relative newcomers such as the University of Maryland-Baltimore County and Louisiana State University are undergraduate models of student preparation for science-based Ph.D.'s. Some institutions, and often departments within institutions, clearly "get it." But decentralized authority at the graduate level ensures unevenness and a lack of sharing of best practices.
New Ph.D.'s underestimate the skills they possess. The orientation of most graduate programs in the sciences is to a single sector or career pathway that represents immediate job opportunity but little demand for versatility. Because the doctoral training process reproduces the past (i.e., the traditions that fit an earlier time), it also reflects the biases and careers of one's major professors. Consequently, the Ph.D. experience minimizes belief and understanding about skills beyond science fundamentals. The Capacity Center works with institutions to develop the skills required by 21st-century organizations, academic and nonacademic alike: teamwork, problem-solving, adaptation, communication, cultural competence.
This is about leadership -- the overarching need to grow leaders. For all the talk about the impact of mentors and role models, there will always be successful professional women and persons of color who will say, "It was tough for me and it's going to be tough for those who come behind me." These folks, irrespective of vintage or field, will not reach out. That's just the way they are -- making assumptions, suppressing memories of the help they received, and dealing with students their way. As one scholar noted, "Just think about how far the world has come in 10 years. Most of these cats [faculty] we're working for got their Ph.D. in the 1980s, 70s. The technology is moving way too fast and with the stuff that we know, we'll take their jobs. Some of them do everything they can to keep you from completing these programs, making it that much more difficult. The last thing they want to do is lose a job to you."
Change comes as new professionals ascend to positions that control resources and decisions. It may mean climbing the academic ladder or pursuing a nonacademic path. Both routes demonstrate that it's who you know plus what you know that matters -- not one or the other exclusively. Who's in your network? Who talks to whom? The AAAS Capacity Center makes explicit these aspects of professional socialization and networking that can make a difference in a career.
The nation has invested in science and engineering since Sputnik -- a half century -- to advance its educational, economic, workforce, and national security interests. When students are not recruited and nurtured to degree completion, we waste talent and material resources -- in defiance of student demographics and to the detriment of the nation's place in the world.
Daryl E. Chubin
Daryl E. Chubin is director of the Center for Advancing Science & Engineering Capacity, at the American Association for the Advancement of Science.
Zotero is a tool for storing, retrieving, organizing, and annotating digital documents. It has been available for not quite a year. I started using it about six weeks ago, and am still learning some of the fine points, but I feel sufficient enthusiasm about Zotero to recommend it to anyone doing research online. If much of your work involves material from JSTOR, for example -- or if you find it necessary to collect bibliographical references, or to locate Web-based publications that you expect to cite in your own work -- then Zotero is worth knowing how to use. (You can install it on your computer for free; more on that in due course.)
Now, my highest qualification for testing a digital tool is, perhaps, that I have no qualifications for testing a digital tool. That is not as paradoxical as it sounds. The limits of my technological competence are very quickly reached. My command of the laptop computer consists primarily of the ability to (1) turn it on and (2) type stuff. This condition entails certain disadvantages (the mockery of nieces and nephews, for example) but it makes for a pretty good guinea pig.
And in that respect, I can report that the folks at George Mason University’s Center for History and New Media have done an exemplary job in designing Zotero. A relatively clueless person can learn to use it without exhaustive effort.
Still, institutions that do not already do so might want to offer tutorials on Zotero for faculty and students who lack whatever gene makes for an intuitive grasp of software. Academic librarians are probably the best people to offer instruction. Aside from being digitally savvy, they may be the people at a university best positioned to appreciate the range of uses to which Zotero can be put.
For the absolute newbie, however, let me explain what Zotero is -- or rather, what it allows you to do. I’ll also mention a couple of problems or limitations. Zotero is still under development and will doubtless become more powerful (that is, more useful) in later releases. But the version now available has numerous valuable features that far outweigh any glitches.
Suppose you go online to gather material on some aspect of a book you are writing. In the course of a few hours, you might find several promising titles in the library catalog, a few more with Amazon, a dozen useful papers via JSTOR, and three blog entries by scholars who are thinking aloud about some matter tangential to your project.
How do you keep track of all this material? In the case of the JSTOR articles, you might download them to your laptop to read later. With material available only on Web pages, you can do a "screen capture" (provided you've learned the command for that) but might well end up printing them out, since otherwise it is impossible to highlight or annotate the text. As for the bibliographical citations, you can open a word-processing document and copy the references, one by one, or use note-taking software to do the same thing a little more efficiently.
In any case, you will end up with a number of kinds of digital files. They will be dispersed around your laptop in various places, organized as best you can. Gathering them is one thing; keeping track of them is another. And if you have a number of lines of research running at the same time (some of them distinct, some of them overlapping) then the problem may be compounded. Unless you have an excellent memory, or a very efficient note-taking regimen, it is easy to get swamped.
What Zotero does, in short, is solve most of these problems from the start -- that is, at the very moment you find a piece of material online and decide that it is worth keeping. You can organize material by subject, in whatever format. And it allows cross-referencing between the documents in ways that improve your ability to remember and use what you have unearthed.
For example, you can "grab" all the bibliographical data on a given monograph from the library catalog with a click, and save it in the same folder as any reviews of the book you've downloaded from JSTOR. If the author has a Web site with his recent conference papers, you can download them to the same project file just as easily.
This isn’t just bookmarking the page. You actually have the full text available and can read it offline. The ability to store and retrieve whole Web pages is especially valuable when no reliable archive of a site exists. I got a better sense of this from a conversation with Manan Ahmed, a fellow member of the group blog Cliopatria, who has been using Zotero while working on his dissertation at the University of Chicago. Articles he read from Indian newspapers online were sometimes up for only a short time, so he needed more than the URL to find them again. (He also mentions that Zotero can handle his bibliographical references better than other note-taking systems; it can store citations in Urdu or Arabic just as well as English.)
Furthermore, Zotero allows you to annotate any of the documents you hunt and gather. You can cross-reference texts from different formats -- linking a catalog citation to JSTOR articles, Web publications, and so on. If a specific passage you are reading stands out as important, you can mark it with the digital equivalent of a yellow highlighter. And you can add marginal annotations, just as with a printout -- except without any limitation of space.
When the time comes to incorporate any of this material into a manuscript, Zotero allows you to export the citations, notes, and so forth into a word-processing document.
Zotero is what is called a “plug-in” for Mozilla’s Firefox Web browser. You can use it only with Firefox; it doesn’t work with Netscape or Internet Explorer. People who know such things tell me that Firefox is preferable to any other browser. Be that as it may, the fact that Zotero functions only with Firefox means you need to have Firefox installed first. Fortunately it, too, is free. (All the necessary links will be given at the end of this column.)
While you are online, using Firefox to look at Web sites, there is a Zotero button in the lower right-hand corner of the browser. If something is worth adding to your files, you click the button to open the Zotero directory. This gives you the ability to download bibliographical information, Web pages, digital texts, etc., and to organize them into folders you create. (If a given document might be of use to you in two different projects, it is easy to file it in two separate folders with a couple of clicks.)
Likewise, you use the Zotero button in Firefox to get access to your material when offline. Then you can read things you glanced over quickly at the library, add notes, and so forth.
I won't try to explain the steps involved in using Zotero’s various features. Prose is hardly the best way to do so, and in any case the Zotero Web site offers “screencasts” (little digital movies, basically) showing how things work. The most striking thing about Zotero is how well the designers have combined simplicity, power, and efficiency -- none of them qualities to be taken for granted in a digital research tool. (Here I am thinking of a certain note-taking software that cost me $200, then required printing out the 300-page user’s manual explaining the 15 steps involved in doing every damned thing.)
There is some room for improvement, however. All of the material gathered with Zotero is stored on the hard drive of whatever computer you happen to be using at the time. If you work with both a laptop and a computer at home, you can end up with two different sets of files. And of course the document you really need at a given moment will always be on the other system, per Murphy's law.
The optimal situation would be something closer to an e-mail system. That is, users would be able to get access to their files from any computer that had Web access. Material would be stored online (that is, on a server somewhere) and be available to the user by logging in.
Aside from the increased convenience to the individual user, making Zotero a completely Web-based instrument would have other benefits. The most important -- the development likely to have a significant impact on scholarship itself -- would be its ability to enhance collaborative work. Using a Zotero account as a hub, a community of researchers could share references, create new databases, and so on. And the more specialized the field of research, I suppose, the more powerful the effect.
All of which is supposed to be possible with Zotero 2.0, which is on the way. The release date is unclear at this point, though improved features of the existing version are rolled out periodically.
But for now, the folders you create on your laptop are stored there -- and remain unavailable elsewhere, unless you make a point to transfer them to another computer. This brings up the other serious problem. There does not seem to be a ready way to back up your Zotero files en masse. In the best case, there would be a command allowing you to export all of the material in Zotero to, say, a zip drive. Otherwise you can end up with huge masses of data, representing however many hours of exploration and annotation, and no easy way to protect it.
Perhaps it is actually possible to do so and I just can’t figure it out. But then, neither can the full-fledged member of the digerati who initiated me into Zotero. And so we both use it with a mingled sense of appreciation (this sure makes research more efficient!) and dread (what if the system crashes?).
For now, though, appreciation is by far the stronger feeling. Zotero does for research what word-processing software did for writing. After a short while, you start to wonder how anyone ever did without it.
If you don't already have Firefox 2.0 on your computer's desktop, you will need to download it before installing Zotero itself. Both are available here. The site also offers a great deal of information for anyone getting started with Zotero. Especially helpful are the “screencast tutorials” -- the next best thing to having a live geek to ask for help.
"While clearly Zotero has a direct audience for citation management and research," according to another commentary, "the same infrastructure and techniques used by the system could become a general semantic Web or data framework for any other structured application." I am going to hope that is good news and not the sort of thing that leads to cyborgs traveling backward in time to destroy us all.
In recent years there has been a strong push to attempt to regulate science by increasing disclosure of financial conflicts of interest (FCOI). As well-intentioned as this regulatory approach might be, it is based on flawed assumptions, poses the risk of becoming a self-perpetuating end in itself, and is a distraction from the underlying serious problem.
It is hard to see how strengthened FCOI disclosures could have a significant effect, since we know from the very reports of Sen. Charles Grassley’s Finance Committee that there is next to no enforcement. If the dog is all but toothless now, will someone unscrupulous hesitate to game the system by simply not reporting? A clever operator will get a lawyer to guide such behavior. But this is hardly the extent of what we should be thinking about.
Conflict of interest rules are supposed to control corruption by recusing those with a financial stake. In such systems, corruption is the rational response, so the mythical “rational players” will be corrupt. In political culture corruption is a given, and conflict of interest rules have had some effect in the legislatures of the USA. But science is not politics.
Scientific culture presumes honesty, but the data say that scientific fraud is widespread and growing. A recent study boggles the mind: some 9,000 papers were flagged for possible plagiarism, and all 212 of the first 212 given a full examination proved to be probable plagiarism -- and merely to get into the running required substantial matches in the abstracts. It recently came to light that virtually an entire subfield in medicine was a fraud, though it is not clear whether it was harmful. I will not belabor this, but basic sense tells us that if the dumbest kind of fraud is so widespread, we doubtless have serious problems elsewhere.
There is little punishment in those rare cases when fraud is discovered. Looking at cases of scientific fraud, one finds that usually no charges are filed, and authors don’t necessarily even withdraw their papers. When they do withdraw them, there is little facility for recording that fact, and papers can remain available in NIH databases and others without so much as a warning flag.
A European pilot project has taken a stab at the problem, to mixed reviews. There is a Scifraud Web site as well, with similarly mixed results. At worst, research privileges may be taken away. In one of the few cases I am aware of where charges were filed -- the South Korean case of Hwang Woo-Suk -- the researcher was given a prestigious award despite standing trial for fraud and being unable to attend the ceremony for that reason. In other words, in science, a life of crime is easy -- at worst one gets a slap on the wrist. For those who commit frauds of various kinds, mostly one wins -- publications generate promotions, grants, etc.
Look at the situation objectively, and one must ask: Why bother doing real research if you can scout out what is probably true from some hard-working researchers with real data, then submit a paper that looks perfect "proving the hypothesis" with "all the latest techniques"? As we saw, simple plagiarism of the dumbest kind is probably endemic. There is software that can fake a gel, that can fake flow cytometry data, and one must assume that it is used. Call it “theft by perfection.” We have no data at all on such fraud, but anecdotal evidence forming semi-random samples of significant size certainly suggests it occurs in certain areas of bioscience. So if there is great upside in science fraud -- where’s the downside?
Perhaps one might be exposed, but even then, unless it’s really high profile, few people will know. Is the chance of being caught even as high as one in 5,000? Those thousands of fake papers say no, and instead suggest the chance of being caught may be less than one in 10,000.
Given all of this, the rational response would be to face the scientific fraud problem head on rather than enact window dressing regulations, and I have a few proposals for how to do that.
The first regulatory change we need is to throw out the statute of limitations, currently set at six years. Folks, under current National Institutes of Health rules the case of the midwife toad would not have been exposed! Isn't that ridiculous? Scientists are (or should be) some of the better record keepers on the planet. Yes, records aren’t perfect, nor are memories, but mostly we have them around somewhere, or at least enough of them. We should remember also that scientists have been selected for superior memories and analytic abilities. In the context of science, graduate students are usually the first to find out about fraud, because they see exactly what is going on. The median time from a graduate student's enrollment to the awarding of a Ph.D. is six years -- a period during which they are extremely vulnerable to retaliation.
The second regulatory change concerns intra-university investigations. Institutional collusion whitewashes intra-university investigations unless a professor or dean takes up the cause. Flatly, these intra-university procedures don't work for graduate students and post-docs, and those who use them tend to find themselves pariahs. In this way our biosciences system has been systematically eliminating some of the most ethical and capable researchers in training, who leave when subjected to retaliation. Keep your head down and don't rock the boat is the watchword in graduate school these days. I get the strong impression that those who went through grad school 30 years ago have little clue how bad it is. Ad hoc committees of arbitrarily chosen people -- who I believe are sometimes interfered with backstage by chancellors -- can exhibit phenomenally poor investigative skills when presented with claims. Those who serve on such committees are in a lose-lose position, and have no incentive to be there.
The only way they win is to curry favor with the administration.
Consequently, responsibility for academic misconduct complaints and whistleblower reports must be removed from the institutions in which they occur. I propose that such investigations be turned over to the Justice Department, with the Office of Research Integrity moved into Justice for special adjudication. Researchers should be held personally liable, they should be charged with criminal conduct, and efforts should be made to cut deals with them for fingering others in their loose circle. I strongly suspect that fraud appears in clusters linked by human networks in academia, as it does elsewhere. Scientific fraud should be a criminal matter at the federal level if federal funds are used. Papers containing fraudulent data should be removed from federally funded databases and replaced with an abstract of the case and a link to the case file.
Non-citizen students and post-docs are even more vulnerable to manipulation and extortion than citizens because of their dependence on their professor for a visa. This enables the unscrupulous to exert even more retaliatory power. I suspect the only cure is to grant a 10-year visa that acts like a time-limited green card. That way, at least non-citizens can vote with their feet and have some leeway to get away from an unscrupulous scientist.
But we can’t just improve what we do in response, although that is important. We also have to work hard to find our problems, so the third major area for improved regulation is to create scientific transparency. This will make it possible for other scientists to more easily detect fraudulent work. It should be required that data be made available within 12 months of collection to other researchers, on request. (There could be some variation depending on the kind of research.)
The researchers running the study could be given an embargo period of two years (or some other interval chosen by a reasonable formula) to publish based on their data, but there is no reason why other scientists shouldn't be able to see the data before publication during such an embargo. After publication, other researchers should be given access. It is transparency at the most fundamental level that is missing. Since court precedents have given researchers who receive government funds control over their data, because there was no other rule in place, the only way to improve data transparency is to mandate it. At the very least, base data should be released on demand after publication.
Dealing with these fundamentals will yield good results. Most researchers are innocent; they are guilty of little more than reluctance to get involved in the hard work of whistleblowing for no reward. Just tightening the straitjacket on researchers -- giving them more hoops to jump through and forcing them to recuse themselves from their own areas of expertise because of financial rewards they earned by hard work -- will not stop the unscrupulous from simply failing to report conflicts that nobody will find unless they are reported. It will, instead, punish the ethical and harm them financially by taking away just rewards, while having next to no impact on the unethical.
In its simplest restatement, science has a two-horned problem. On the one hand, there is an enforcement problem, because there is little chance of being caught in any 10-year period, and if one is caught the penalty is barely a slap on the wrist. This is exacerbated by setting statutes of limitations to coincide with the interval during which those most likely to find out are ensconced in a feudal-serfdom holdover. On the other hand, sometimes huge rewards should legitimately accrue to people who spend their lives working very hard. Protecting such rewards is the entire purpose of our patent system, which encourages innovation and the creation of new economic value.
In summary, we cannot fix the enforcement problem in scientific fraud by making it harder for the rewards to occur. None of the currently proposed rule changes will even raise the risk premium for fraud. We will, however, slow the pace of research by taking the best researchers off the problems they know best, because they will be forced to recuse themselves due to financial conflicts of interest.
Doing that, we will penalize researchers. We will also penalize top institutions by forcing them to step aside from involvement with furthering what has great economic value to the nation -- because where there is a conflict of interest, value has been created. We need to attack the real problem head-on if we want to get good results and keep science respectable and economically productive. The problem is simply scientific fraud.
Brian Hanley is an entrepreneur and analyst who recently completed a Ph.D. with honors at the University of California at Davis.
In his mock documentary Take the Money and Run (1969), Woody Allen plays the ambitious but remarkably unlucky bank robber Virgil Starkwell. He never makes the FBI’s Ten Most Wanted because, after all, it all depends on who you know. But he does manage to shave some time off one of his prison sentences by volunteering for medical research. He survives the experiment. There is one side effect, however, as the narrator explains in a solemn voiceover: He is temporarily transformed into a rabbi.
This sequence came to mind while reading The Professional Guinea Pig, by Roberto Abadie, just published by Duke University Press. “An estimated 90 percent of drugs licensed before the 1970s were first tested on prisoners,” writes Abadie. “Prisoners were in many ways a perfect population for a controlled experiment. Because they had similar living conditions they provided good control groups for clinical trials, while the financial and material benefits ensured a large supply of willing and compliant volunteers.”
Only in 1980 did the Food and Drug Administration ban the use of prisoners for medical research. Their circumstances made a mockery of informed consent. (Especially in Virgil’s case. “Prisoners received one hot meal per day,” the narrator explains: “a bowl of steam.”) But the demand for experimental subjects for biomedical research had to be met somehow. And so there has emerged the new regime of power and knowledge analyzed by Abadie, a visiting scholar with the health sciences doctoral program at the City University of New York Graduate Center.
His book is an ethnographic account of the subculture of “paid volunteers” recruited to serve as subjects for pharmaceutical testing -- with a particular focus on what he calls the “professionalized” guinea pigs who derive most (or all) of their income from this work. Volunteers receive “from $1200 for three or four days in less intensive trials,” according to Abadie, “to $5000 for three or four weeks in more extensive ones.”
Actually, the term “work” is somewhat problematic here. The labor is almost entirely passive. Half of it, as Woody Allen once said about life itself, is just showing up. You are weighed, your blood is drawn, and there might be a few other tests, along with quite a lot of boredom. (One of Abadie’s informants describes it as participation in “the mild torture economy.”) Some of the guinea pigs fall back on it as a supplement to “low-paying jobs as cooks, construction workers, house painters, or bike messengers.” For others, it is their sole source of income. They enlist for up to eight rounds of testing per year, earning “a total estimated income of $15,000 to $20,000 in exceptionally good years.”
Higher rates of pay are available to those willing to endure unpleasant procedures. Likewise, there is a premium for testing psychiatric drugs -- though the considered opinion of old-time guinea pigs is that you just don’t earn enough to make it worth letting someone mess with your brain chemistry.
Abadie’s description of the guinea-pig milieu -- based largely on interviews with a number of them living in a bohemian neighborhood in Philadelphia -- focuses on how they understand the risks involved in making a living this way, including their preferred means of recovering between rounds of exposure to “phase I” testing. (That is the term for clinical trials in which pharmaceuticals shown to have low toxicity when given to animals are tried on human subjects.) Various dietary regimens are thought to have a purifying effect. An informal network keeps participants updated on new opportunities in the human-subject market, and there used to be a zine called Guinea Pig Zero that still has a web presence.
Most of Abadie’s informants are also members of an anarchist counterculture that prides itself on remaining outside corporate capitalism. And making your living as a guinea pig is certainly different from joining the rat race. But the “mild torture economy” is well integrated into the larger and more literal economy. Testing is a necessary stage of pharmaceutical development, with some 80,000 phase I trials -- each involving 30 to 100 human subjects -- being run each year. The development of a pool of reliable but poorly paid “volunteers” (consisting mostly of young men who, as Abadie puts it, “use their bodies as ATMs to fund their lifestyles”) is one sign of the effect of deindustrialization on the labor market.
And the effect of becoming dependent on guinea-piggery as a source of income is that it creates an incentive to ignore the question of how exposure to experimental pharmaceuticals might affect you over the long run. “Beginners are more worried about risks than professionals,” notes Abadie. “Maybe this reflects the general population’s anxieties about biomedical research and its well-publicized abuses. Volunteers’ initial uneasiness focuses on the unknown effects of the drugs, but it also reflects a discomfort with a procedure they do not yet fully understand…. Some volunteers mentioned that they were somewhat concerned about developing cancer in the future.”
Not so, evidently, with those who had been through the process a few times: “Dependency on trial income, trial experiences that have not exposed them to side effects, and interactions with more experienced volunteers convinces newcomers that risks are not to be feared.” Just drink a couple of gallons of unsweetened cranberry juice and it’ll wash the corporate technoscience right out of your system….
Meanwhile, the FDA “inspects less than 1 percent of all clinical trials in this country,” writes Abadie, and paid volunteers lack the resources to challenge any abuses they may suffer.
Trials in phases II and III -- when a drug is tested on patients suffering from the condition it may help treat -- draw on a different pool of human subjects, with motivations beyond that of payment. But when the subjects are economically vulnerable, as with some of the poor AIDS patients discussed in later chapters of Abadie's study, it compounds the ethical problems facing an institutional review board trying to assess whether the research has scientific merit or is driven instead by business interests.
The IRB in this case oversees the work of a small, community-based organization, not a university (where many clinical trials are conducted), but Abadie suggests that its ambivalence is commonplace. Its members "recognize the benefits that can derive from a relationship with the industry, but at the same time they fear that prospective financial gains can influence the research. These anxieties are reflected particularly in their views of the informed-consent process ... in which volunteers are supposed to be able to evaluate risks and benefits independently of other considerations."
The major weakness of this otherwise intriguing and worrying book is that it provides no clear sense of how typical the “professionalized” guinea pigs in Philadelphia may be -- and how central such repeat-performing volunteers are to the industry employing them.
Abadie maintains that a cohort of full-time human subjects emerged after the pool of prisoners dried up 30 years ago. The needs of the pharmaceutical industry led to the formation of “a group of reliable, knowledgeable, and willing subjects who depend on participation in trials for income to support themselves.” Okay, but just how dependent is the industry on them? What portion of the population of human research subjects for pharmaceutical research consists of such full-timers?
Invocations of “the new subjectivity required by neoliberal governmentality” may have their place in defining the situation. But hard numbers would be good, too. The fact that we don’t have them is part of the problem. But then there aren’t too many dimensions of the health care industry that don’t look like problems, right now.
Reflecting on the recent The Humanities and Technology conference (THAT Camp) in San Francisco, what strikes me most is that digital humanities events consistently tip toward the logic-structured digital side of things and are less balanced by the humanities side. But what I mean by that has itself been a problem I've been mulling for some time now. What is the missing contribution from the humanities?
I think this digital dominance revolves around two problems.
The first is an old problem. The humanities’ pattern of professional anxiety goes back to the 1800s and stems from pressure to incorporate the methods of science into our disciplines or to develop our own, uniquely humanistic, methods of scholarship. The "digital humanities" rubs salt in these still open wounds by demonstrating what cool things can be done with literature, history, poetry, or philosophy if only we render humanities scholarship compliant with cold, computational logic. Discussions concern how to structure the humanities as data.
The showy and often very visual products built on such data, and the ease with which the information they contain is intuitively understood, appear at first blush to be a triumph of quantitative thinking. The pretty animated graphs and fluid screen forms belie the boring spreadsheets and databases that contain the details. Humanities scholars often recoil from the presumably shallow grasp of a subject that data visualization invites.
For many of us trained in the humanities, to contribute data to such a project feels a bit like chopping up a Picasso into a million pieces and feeding those pieces one by one into a machine that promises to put it all back together, cleaner and prettier than it looked before.
Which leads to the second problem, the difficulty of quantifying an aesthetic experience and — more often — the resistance to doing so. A unique feature of humanities scholarship is that its objects of study evoke an aesthetic response from the reader (or viewer). While a sunset might be beautiful, recognizing its beauty is not critical to studying it scientifically. Failing to appreciate the economy of language in a poem about a sunset, however, is to miss the point.
Literature is more than the sum of its words on a page, just as an artwork is more than the sum of the molecules it comprises. To itemize every word or molecule on a spreadsheet is simply to apply more anesthetizing structure than humanists can bear. And so it seems that the digital humanities is a paradox, trying to combine two incompatible sets of values.
Yet, humanities scholarship is already based on structure: language. "Code," the underlying set of languages that empowers all things digital, is just another language entering the profession. Since the application of digital tools to traditional humanities scholarship can yield fruitful results, perhaps what is often missing from the humanities is a clearer embrace of code.
In fact, "code" is a good example of how something that is more than the sum of its parts emerges from the atomic bits of text that logic demands must be lined up next to each other in just such-and-such a way. When well-structured code is combined with the right software (e.g., a browser, which itself is a product of code), we see William Blake’s illuminated prints, or hear Gertrude Stein reading a poem, or access a world-wide conversation on just what is the digital humanities. As the folks at WordPress say, code is poetry.
I remember 7th-grade homework assignments programming onscreen fireworks explosions in BASIC. At the time, I was willing to patiently decipher code only because of the promise of cool graphics on the other end. When I was older, I realized that I was willing to read patiently through Hegel and Kant because I had learned to see the fireworks in the code itself. To avid readers of literature, the characters of a story come alive, laying bare our own feelings or moral inclinations in the process.
Detecting patterns, interpreting symbolism, and analyzing logical inconsistencies in a text are all techniques used in humanities scholarship. Perhaps the digital humanities' greatest gift to the humanities is the ability to invest a generation of "users" in the techniques and the practiced, meticulous attention to detail required to become a scholar.
Trained in analytic philosophy, Phillip Barron is a digital history developer at the University of California at Davis.
Over the last 18 months, Microsoft has shifted its support for academic research to involve more universities and more kinds of studies. The shifts come at a time when Bill Gates, the company's founder, has become increasingly concerned about declining support for key research agencies and declining student interest in computer science.