The most recent case of scientific fraud, by Dutch social psychologist Diederik Stapel, recalls the 2010 case at Harvard University against Marc Hauser, a well-respected researcher in human and animal cognition. In both cases, the focus was on access to, and irregularities in the handling of, data. Stapel retained full control of the raw data, never allowing his students or colleagues access to his data files. In Hauser's case, the scientific misconduct investigation found missing data files and unsupported scientific inference at the center of the accusations against him. Outright data fraud by Stapel and sloppy data management and inappropriate data use by Hauser underscore the critical role data transparency plays in preventing scientific misconduct.
Recent developments at the National Science Foundation (and earlier this decade at the National Institutes of Health) suggest a solution — data-sharing requirements for all grant-funded projects, adopted by funding agencies and scientific journals alike. Such a requirement could prevent this type of fraud by quickly opening up research data to scrutiny by a wider community of scientists.
Stapel’s case is an extreme example, and one more likely in disciplines with substantially limited imperatives for data sharing and secondary data use. The research traditions of psychology suggest that collecting your own data is the only sound scientific practice. This tradition, less widely shared in other social sciences, encourages researchers to protect data from outsiders. The potential for abuse is clear.
According to published reports about Hauser, there were three instances in which the original data used in published articles could not be found. While Hauser repeated two of those experiments and produced data that supported his papers, his poor handling of data cast a significant shadow of uncertainty and suspicion over his work.
Hauser’s behavior is rare, but not unheard of. In 2008, the latest year for which data are available, the Office of Research Integrity at the U.S. Department of Health and Human Services reported 17 closed institutional cases that included data falsification or fabrication. These cases involved federally funded research and centered on the manipulation or misrepresentation of research data, rather than on other violations of scientific ethics or institutional oversight.
In both Hauser's and Stapel's cases, graduate students were the first to alert authorities to irregularities. Rather than relying on other members of a researcher’s lab to come forward (an action that requires a great deal of personal and professional courage), the new data-sharing requirements at NSF and NIH have the potential to introduce long-term cultural changes in the conduct of science that may reduce the likelihood of misconduct based on data fabrication or falsification. The requirements were given teeth at NSF by the inclusion of new data management plans in the scored portion of the grant application.
NIH has since 2003 required all projects requesting more than $500,000 per year to include a data-sharing plan, and the NSF announced in January 2011 that it would require all grant requests to include data management plans. The NSF has an opportunity to reshape scientists' behavior by ensuring that the data-management plans are part of the peer review process and are evaluated for scientific merit. Peer review is essential for data-management plans for two reasons. First and foremost, it creates an incentive for scientists to actually share data. The NIH initiatives have offered the carrot for data sharing — the NSF provides the stick. The second reason is that the plans will reflect the traditions, rules, and constraints of the relevant scientific fields.
Past attempts to force scientists to share data have met with substantial resistance because the legislation did not acknowledge the substantial differences in the structure, use, and nature of data across the social, behavioral, and natural sciences, or the costs of preparing data. Data-sharing legislation has often been code for "We don’t like your results," or political cover on highly controversial issues such as global warming or the health effects of secondhand smoke. The peer review process, on the other hand, imposes consistent standards for data sharing, which are now largely absent, and allows scientists to build and judge those standards. "Witch hunts" disguised as data sharing would disappear.
The intent of the data sharing initiatives at the NIH and currently at NSF has very little to do with controlling or policing scientific misconduct. These initiatives are meant to both advance science more rapidly and to make the funding of science more efficient. Nevertheless, there is a very real side benefit of explicit data sharing requirements: reducing the incidence of true fraud and the likelihood that data errors would be misinterpreted as fraud.
The requirement to make one’s data available in a timely and accessible manner will change incentives and behavior. First, of course, if the data sets are made available in a timely manner to researchers outside the immediate research team, other scientists can begin to scrutinize and replicate findings immediately. A community of scientists is the best police force one can possibly imagine. Secondly, those who contemplate fraud will be faced with the prospect of having to create and share fraudulent data as well as fraudulent findings.
As scientists, it is often easier for us to imagine where we want to go than how to get there. Proponents of data sharing are often viewed as naïve scientific idealists, yet data sharing seems an efficient and elegant solution to the many ongoing struggles to maintain the scientific infrastructure and the public’s trust in federally funded research. Every case of scientific fraud, particularly on controversial issues such as the biological source of morality (part of Hauser’s research) or the sources of racial prejudice (in the case of Stapel), allows those suspicious of science and of governments’ commitment to funding science to build a case in the public arena. Advances in technology have given the scientific community the opportunity to share data in a broad and scientifically valid manner, and in a way that would effectively counter those critics.
NIH and NSF have led the way toward more open access to scientific data. It is now imperative that other grant funding agencies and scientific journals redouble their own efforts to force data, the raw materials of science, into the light of day well before problems arise.
Felicia B. LeClere is a principal research scientist in the Public Health Department of NORC at the University of Chicago, where she works as research coordinator on multiple projects, including the National Immunization Survey and the National Children's Study.
Although I am an aspiring scholar of 17th-century devotional poetry, I’ve had a surprisingly large number of drinking buddies who are physicists. Over beers I’ve learned about the Higgs boson, the intricacies of the Large Hadron Collider and the standard model of particle physics. In turn, I hope that maybe they’ve learned a little about Milton’s “Things unattempted yet in Prose or Rhime.”
Such friendships are part and parcel of doing the entirety of my graduate education at institutions best known for their contributions to STEM fields. I’ve never been able to ignore the sciences, and I wouldn’t think it a luxury even if I could. In talking with scientists, at least on a bar stool, I have seen little of what C. P. Snow famously asserted in his 1959 The Two Cultures and the Scientific Revolution: that scientists and humanists are members of “two polar groups” and that between those two lies “a gulf of mutual incomprehension.”
There is a cottage industry explaining why those studying the sciences need knowledge of the humanities, and I am sympathetic and largely agree with those views. But as a humanist writing to colleagues, I think that we should admit that Snow may still have a point. Too often we approach the sciences with a mixture of fear, envy and misunderstanding.
First, however, some things that I am not arguing: I do not think that science or scientists are beyond humanistic critique. Science, like any system created by humans, is going to be influenced by the wider culture, and as culture is our subject, we’re perfectly equipped to comment on those aspects of STEM that abut history, philosophy, literature, area studies and so on. The subspecialty of science studies has made important contributions to a considered understanding of how science operates within society, and one need not be a relativist to admit that ideology influences scientific discourses.
Second, if mutual suspicion has grown between the two cultures, the fault does not lie solely with us. Many advocates for a particularly positivist view of science (here I am thinking primarily of New Atheists like Richard Dawkins) are not just dismissive of whole shelves of humanistic scholarship, but they’re also downright anti-intellectual about entire disciplines as well. They deserve to be called out.
Finally, I am not claiming that humanistic work can be reduced into the scientific. Interdisciplinary respect need not entail the loss of disciplinary sovereignty, and I am not supporting a type of epistemological imperialism.
Despite those caveats, Snow’s assertion that the humanities have a bit of a science problem remains pertinent a half century later. As humanists, it behooves us to interrogate our own assumptions about the sciences and the occasionally unthinking ways we may project displaced anger onto scientists that are counterproductive to both them and us. Without an honest consideration of how we sometimes speak about science, we risk alienating potential allies in fighting for shared interests -- such as academic freedom, job security and funding in the era of the increasingly corporate university. Furthermore, some of our personal griping about the sciences subconsciously displays an anxiety that is, ironically, profoundly anti-humanist.
In that seminal essay, Snow writes, “A good many times, I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists.” While perhaps it’s unwise to universalize that phenomenon, it’s undeniable that Snow identifies an all too common sentiment. There exists a reflexive cringe toward the sciences that is condescending to our colleagues and ultimately not helpful to those of us in the humanities. It includes the unstated assumption that issues of institutional support are always easy for our colleagues in their labs across the campus, as well as the internet flame wars I’ve seen on academic threads griping about science popularization.
Stereotypes Worth Questioning
If anything, the rhetorical problem of the “two cultures” has as of late been exacerbated by the cheap outrage enabled by online culture and the easy discourse of social media, where an attitude of casual disdain toward science and scientists can proliferate. At its worst, I’ve seen the legitimate humanistic analysis of problematic pronouncements made by some scientists veer into an insinuation that said scientists are so unsophisticated that they’re somehow not even legitimately equipped to comment on their own specialties. Or I’ve seen the weird gleefulness of humanists who brag about knowing nothing about science, as if we wouldn’t justifiably denounce the equivalent from our STEM colleagues across the hall as being rank anti-intellectualism.
I’m not innocent in this. From time to time I’ve unfairly stereotyped our colleagues in the sciences as unconcerned or not knowledgeable about history, philosophy and literature. I’ve shared links to online articles and blogs that extol the virtues of humanistic training and research at the expense of the sciences. Oftentimes our disdain is born of unpleasant personal interactions with colleagues in STEM fields who are unwilling to acknowledge the difficulties or worth of our own work in the humanities and social sciences. I think the critical attitudes we harbor toward the sciences are almost always related to our legitimate grievances over how neoliberal policies threaten the humanities in higher education.
Being defensive is emotionally understandable, because it can feel that, as humanists, we’re under attack from all sides. When Florida Governor Rick Scott asks, “Is it a vital interest of the state to have more anthropologists?” and then responds to his own question with, “I don’t think so,” it’s perhaps to be expected. But when President Obama jokes about the utility of studying art history, it only reinforces our sense of being beleaguered. Such political attacks often use the humanities as a foil for what is presented as the supremely pragmatic choice of supporting the STEM fields. It’s natural that we’d get a bit touchy. So I get it, I really do. But that doesn’t mean that a defensive posture is always the most helpful.
At the risk of engaging in the fallacy of anecdotal evidence, I’ve heard things that highlight what I see as the dangers of painting STEM with the same broad brush with which we’re tarred. I myself shared a popular meme on Facebook a few months ago that claimed the bizarre pronouncements of former presidential candidate Ben Carson are what happen when STEM students don’t take humanities courses. I’m sure you can blame Carson’s strange comments on many things, but his medical training seems unlikely to be one of them. I’ve never seen his undergraduate transcripts -- perhaps he took scores of English and history classes -- but I imagine that those who created the meme really have no idea, either. It was an admittedly funny image, and in advocating for the humanities I imagine it was made with good intentions.
But I think it does little to convince potential academic allies in the STEM fields of much more than our own intellectual smugness. An even more insidious variety of meme I’ve encountered is the kind that argues that the benefit of a humanistic education is that it makes scientists somehow more moral in their research.
The implication that scientists are incapable of parsing the ethics of their own work because they haven’t taken a philosophy course is the height of condescension. From advocating for rational climate policy to explaining what’s dangerous about pseudoscientific anti-vaccination rhetoric, scientists are more than capable advocates for ethical policies that intersect with their own research. The old chestnut that studying the humanities somehow makes a person more moral is tenuous at best. The course catalog justification for the humanities as supplying special skills in “critical thought” is also a shortsighted one, for it presupposes that critical thinking is our sole province (which it obviously isn’t) and that critical thinking is all that we offer (which it also isn’t).
Not Victims or Martyrs
Of course I think it would be fantastic if more scientists did take philosophy courses. It would be fantastic if every college student did that -- not only because such courses help students in their primary training but because they are an unalloyed good in themselves. That’s the ultimate irony in this sort of defense of the humanities: it uses the same economic language of utility that other people use to justify increased funding for STEM.
Yet if we position ourselves too much as victims or martyrs, we ignore the oftentimes similar (or even more dire) political position that scientists find themselves in. We commit a fallacy when we confuse political lip service for STEM with actual support. One need only look at the precarious situation climatologists find themselves in, under attack by ideologically motivated partisans every bit as organized as those who fulminated against the academic humanities during the culture wars. And while I harbor my own resentments that the wider public may view my interest in 17th-century Puritan theology as hopelessly esoteric, 30 minutes of speaking with a mathematician who works on topology and number theory disabused me of any notion that the grass is greener on the other side when it comes to the public embrace of what one might study.
In defending ourselves, in explaining why anthropology or art history is important, we should not engage in the corollary of denouncing the sciences as unimportant. Too often I see the deployment of the same language used against us, or the ironic gambit of self-justification that involves tethering the humanities to the sciences so that the former is enlisted as some kind of handmaiden to the latter.
I had a conversation at a conference with a fellow humanist who thought that what I consider the self-evidently fascinating field of astrophysics is simply a financial drain on society, as if it’s somehow clear that the study of poetry is obviously important to everyone. In the academy, both fields of study need to be justified, both need to be explained and both need to be defended. That can be done at the institutional level (why not sponsor events between academic societies like the Modern Language Association and the American Association for the Advancement of Science?), as well as in our own professional lives. The recent catastrophic election to the presidency of the United States of Donald Trump, a man with equal disdain for both the humanistic tradition and scientific evidence, is reason enough for building a spirit of solidarity between academic disciplines.
It’s worth considering biologist Stephen Jay Gould’s concept of “non-overlapping magisteria,” which was originally meant to delineate the different domains of religion and science, as a useful template for thinking about the relationship between science and the humanities. Factionalism, jingoism and arrogance are no more attractive when they’re gussied up in humanistic language. Incuriosity is an intellectual sin, wherever its origins. This need not be a zero-sum game, as we’re all playing for the same team.
Ed Simon is a Ph.D. candidate in English at Lehigh University. He is also a widely published writer on the subjects of religion, literature and culture. His work has appeared in publications such as The Atlantic, Aeon, The Paris Review Daily, Salon, Atlas Obscura, The Revealer, Nautilus and many others. He can be followed at his website or on Twitter @WithEdSimon.
America’s universities are home, more than any place else in our country, to the enterprise of science. That has been an important and proud role for our great universities, and it has produced wonderful discoveries. Besides providing technical progress, science gives our society its headlights, warning us of oncoming hazards. As the pace of change accelerates, we need those headlights brighter than ever. So when a threat looms over the enterprise of science, the universities that are its home need to help address the threat.
The threat is simple. The fossil fuel industry has adopted and powered up infrastructure and methods originally built by the tobacco industry and others to attack and deny science. That effort has coalesced into a large, adaptive and well-camouflaged apparatus that aspires to mimic and rival legitimate science. The science that universities support now has an unprecedented and unprincipled new adversary.
The science-denial machinery is an industrial-strength adversary, and it has big advantages over real science. First, it does not need to win its disputes with real science; it just needs to create a public illusion of a dispute. Then industry’s political forces can be put into play to stop any efforts to address whatever problem science had disclosed, since now it is “disputed science.” Hence the infamous phrase from the tobacco-era science denial operation -- “Doubt is our product.”
Second, the science-denial operatives don’t waste much time in peer-reviewed forums. They head straight to Fox News and talk radio, to committee hearings and editorial pages. Their work is, at its heart, PR dressed up as science, not actual science. So they go directly to their audience -- and the more uninformed the audience, the better.
Our universities and other organizations engaged in the enterprise of science struggle for funding. Not so for the science-denial forces. You may think maintaining this complex science-denial apparatus sounds like a lot of effort. So consider the stakes for the fossil fuel industry. The International Monetary Fund -- made up of smart people, with no apparent conflict of interest -- has calculated the subsidy fossil fuels receive in the United States to be $700 billion annually. That subsidy is mostly what economists call “externalities” -- costs the public has to bear from the product’s harm that should be, under market theory, in the price of the product. These $700-billion-per-year stakes mean that the funding available to the science-denial enterprise is virtually unlimited.
And it’s your adversary. Those of you who either are scientists, or value and want to defend scientists, should beware. You have a powerful, invasive new alien in your ecosystem: it is a rival assuredly, a mimic at best, and an outright predator at worst. Make no mistake: in every dispute that this denial machinery manufactures with real science, it is determined to see real science fail. That is its purpose.
Given the connections between the fossil fuel industry and the new administration, we can’t count on government any longer to resist this predator. Regrettably, that science denial machinery is now probably hardwired into the incoming administration, as shown by the appointment of the fossil-fuel-funded climate denier Myron Ebell to lead the transition team for the Environmental Protection Agency. This considerably increases the denial machinery’s threat to the enterprise of legitimate science. The hand of industry now works not just behind the science-denial front groups but in the halls of political power.
That makes it all the more important for entities outside government -- notably universities as well as other scientific organizations -- to join together and step up a common defense. It is neither fair nor strategically sensible for universities and scientific associations to expect individual scientists to defend our nation against the science-denial apparatus. Individual scientists are ordinarily not trained in the dark arts of calculated misinformation. Individual scientists are ordinarily not equipped to deal with attacks and harassment on multiple fronts. Individual scientists don’t often have squadrons of spin doctors and public relations experts at their disposal. And they have no institutions devoted to ferreting out the falsehoods or conflicts of interest behind their antagonists.
Individual scientists are trained in the pursuit of truth through the tested methods of science. The science-denial machinery has truth as its enemy, and propaganda and obfuscation -- even outright falsity -- as its method. So the enterprise of science generally, and universities specifically, will need a common strategy to resist this potent and encroaching adversary.
In the Senate, I see the work of this apparatus, and its associated political operation, every day. Do not underestimate its power and ambition. Again, make no mistake: in every dispute that this denial machinery manufactures with real science, it is determined to see real science fail.
Sheldon Whitehouse is a United States senator, a Democrat, representing Rhode Island.