Data Are Always the Smoking Gun

The LaCour scandal is the latest example of the inadequacy of research training and the peer-review process, writes Felicia LeClere.

June 25, 2015

Five years ago, I took a long walk in Ireland with my husband, and when we got back, there were reports of several research scandals in which academic reputations were ruined by what appeared to be data falsification or at least substantial sloppiness. I wrote about it -- claiming, as I often do, that enforced data sharing would at least ensure that researchers tidied up their documentation.

A few weeks ago, I took another long walk in Ireland with my husband, and this time the news was filled with Ireland’s public referendum legalizing same-sex marriage and another research scandal, this one involving research about the possibility that face-to-face voter canvassing by persons identified as gay can change opinions about the rights of the LGBT community. I guess I need to be more careful about my travel plans, at least when they involve Ireland.

This time I am less sanguine about the idea that simply enforcing data sharing can improve the research process enough that sloppiness and outright fraud will be well policed. The most recent story involves a young Ph.D. student, Michael J. LaCour, who made up facts about the research process -- such as who funded it, how incentives were paid out, whether the embedded experiments were registered with a centralized registry and perhaps even which survey firm conducted the study.

The scandal began, as they often do, when someone wanted to replicate the research and the researcher did not share all of the data. The student, in fact, despite the very sound advice of his senior co-author, had not deposited all of the data with my former employer, the Inter-University Consortium for Political and Social Research (ICPSR), so that the full data file could be both found and shared. Once thwarted and confused, of course, the research team that wanted to replicate his work started pulling at the loose thread and unraveled a whole skein of lies and exaggerations. I bet Michael LaCour now profoundly wishes he had paid closer heed to the advice of his mentor -- ICPSR, in fact, could have coaxed him into the truth simply through the act of scrutiny and documentation; instead, he chose a self-archiving method that allowed him to upload whatever he wished.

The story of LaCour should bolster my cherished premise that full data sharing will reduce the amount of malfeasance, right? Is it possible to still be naïve in your early 50s? I am afraid so. After another five years of being head down and hip deep in data collection and file preparation, I am willing to admit that either encouraging or forcing data sharing among researchers just is not enough. These scandals result from deeper problems with our training and review of the research process. The scandals almost always erupt when someone starts to question the data used to answer a substantive question -- and then the answer to the substantive question is viewed with suspicion. The inability to replicate, or even get close, opens the door to all types of scrutiny. Mishandling data or data collection is like Al Capone not paying his taxes -- it provides an entrée for our academic Eliot Ness to bring home the investigation.

My claims about the inadequacy of research training and the peer-review process will likely raise howls of protest -- what about all of the graduate-level methods courses, the Institutional Review Board (IRB) and the peer-review process required for grants and publications? Yes, all of these checks and balances, in principle, ensure ethical, high-quality research. But they do not, in fact, in any of the disciplines I am familiar with. Graduate-level methods classes in the social sciences -- and I have taught more than a few -- carry a heavy burden: an omnibus survey of data collection methods, research ethics and often a smattering of statistical methods. The section on research ethics usually focuses only on how to deal with human subjects, not on how to handle the data we collect from them. Even a two-course sequence will never get you much beyond what I always think of as the research equivalent of “happily ever after” data collection. No one tells you how to stay married to your Prince Charming or how to adequately and ethically prepare data files for sharing.

What of the IRB and peer review -- don’t they represent the bulwark against sloppiness and malfeasance? Not really -- neither has the explicit purpose of policing the research process generally. The purpose of the IRB is, in fact, the protection of human subjects -- that is, ensuring that all data collection is ethical. That may or may not ensure that the data collection is well documented, accurate and scrupulously transparent: the protection of human subjects requires looking carefully at informed consent, for instance, but not necessarily at data documentation.

Unfortunately, peer review is even more narrowly focused, except when a reviewer pulls hard at a methodological thread. Journal articles and grant applications never allow for the careful description of the methods and procedures because of substantial space constraints. In the past, co-authors and fellow review panel members have rightly scolded me for my overweening and tedious attention to the details of the research process. Peer review focuses primarily on substance and research quality because it must -- we are meant to trust that our colleagues are well trained, careful, transparent and accurate, without a lot of detail about how they execute these traits. I am not entirely sure that trust is warranted -- thus, peer review also fails to ensure that the research process is as it should be.

On our walk in Ireland, my husband and I climbed Croagh Patrick, the mountain on top of which St. Patrick spent 40 days fasting in 441 AD. It is a religious pilgrimage for many Irish Catholics -- for us, it was the challenge of going straight uphill on loose slate for two hours. Croagh Patrick is famous for its miserable weather, and our walk was no exception -- 50-mile-an-hour winds, driving rain and dense fog. As my husband is fond of saying, St. Patrick’s religious visions on the top of the mountain can likely be attributed to hypothermia and the fact that he could not find his way down.

As I crawled my way up the mountain of loose, wet stone, in addition to cursing my husband, who is descended from a long line of spirited Irish men and women, I thought about the value of careful and thorough preparation. My husband, ever the Eagle Scout, always ensures that we are thoroughly prepared and carefully equipped for every eventuality -- thus, I only got soaked to the skin in the last 20 minutes instead of the first two hours, and we made it both up and down the mountain despite being on the far side of 50. It strikes me that the research process is indeed like climbing Croagh Patrick -- preparation and careful attention to detail are an absolute must. The research community must find better ways to nurture and encourage these skills rather than spend time picking over the bones of those who have fallen off the trail.


Felicia B. LeClere is a senior fellow with NORC at the University of Chicago, where she works as research coordinator on multiple projects. She has 20 years of experience in survey design and practice, with particular interest in data dissemination and the support of scientific research through the development of scientific infrastructure.
