Essay: Academic leaders shouldn't be sure they wouldn't have acted like their counterparts at Penn State
Here is the lesson people want to learn from the Penn State scandal: There are some smarmy folks out there who, through a combination of mindless groupthink and fear of antagonizing important people, will do unimaginable things, like not reporting child abusers to the police. Perhaps there are other "Penn States" out there, or possibly there are even people at our own institution hiding seriously dirty linen about which we know nothing. The one thing we know for sure is that we never would act the way those people did.
That’s the wrong lesson. Here’s why.
In the 1960s, the late Stanley Milgram did a series of studies while a faculty member at Yale University. Although the initial studies are old, they have been replicated many times since, across time and place. Milgram would have two study participants enter a room. One would be assigned, seemingly at random, to the role of learner and the other to the role of teacher. Unbeknownst to the teacher, who was a naïve subject, the role assignments were rigged and the learner was a confederate of the experimenter's.
The teacher and learner were informed that they would participate in an experiment on the effects of punishment on learning. On successive trials, the teacher would read to the learner a list of words to be learned and the learner would repeat back the words he remembered. When the learner made a mistake, the teacher would use an apparatus that would deliver an electric shock to the learner.
The apparatus was designed so that each successive shock would be stronger than the last one. Shocks on the device were arranged in increments of 15 volts, ranging from 15 volts up to 450 volts. The switches were labeled along the scale with designations such as "slight shock," "moderate shock," "extreme shock," "danger: severe shock," and, at the top of the scale, "XXX." The teacher was given a sample 45-volt shock to show him that the apparatus really did deliver shocks and that they were painful.
Once the experiment started, the learner began to make mistakes. So the teacher shocked him. (In the initial experiments, participants were male, but later experiments involved female participants as well.) After a while, the teacher heard the learner groan, later scream, still later complain about his heart, yet later demand that the experiment stop, and finally fall silent. One might expect the teacher to stop delivering shocks once the learner began to protest, but whenever the teacher indicated he wanted to stop, the experimenter replied with a graded sequence of prods: "Please continue" … "Go on" … "The experiment requires that you continue" … "It is absolutely essential that you continue" … "You have no choice."
As you may know, the experiment was not really on the effects of punishment on learning but rather on obedience. Psychiatrists asked to predict what percentage of subjects would administer the maximum level of shock put the figure at less than 1 percent. In fact, it was roughly two-thirds.
When I have taught introductory psychology, I have asked my 150 or so students how many of them would have gone to the end, and typically only one or two jokers say they would have. The rest of the students strenuously deny they would have administered the maximum shock. Yet roughly two-thirds of them would go to the end of the shocks, even though they cannot imagine doing so. They do not yet realize the harm of which they are capable. We all are susceptible to believing that only other people act in ways that are heartless, cruel, or indifferent, and that, when we do act that way ourselves, we may even rationalize such actions as humane.
Fortunately for the learner in the Milgram experiments, the shock machine was a phony and, as mentioned earlier, the learner was a confederate and a trained actor. The experiments as originally conducted never would pass muster with today’s ethical requirements because subjects could not be adequately debriefed. No matter what the debriefing said, roughly two-thirds of the subjects in a typical run of the study left the experiment knowing that they might have killed the learner had the shocks been real.
The usual interpretation of the Milgram experiment has been that people are remarkably obedient and that it is because of this typically unrealized potential for obedience that horrors like the Nazi or Rwandan genocide or the brutal reprisals in Syria could take place. In the July 2012 issue of Psychological Science, Stephen D. Reicher of the University of St. Andrews and his colleagues have suggested that “agents of tyranny actively identify with their leaders and are motivated to display creative followership in working toward goals that they believe those leaders wish to see fulfilled.” In other words, people don’t just passively obey; they behave proactively to curry favor with their admired leaders or role models. Sound familiar?
In a related demonstration, Philip Zimbardo, formerly a professor of psychology at Stanford, randomly assigned college students to one of two groups: prison guard or prisoner. He placed them in the basement of the Stanford Psychology Department and then observed how they acted. To his dismay and the dismay of anyone who has since learned of the study, the guards rather quickly started acting like sadistic prison guards and the prisoners started acting in ways betraying learned helplessness — they were essentially browbeaten into submission.
In yet another study, published in 1973 in the Journal of Personality and Social Psychology, John Darley and C. Daniel Batson found that even most divinity students on their way to give a lecture on the Good Samaritan failed to help a person in obvious distress if their other priorities, such as arriving on time for the lecture, were more important to them at the moment. The study showed that intense ethical training provides relatively little protection against bad behavior in an ethically challenging situation. Since that study was published, episodes of horrendous abuse of children at the hands of clergy, while other clergy in the know stood idly by, have reinforced this lesson in gory detail. Really, no training offers ironclad protection.
If there is one thing that social psychologists have learned over the past decades, it is the enormous but often hidden power of situational pressures. The lesson of the Penn State tragedy is not that there are heartless bureaucrats out there who are willing to sacrifice the well-being of children for the sake of the reputation of the university and its athletic teams. Almost certainly there are. However, the real lesson of the Penn State tragedy is that, given certain situational constraints, virtually all of us could behave the way those administrators allegedly did. These circumstances include severe pressures to conform accompanied by fear of punishment for noncompliance, desire to please or curry favor with one or more persons in a position of power, rationalization of one’s actions, and what I have called "ethical drift" — one’s declining ethical standards in the face of group norms whereby one is not even aware that one’s standards are dropping.
To be clear: The power of situational variables in no way excuses bad behavior. Rather, such variables should help us understand, in part, why such behavior occurs in certain situations, why we are all potentially susceptible to it, and most importantly, what we can do about it.
How do you avoid falling into the trap of ethical drift? First, you need to realize that almost anyone, including yourself, is capable of behaving abysmally under certain circumstances. Second, you need rather regularly to ask yourself whether situational pressures are leading you to behave in ways that once would have seemed totally inappropriate and wrong to you. Third, you need to ask yourself whether you are rationalizing behavior that once would have seemed unacceptable to you. And fourth, you need to be willing to take a stand and do the right thing, realizing that although there may be serious short-term costs to acting ethically, you are willing to accept those costs so you can live with yourself and others over the long term.
One last thing: You may still be thinking that although other people may fall prey to ethical drift — or even a sudden drop off the ethical cliff — you would never succumb to situational pressure to conform. For example, you may just feel you know you would not have gone to the top of the shock apparatus or have let a child abuser continue to abuse children, regardless of the situational pressures placed on you. You may be right, but research has not found any personality characteristics that reliably predict who will succumb to such extreme pressures and who won’t.
Put another way, we all have to be in the situation to know what we would do. So you may wish to reserve judgment for now. When, sooner or later, you are in an ethically challenging situation, as the Penn State administrators were, you then will have an opportunity to learn something about yourself. If you resist succumbing to the temptation just to go along, you then will be able to feel pride in yourself, as would we all. As for me, I find what happened at Penn State absolutely abhorrent and cannot believe that I would have acted in the way those administrators appear to have, but I know I cannot be absolutely sure of what I would do unless I found myself actually in such a situation under comparable pressures.
When crowds of fans shouted, “We are Penn State!” they did not realize just how right they were. Potentially, at least, we all are Penn State, both in its best aspects and its worst.
Robert J. Sternberg is provost, senior vice president, Regents Professor of Psychology and Education and George Kaiser Family Foundation Chair in Leadership Ethics at Oklahoma State University. He also is president of the Federation of Associations in Behavioral and Brain Sciences, and past president of the American Psychological Association. The opinions expressed in this article, however, are entirely his own.