Psycollegey—Following orders


With the UN recently condemning North Korea, Louise Dolphin looks at the psychology behind mass human cruelty

The most harrowing place I have ever set foot in is the Khmer Rouge S-21 prison in Phnom Penh, Cambodia. Originally a high school, the complex was taken over by Pol Pot’s security forces and turned into a prison in 1975, during the Cambodian genocide. It quickly became the largest centre of detention and torture in the country.

Outside, it has the appearance of an ordinary high school, with plain concrete buildings, palm trees and a grassy courtyard. But this sheer ordinariness sits in horrible contrast with the rusted beds and array of torture instruments scattered around the former classrooms.

Corridors and rooms are filled with thousands of disturbing black and white photos of each prisoner who passed through S-21, before and sometimes after torture. The majority of these torture victims were trucked out to Choeung Ek, one of the notorious killing fields just outside the city.

It is impossible to visit a place like the S-21 prison without feeling overwhelming disgust and anger at the executioners, torturers and prison guards. How can a human being ever reach the point of systematically carrying out such inhumane acts on other individuals?

Were they just following orders? Are such people particularly sadistic, or is there more to it? Two of the most infamous experiments in psychology may shed some light on this.

Firstly, let’s go back to the summer of 1961, just after the widely publicised trial of the German Nazi war criminal Adolf Eichmann began in Jerusalem. In a psychology lab at Yale, Stanley Milgram devised an experiment to try to answer a popular question of the time: was Eichmann, as he claimed, a man with little authority in the Nazi hierarchy who was simply following orders? “Befehl ist Befehl”, meaning “an order is an order”, became known as the “Nuremberg defence”.

Milgram wondered whether the German people were particularly compliant. He reckoned that in the US, a relatively individualistic and non-conformist culture by the 1960s, people would not be so quick to follow an immoral command from authority. But he was quite wrong.

In his shocking and, arguably, deeply unethical experiment, Milgram recruited participants for what was billed as a study on learning and memory. At the Yale lab, these participants were seated in a room with the experimenter.

The learner (who, unbeknownst to the participant, was an actor) was seated in another room, linked by a sound system. Participants were required to administer what they believed to be electric shocks of increasing strength to the learner for incorrect responses on the learning task.

At 300 volts, the learner/actor began to cry out in pain. Yet Milgram found that some 65% of participants went on to administer the experiment’s final, massive 450-volt shock (labelled XXX) for an incorrect answer. The only instruction given to participants by the experimenter was “the experiment requires that you continue”. The result was startling: a panel of experts had estimated beforehand that only 1% of individuals would comply with such a command.

It should be noted that at some point every participant paused and questioned the experiment. Recently released video footage shows that many participants were extremely uneasy and distressed while administering the stronger shocks. Yet they followed orders.

Many people argued that the experiment took place in a very unnatural setting (a lab) and that the authority figure was not a senior army officer, just a man in a white coat armed only with a clipboard. They felt that people would be less willing to carry out immoral acts in a more real-world situation.

But let’s fast-forward ten years to the summer of 1971. In the basement of the Stanford psychology building, Philip Zimbardo (Milgram’s former high school friend) created a mock prison. He randomly assigned 24 male students, screened as psychologically normal on a range of psychological tests, to the role of prisoner or guard.

The experiment, which Zimbardo had planned to run for two weeks, was cut short after six days following pressure from Zimbardo’s then-girlfriend: participants were suffering emotional trauma. The guards had become increasingly aggressive and brutal, while the prisoners grew depressed and fatalistic.

Elements of the Stanford Prison Experiment have been compared to the photographs of the sexually brutal behaviour of some members of the US military at Abu Ghraib prison in Iraq, which came to light in 2004.

Dave Eshelman, a guard in the Stanford Prison Experiment, says, “When I first saw the pictures, immediately a sense of familiarity struck me because I knew that I had been there before, I had been in this type of situation… I certainly subjected them to all kinds of humiliations. I don’t know where I would have stopped myself.”

When the sickening pictures from Abu Ghraib were released, the US military became defensive, claiming the abuse was the work of a few bad apples, but Zimbardo disagrees. He says, “There are a set of social psychological variables that can make ordinary people do things they never could imagine doing.” He argues that it was not a few individuals, but a system, that permitted the abuse.

This is a rather terrifying thought: that under certain circumstances, any of us could act so inhumanely and cruelly. I still like to believe that the vast majority of us wouldn’t. But Zimbardo believes that “people can become evil when they are enmeshed in situations where the cognitive controls that usually guide their behaviour in socially desirable and acceptable ways are blocked, suspended or distorted.”

The findings of Milgram’s and Zimbardo’s experiments have played on my mind since the release of the recent UN report on North Korea. It details allegations of murder, torture, rape, abduction, enslavement and starvation, and describes North Korea as a dictatorship “that does not have any parallel in the contemporary world.”

Under no circumstances do these studies excuse such atrocities, but they may shed some light on the nature of, and the factors that contribute to, human cruelty. Aleksandr Solzhenitsyn, author of The Gulag Archipelago, wrote, “If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being.”
