The Lure of Obedience
As economic crisis paralyzes Western economies, an ideology of personal responsibility has come to the fore. Conservatives seize on flaws in self-control to avoid questioning the economic system itself. The Left, too, blames "greedy banks" in preference to seeking systemic explanations.
But how much do we really control our own behaviour? A remarkable experiment, first conducted in 1961, has much to say to this question, showing that we are less in control of ourselves than such ideologies would have us believe.
The Milgram experiment, which demonstrated the willingness of civilised people to inflict lethal electric shocks on innocent strangers, has been described as "one of the most controversial experiments in history". But a half century has elapsed since it was first carried out. Its conceiver, the social psychologist and Yale University professor Stanley Milgram, wrote a book about its implications before the end of the Vietnam War.
Its findings must be, you might think, irredeemably stamped by the unique period of history in which it took place. Actually, far from being a museum piece, the Milgram experiment is more relevant than ever.
Misconception of human action
The experiment, which has been repeated many times with very similar results, reveals a fundamental misconception about why people behave as they do, a “seriously distorted view of the determinants of human action”, in Milgram’s words. This delusion has, in the early years of the 21st century, only become more acute.
The idea that we alone control our behaviour and that our personal wants and desires shape the world around us, is a fundamental axiom of everyday ideology. Though the clamour for ethical responsibility is intense, it is also pointless. As the Milgram experiment demonstrates, ordinary people can perform horrific acts precisely because they don’t feel responsible, and don’t control what they do.
Society, says Milgram, promotes the ideology that a person’s actions flow from their character. Bad outcomes are the result of bad or misguided people. Desperate to explain the economic crisis paralysing economies in the West, without questioning the economic system, conservatives seize upon human flaws. “Consenting adults” who went on an unsustainable debt binge are seeking to blame others, says the British Conservative defence minister, Philip Hammond. Another conservative believes that holding capitalism responsible for economic collapse evades the “underlying [moral] weakness of our time”. “How do we put limits on ourselves?” he asks.
The poster boy of contemporary conservatism, Niall Ferguson, thinks that voters in Britain and America have struggled to find appropriate scapegoats for the economic problems besetting them. “We bay for tougher regulation, though not of ourselves,” he says.
But conservatives are not alone in blaming out-of-control behaviour. The left of centre also plays the blame game. “Greedy banks” overreached themselves says this narrative, and the values of a “responsible capitalism” need to assert themselves. The political spectrum unites in castigating the culture of banking. It is even thought that psychopaths led financial institutions astray.
The implications of the Milgram experiment render all these approaches fanciful. The overriding importance of institutions, rather than moral codes or their absence, in moulding human behaviour is starkly, indeed painfully, demonstrated.
In 1961, Yale University professor Stanley Milgram and his team of researchers conducted an experiment whose controversy still resounds today. Volunteers were recruited to take part in an experiment on learning. But this was just a cover story. A real experiment was taking place, but it was not about learning.
Milgram and his team recruited volunteers to sit in front of a machine that ostensibly generated electric shocks to an unseen person strapped into an electric chair in an adjacent room. Its 30 switches, ranging from 15 to 450 volts in 15-volt increments, were labeled "slight shock", then "moderate shock", all the way up to "extreme intensity shock", with the last two marked simply "XXX". The machine did not actually deliver any electric shocks at all, but the volunteers believed that it did. The person in the chair, supposedly receiving the shocks, was an actor, feigning pain, which the volunteers could hear.
An “experimenter” in a lab coat would ask the actor in the chair questions, and each time he got an answer wrong, the volunteer was told to shock him. The shocks were to increase in intensity each time. It was prearranged that the learner-actor would get most answers wrong.
The volunteer could hear the learner-actor's cries progress from grunts and groans to exclamations like "I refuse to go on" and "I can't stand the pain", and then to agonized screams.

If the volunteer protested, the experimenter would tell him to go on. After a certain point, an ominous silence reigned.
When told about the experiment, the vast majority of people, including psychiatrists, expected people to disobey and refuse to go on. “These subjects see their reactions flowing from empathy, compassion and a sense of justice,” says Milgram, in his book about the experiment, Obedience to Authority. Even people who watched the experiment through one-way mirrors expected disobedience. Only a pathological fringe of sadists would go on shocking, it was thought.
But most of the volunteers – around two-thirds of them – obeyed. They went on giving electric shocks up to the highest level, even when they feared the victim was dead. When the experiment was replicated in different locations and countries, obedience was even higher. In Munich, for example, 85% obeyed to the point of inflicting the final shock. Empathy and compassion did not win out.
Sadists beneath the surface?
The Milgram experiment was first carried out more than 50 years ago, when John F Kennedy was U.S. President. Its discoveries about human behaviour have been repeatedly reaffirmed. In 2009, a BBC recapitulation of the experiment produced results very similar to those obtained in 1961.
But what were those discoveries? Contrary to popular belief, the experiment did not prove that most people are sadists just beneath the surface. On the contrary, in a modification of the experiment, the volunteers were not ordered to shock—they were merely told that it was fine if they did. So instructed, the vast majority stopped before the shocks became painful.
In another variation, the volunteers were allowed to witness two of the “experimenters” disagreeing. One urged the volunteer to go on shocking, while the other didn’t. The effect was to stop the experiment dead in its tracks. Not one volunteer took advantage of the conflict to inflict more electric shocks.
Palpably, the volunteers did not like what they were told to do. When the experimenters ordered them to go on shocking, they protested, they sweated, they shook—they even laughed hysterically. They obviously experienced great stress as they went on obeying.
Milgram’s book on the experiment, Obedience to Authority, makes illuminating reading today, dispelling misconceptions about the experiment. Written in 1974, it is full of references to atrocities committed in the Vietnam War and by Nazi Germany. But to link the experiment crudely to what the Nazis did, Milgram says, is to “miss the point” of the experiment “entirely.” So, too, is focusing narrowly on the human propensity to inflict pain on others. The experiment demonstrated people’s willingness to obey not only malevolent authorities but authorities of any kind, of varying degrees of malevolence: that is, not just amoral scientists and generals, but companies and governments.
“It is the extreme willingness of adults to go to almost any lengths on command of an authority that constitutes the chief finding of the study,” writes Milgram, “and the fact most urgently demanding explanation.”
When Milgram explains this willingness to obey, his book becomes truly enlightening. He says that obedience works when people are in what appears to be a paradoxical state of mind. Obedience has to be willingly entered into, but the actions it entails have nothing to do with the personality of the person who carries them out. One of the findings of the experiment was that motives were irrelevant. Cruel people did not deliver more electric shocks than kind people. Subjective feelings did not matter.
The volunteers quite willingly gave the victim what they believed were excruciatingly painful, possibly fatal, electric shocks, but that had no relation to what they themselves wanted to do. They voluntarily allowed someone else to dictate what they did. As Milgram says, “It is the essence of obedience that the action carried out does not correspond to the motives of the actor but is initiated in the motive system of those higher up in the social hierarchy.”
Or as he puts it elsewhere, in hierarchical institutions, “relationship overwhelms content”. When a person merges their unique personality into an organisation, they become something else, a mere vessel, not an autonomous person.
Milgram’s relevance today
Here is where the Milgram experiment sheds light on the emptiness of the current blame game for the economic crisis. The conventional explanation for the crisis seizes on the appearance of voluntarily carried out actions but deliberately ignores the hierarchical institutions that are responsible for producing those actions. Conservatives can never relinquish the insistence that humans voluntarily cause bad outcomes – one British conservative, Jesse Norman, says that companies owe their existence to “human affection” – because the perception of uncoerced willingness is so vital to the smooth running of capitalism.
Capitalism requires obedient workers but, as Milgram shows, obedience involves a delicate psychological process. An obedient person has to allow him or herself to be used. In order to work, obedience has to be willingly entered into. Obedience owes its power to mould behaviour to a sense of moral obligation. “The psychological consequence of voluntary entry is that it creates a sense of commitment and obligation which will subsequently play a part in binding the subject to his role,” says Milgram. In essence, a worker says to an employer: you pay me and, in return, I give you the right to direct my behaviour for eight hours a day.
This is why neoliberal thinkers are so intransigent on this point. They hold that a person selling him or herself on the labour market is no different from a person selling any other type of commodity. It is a voluntary process and so should not be interfered with.
According to neoliberals, a working person’s obedience in a job ought to, indeed morally should, follow naturally from his or her acceptance of that job because the person has “voluntarily” entered into the contract. As Milgram says, if obedience is willing, compliance is “easily exacted”.
But if obedience is not willing, then compliance depends on direct surveillance. If surveillance ends, obedience stops.
Under voluntary obedience, control comes from within the person. Therefore, obedience has an internalised basis, not just an external one.
That is why ideology, or as Milgram phrases it, “the definition of the situation”, is so important. “Control the manner in which a man interprets his world,” he says “and you have gone a long way toward controlling his behaviour.”
Liberal capitalism was under greatest threat in the nineteenth century, when the Left espoused the concept of “wage-slavery”: the idea that when a person is compelled, under pressure of need, to rent themselves to a company and give away all control over what they do, their position is similar to that of a chattel slave. Then, working people powerfully contested obedience, and employers had to enforce it. Today obedience is, to a large extent, voluntary, and the values of liberal capitalism are internalised. We are bound to our roles by ideological bonds that can be broken. We are, as the title of a recent book put it, “willing slaves.”
Noam Chomsky has distinguished between people who, for good or ill, are moral agents and institutions, “structures of power” that are basically amoral. But this distinction between people and institutions is hard to accept because, as Milgram says, society promotes the ideology that a person’s actions stem from their character. Bad outcomes are the result of bad people.
In the 50 years since the Milgram experiment was first conducted, the idea of personal responsibility has only intensified. It has become an important foundation of the neoliberal ideology now dominant in the UK and US. The endless tail-chasing of attributing blame only serves to obscure the problem of the absence of ethical responsibility.
“We” has become the most overused of pronouns. We are said to be responsible for global poverty; we cause global warming. But “we” aren’t responsible for anything, because “we” don’t exist.
But the effort to make structurally amoral institutions, such as corporations, moral is not only wrong-headed but malignant, because it diverts energy and attention from the proper task: institutional change.
As Mark Fisher says in his book Capitalist Realism: “Does anyone really think, for instance, that things would improve if we replaced the whole managerial and banking class with a whole new set of (‘better’) people? Surely, on the contrary, it is evident that the vices are engendered by the structure, and that while the structure remains, all the vices will reproduce themselves.”
But, as Milgram says in Obedience to Authority, for a person to feel responsible for his actions, “he must sense that the behaviour has flowed from ‘the self’. In the situation we have studied, subjects have precisely the opposite view of their actions.”
So long as the structure remains off-limits, crisis will always haunt us.
So far I have examined only the behaviour of those who obeyed authority. But a sizeable minority – around one third of volunteers – defied the instruction to shock the victim and disobeyed.
Their defiance, as we will see, was no simple matter. But certain conditions made disobedience easier, and in part two of this article I will consider them.
In Milgram’s words, “The individual is weak in his solitary opposition to authority but the group is strong … this is a lesson that every revolutionary group learns.”