We like to think we’d do the right thing in a tough situation. We’d stand up to our boss when necessary, step in if we saw someone being bullied, and say no if we were asked to do something we felt was wrong. It’s tempting to think we have an innate moral compass that guides our actions, even under pressure from others.
In reality, however, most of us are remarkably bad at standing up to authority. New research is revealing why this is, giving us insight into how the brain deals with – or fails to deal with – these difficult situations. Ultimately, the research could show us how we can train ourselves to become stronger-minded and better able to stick to our guns when needed.
In experiments by social neuroscientist Emilie Caspar at the Netherlands Institute for Neuroscience, volunteers gave each other electric shocks. (The research follows in the footsteps of the notorious experiments of Stanley Milgram in the 1960s, but in a more ethically and scientifically rigorous way.)
First, participants were asked to administer shocks for a small sum of money (about 5p each time). When a participant was given 60 chances to shock their partner, about half of the time they chose not to. Around 5-10% of people chose not to shock their partner on any of the 60 opportunities.
Then Caspar stood over the participants and ordered the person giving the shocks to do it. Now, even the participants who didn’t give any shocks previously started to press the button.
As soon as Caspar gave orders, the participants' brain activity also changed, electroencephalogram (EEG) scans showed. In particular, the recordings revealed that the brain became less able to process the consequences of the participants' actions. For the vast majority of volunteers, their sense of agency and responsibility started to melt away.
“I’ve tested more than 450 participants, and so far only three refused to follow the orders,” says Caspar. “How are these people different from the others?”
Studies on patients with localised brain damage are helping to answer part of this question. When people have lesions in the prefrontal cortex – the outermost layer of the front part of the brain – they appear to be much more prone to following orders than the general population.
“They really very readily listen to authorities, and are less able to doubt them,” says Erik Asp, an assistant professor of psychology at Hamline University’s College of Liberal Arts in the US. “That means if an authority figure tells you to hurt someone else, you’re more likely to.”
What is it about this part of the brain that helps us stand up to authority?
The question gets into philosophical topics like the nature – and neurological basis – of belief. While there is no clear scientific consensus, the Spinozan model is a strong contender. It suggests that in order to understand a new idea or fact, our brain must, for a split-second, believe it completely.
“The act of understanding is the act of believing. Whatever those processes are, they are the same,” says Asp.
After a split second, you then can doubt or reject this new piece of information. “You can use a separate neuropsychological process to come back and disbelieve that mental representation,” says Asp. “In other words, you come back and doubt it.”
For prefrontal cortex patients, it’s this second part of the process that is impaired, Asp argues. So instead of quite literally thinking twice about what an authority figure says, prefrontal cortex patients are more likely to take what they hear as given.
If the prefrontal cortex is the seat of our ability to doubt and question authority, there may be a way for healthy people to strengthen that ability. The prefrontal cortex has some plasticity. "I think it is modifiable," Asp says. "It's not built into the brain function that you have – it's not set in stone."
Education is one of the best ways to improve your ability to doubt, says Asp, and therefore your ability to think critically about things you might be told to do.
There is another factor that influences how you behave, too.
When an authority figure asks us to do something, we usually do it because we’re led to believe in the cause behind their request, says Megan Birney, a psychologist and senior lecturer at the University of Chester at University Centre Shrewsbury.
In one experiment, Birney and her colleagues measured how many people dropped out of an experiment where they were told to do something morally objectionable. The participants had to attach negative terms to groups of people in photos. The pictures started off with groups it was easy to dislike, such as Nazis or the Ku Klux Klan. Gradually, the pictures were of more neutral groups and eventually of families or groups of young children.