Navigating the ravine: the perils of confirmation bias
13 March 2015
Psychologists have a name for the black hole down which we pointlessly fling the majority of our communication: confirmation bias.
This ravine has been known to divide the House of Commons like the rift in a tectonic plate, making attempts at true debate an unhappy business. And, as we approach a general election, this most toxic of biases will be hoovering up and dispensing with information faster than Colin from accounts at a buffet lunch.
But what exactly is it, and where did it come from? Put simply, confirmation bias is an everyday process that looks something like this:
1. Man or woman bumbles around doing stuff, chatting with friends, watching Better Call Saul, and generally having a pleasant enough time.
2. Man or woman meets someone with a strong opinion (or they read an article or watch a news report). They step away, blinking into the bright sunshine, a born-again acolyte.
3. Over the coming months, every day they seem to find new evidence to support their conviction. They are amazed this evidence has been out there so obviously all this time, and that anyone can be so blind as not to see it.
4. Gradually, they also notice lacunae – fragments of alternative text beneath the otherwise orderly manuscript of their strongly held belief. These lacunae represent contrary arguments, and they trigger alarm, activating the part of their brain that deals with conflict resolution (and leaving untroubled the part that deals with reasoning)*.
5. Like a triage nurse in A&E dispatching patients for relevant treatment, man or woman evolves simple strategies to deal with danger and keep their opinion healthy. After assessing the severity of each potential injury, the nurse sends some straight home – ignored, with a mild admonishment for wasting NHS time. Some are slightly more troubling, and these get cleaned up and hidden under a bandage. And the most challenging of all are sent hurtling down corridors, followed by junior doctors and a giddy anaesthetist.
Put another way, some arguments and evidence we shrug off with irritation as being silly and irrelevant, some we file away under ‘disreputable’ by denouncing the provenance as untrustworthy, and some we go to town on, attempting to shred the argument and overwhelm it with rhetoric and righteous anger.
The point is that confirmation bias means we rarely engage our reasoning faculty when faced with a contrary point of view. Instead, we filter out inconvenient truths and focus on what we know and love. This is why so much well-thought-out communication becomes little more than verbal junk mail, flung into a bottomless pit of misfired messaging.
As long ago as 1620, Francis Bacon observed this phenomenon, saying that:
The human understanding when it has once adopted an opinion… draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it… despises and rejects. (Novum Organum, 1620)
Evidence to support Mr Bacon’s observation was found in a seminal 1979 study by Charles Lord and his colleagues at Stanford, while neuroscientists at Emory University in 2006 used fMRI scans to show that what happens in our brains during these moments is nothing to do with reasoning and everything to do with emotion and internal conflict resolution. See below for more detail on these studies.
So what does all this mean for communication, and what can we do to avoid the waste that this implies? Stephen Denning, who cites the above research in his book The Secret Language of Leadership, argues that we’re getting it wrong if we try to engage others in rational argument at the outset. And while traditional wisdom has it that we start with a definition and analysis of the argument, before recommending a solution, this actually just plays into the vacuum that is confirmation bias.
Instead, he says, we should get people’s attention first, probably with a surprising and unsettling story that lays out the current predicament. ‘Labour isn’t working’, claimed Saatchi & Saatchi on behalf of the Conservatives in 1979, showing a picture of a dole queue. In 1997, the Saatchi brothers – by then at M&C Saatchi – gave Blair demon eyes and warned: ‘New Labour, new danger’.
On the other side, Labour argued in 1997 that ‘Britain deserves better’, and in 2001 they dressed William Hague in a Maggie wig and warned: ‘Be afraid. Be very afraid.’
These attention-grabbers touch on a whole frame of reference that the audience can assemble, on their own, into a troubling and visceral story of unhappiness. This is how they arrest the audience – not with reason but with emotion.
After getting attention in this way, says Denning, we should next create appetite and stimulate desire in the listener. A great way to do this is by telling positive stories of how we have overcome predicaments and adversity in the past – the implication being that we can do so again.
In 2000, Al Gore was mocked as a dull technocrat when he ran for the US presidency, placing rational argument upon rational argument for why he should be elected. Six years later, he was selling out stadia with An Inconvenient Truth, his campaign on climate change. How? By capturing people’s attention and then telling stories of overcoming adversity and changing direction, such as his tobacco-farming family’s struggle to come to terms with his sister’s death from lung cancer.
And finally, after getting attention and stimulating the appetite, it is safe to move into the territory of reasoning and recommended solutions. To cross the ravine, then, you need to build a bridge that connects you to the other side. If you don’t, you’re wasting your time, standing on a cliff and chucking carefully crafted messaging into an immense void. And worse still, while you do it you’re being watched from afar by a distant opponent, wearing a quizzical and disgusted expression.
A deeper dive into the science
In 1979, Stanford psychologist Charles Lord and his colleagues uncovered confirmation bias in an extremely important early study**. They gathered 48 volunteers, all of whom held strong opinions on capital punishment – 24 in favour and 24 against. They showed the subjects a balanced range of studies, half confirming a deterrence effect for the death penalty, and half refuting it.
After looking over them, both groups found the evidence to be overwhelmingly in favour of the opinion they already held. Indeed, startlingly, their position had become reinforced; rational argument had polarised the group yet further. As the psychologists observed at the time, subjects had found clever ways to “reinterpret or set aside any contrary evidence so as to confirm their original positions”.
Perhaps even more interestingly, Drew Westen and team at Emory University used magnetic resonance imaging to find out what’s actually happening in our brains at the crunch points where worlds of opinion collide. They gathered thirty subjects – fifteen strong Republicans and fifteen strong Democrats – and showed them a series of self-contradictory statements emanating from the mouths of John Kerry (the Democrat) and George Bush (the Republican). They found that both parties, without too much effort, rationalised the statements of their own candidate, while finding their opponent’s self-contradictions to be absurd and laughable.
Looking at the grey matter during this process, the Emory team found no evidence at all of reasoning activity: “What we saw instead was a network of emotion circuits lighting up, including circuits hypothesised to be involved in regulating emotion and circuits known to be involved in resolving conflicts”. Even more startlingly, once subjects had found a way to reconcile the inconsistencies, there was clear activity in the part of the brain involved in reward and pleasure. Put another way, subjects’ successful rationalisations were reinforced by the elimination of negative emotional states and the activation of positive ones – a reward for seeing off the threat of an alternative point of view.