Confirmation bias is the tendency to interpret new information in line with our preconceptions. Our brains have a talent for recognising patterns, and through selective attention we give more credit to information we already believe and disregard information we believe is wrong: the same cognitive process that causes the Frequency Illusion.
For example, someone who believes vaccines cause autism might hear about a young boy suffering from behavioural problems shortly after the MMR jab. Despite major studies showing no link between MMR and autism, and millions of others seeing no side effects, the person will automatically give more weight to the story of the young boy.
In medicine, a doctor might seek information to support their initial diagnosis of IBS and disregard information indicating pregnancy. By focusing on the information they’re looking for – that stomach pain, bloating and nausea are symptoms of IBS – the doctor can explain away the symptoms that also point to pregnancy, leaving IBS as the only possible cause.
This isn’t by chance.
Regardless of our profession, our personality, and everything our horoscope tells us, the chemistry of our brains filters incoming data to reinforce our perception of the world: emphasising supportive evidence and minimising anything contrary.
To demonstrate this effect, Pines (2006) described a 51-year-old diabetic man presenting to the emergency department with lower back pain. The patient reported seven days of severe pain after heavy lifting at work and said he’d been taking high-dose ibuprofen every 6 hours.
The nurse told the physician:
“Mr W. is here again. He is here all the time requesting pain medicine and work excuses for lower back pain. He was even here yesterday and was seen by your colleague.”
Since no beds were available, Mr W. was examined in the hall, where the physician found muscle tenderness in his lower back. No rectal or perineal examination was performed, and he was given a new prescription for pain relief.
As the physician was only looking to prove his theory that the man was seeking medication he did not need, rather than objectively diagnosing the man’s pain, he missed a condition that required immediate surgery.
In this case, confirmation bias could rest on four claims:
- Biased search, causing the physician to search for information that supported his existing beliefs and ignore what did not.
- Biased favouring, causing the physician to give more weight to information that supported his beliefs, and less to information that did not.
- Biased interpretation, causing the physician to interpret information in a way that confirmed his beliefs, even if it could be interpreted to contradict them.
- Biased recall, causing the physician to remember only the information that supported his beliefs and forget information that did not, or to misremember the information altogether.
You can’t argue with science
Scientists from the University of London examined a part of the brain associated with decision-making, the posterior medial frontal cortex, and suggested it could underpin confirmation bias. The findings, since published in Nature Neuroscience, imply the bias is much more than the “curious case” of earlier descriptions.
During this particular study, 42 adults were split into pairs and asked to individually judge whether real estate properties were worth more or less than $1,000,000. Once they’d made a decision, they were asked to wager 1–60 cents to show how confident they were: 1 being not at all confident, and 60 being completely confident.
Next, each participant was placed in an fMRI scanner, shown the properties again and reminded of their initial answer and wagers. They were then shown other people’s answers and wagers – said to be their partners – and asked to submit a final wager based on how confident they were about their initial answer.
The authors found that activity in this region increased when a participant encountered another person’s judgement that agreed with their own – significantly more than when they encountered a judgement that disagreed.
It was also determined that when the partner’s judgement confirmed the participant’s initial assessment, participants increased their final wagers, and these wagers unknowingly tracked their partner’s.
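To make that asymmetry concrete, here is a toy model in Python – entirely my own illustration, with made-up coefficients rather than anything fitted to the study – in which agreement moves the final wager sharply while disagreement barely moves it:

```python
# Toy model of asymmetric confidence updating (illustrative only; the
# coefficients are assumptions, not parameters from the study).

def final_wager(initial, partner_wager, partner_agrees):
    """Update a 1-60 cent wager after seeing a partner's judgement."""
    if partner_agrees:
        # Confirmation: confidence rises sharply with the partner's wager.
        return min(60, initial + 0.5 * partner_wager)
    # Disconfirmation: confidence dips only slightly, however much they bet.
    return max(1, initial - 0.1 * partner_wager)

print(final_wager(30, 50, partner_agrees=True))   # 55.0 -- agreement persuades
print(final_wager(30, 50, partner_agrees=False))  # 25.0 -- disagreement barely lands
```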
To investigate further, Ben Tappin, Leslie van der Leer, and Ryan McKay explored an alternative hypothesis to explain why we emphasise confirming evidence. They considered the possibility that humans have a desirability bias in addition to confirmation bias, and carried out a study to tell the hypotheses apart.
In the study, 900 participants were asked to state which candidate they wanted to win the 2016 American presidential election, and which candidate they believed was most likely to win. This split the group in two: those who believed the candidate they wanted to win was most likely to win (e.g. the Trump supporter who believed Trump would win), and those who believed the candidate they wanted to win was, in fact, less likely to win (e.g. the Clinton supporter who believed Trump would win).
Both groups were then given recent polling data emphasising that either Clinton or Trump was more likely to win. After reading the information, all participants stated for the second time which candidate they believed was most likely to win.
The results, now reported in the Journal of Experimental Psychology: General, showed that people who received desirable evidence – that their preferred candidate was going to win – strengthened their belief about which candidate was more likely to win, whereas people who received undesirable evidence barely changed theirs.
While different, both of these studies point to the same combination: hypotheses we are motivated to keep and a limited human capacity to process information, coming together to create the force of confirmation bias.
Wason’s rule discovery
Cognitive psychologist Peter Wason identified confirmation bias in a class experiment based on his 2-4-6 rule-discovery task (1960). The aim of the experiment was to show students how easily their thinking could be led astray into illogical and irrational conclusions.
The students were told that Wason had a rule in mind that applied to sets of three numbers, and that the sequence “2-4-6” satisfied this unknown rule. They were then tasked with discovering the rule by testing sequences of their own; the actual rule, unknown to them, was simply “any ascending sequence”.
For every three numbers the students proposed, they were told whether or not the sequence satisfied the rule, leading them to keep certain numbers and eliminate others. The students tried “4-8-10”, “6-8-12”, and “20-22-24”.
Almost all students formed the same hypothesis – that the rule was “ascending even numbers” – and proposed only sequences that would confirm it, rarely any that could disprove it. They avoided questioning their hypothesis for fear of breaking their own rule: the pre-existing belief that the initial sequence (“2-4-6”) was based on increasing even numbers.
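To see why that strategy could never expose the error, here is a minimal sketch in Python – the rule definitions are my own stand-ins for the classroom materials. Because every sequence that satisfies “ascending even numbers” also satisfies “any ascending sequence”, confirming tests always come back positive:

```python
# Illustrative sketch of Wason's 2-4-6 task; both rules are assumptions
# made explicit for the demo, not Wason's original materials.

def true_rule(seq):
    """Wason's actual rule: any strictly ascending sequence."""
    a, b, c = seq
    return a < b < c

def hypothesis(seq):
    """The students' typical guess: ascending even numbers."""
    a, b, c = seq
    return a < b < c and all(n % 2 == 0 for n in seq)

# Positive tests (the sequences from the class): both rules agree,
# so the students only ever see confirmation.
for seq in [(4, 8, 10), (6, 8, 12), (20, 22, 24)]:
    print(seq, hypothesis(seq), true_rule(seq))   # True True every time

# The test almost nobody ran: a sequence their hypothesis rejects.
print((1, 2, 3), hypothesis((1, 2, 3)), true_rule((1, 2, 3)))  # False True
```

A single disconfirming test is more informative than any number of confirming ones: “1-2-3” passes the real rule while failing the hypothesis, which is exactly the evidence the students never sought.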
The same tendency was demonstrated by Wason’s Selection Task, otherwise known as the 4-Card Task, carried out in 1966. It is one of the most repeated tests of logical reasoning in psychology and involved students observing a set of four cards.
The students were shown the cards 3, 8, red, and brown, and told that each card had a number on one side and a coloured block on the other. Their task was to identify which card or cards must be turned over to test the claim that if a card shows an even number on one side, then its opposite face is red.
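The falsification logic fits in a few lines of Python – the card representation here is my own, purely for illustration. A card needs turning only if its hidden side could break the rule “even means red”:

```python
# Illustrative sketch of the selection-task logic; the data structures are
# assumptions for the demo, not part of Wason's original experiment.

cards = [("number", 3), ("number", 8), ("colour", "red"), ("colour", "brown")]

def must_turn(face, value):
    """Turn a card only if its hidden side could break 'even -> red'."""
    if face == "number":
        return value % 2 == 0      # an even card might hide a non-red back
    return value != "red"          # a non-red card might hide an even number

print([value for face, value in cards if must_turn(face, value)])
# [8, 'brown'] -- the 'confirming' cards (3 and red) can never break the rule
```

Only the 8 and the brown card can falsify the rule; turning the 3 or the red card can do nothing but confirm it, which is precisely why they feel so tempting.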
Less than 20% of students found the correct solution, and evolutionary psychologists found the task much easier to solve when placed in the context of a social rule – rather than just numbers and colours. For example, “If a person drinks an alcoholic drink, then they must be over 21 years old.”
A higher percentage of students succeeded in this version, supporting the evolutionary theory that cognitive mechanisms have evolved through natural selection to solve social adaptive problems: one of the very reasons confirmation bias works so well in marketing.
“The eye sees only what the mind is prepared to comprehend”
said Robertson Davies in his book Tempest-Tost.
Confirmation bias influences our lives in ways we do not recognise. Even in affairs of state like the Iraq war, the report to the President admitted that important evidence was disregarded: “When confronted with evidence that indicated Iraq did not have [weapons of mass destruction], analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it.”
This decision would have been driven by the two cognitive mechanisms used to explain confirmation bias: challenge avoidance, where people don’t want to find out they are wrong, and reinforcement seeking, where people want to find out they are right.
Where challenge avoidance spared the analysts from cognitive dissonance – the psychological stress that occurs when a person holds two or more contradictory beliefs – reinforcement seeking drove them to look for approval of the prevailing theory: the exact reason confirmation bias exists.
Moreover, a theoretical approach by Klayman and Ha (1987) suggests that participant behaviour in Wason’s 2-4-6 task is better interpreted as a “positive test strategy”: the subject tests a hypothesis by examining instances in which the event is expected to occur, or in which it is known to have occurred.
The strategy rests on the idea that people run the tests they expect to bear most strongly on their belief in the hypothesis. And given confirmation bias’s popular links to the Law of Attraction – the notion that we attract into our lives the positive and negative thoughts we hold in our minds – this interpretation makes intuitive sense.
Confirmation bias in marketing
Just as the physician argued Mr W. was seeking medication he did not need, we often find ourselves searching for confirmation within our deeper consciousness: ignoring the angel on our right shoulder and trusting the devil on our left. A prime example is when we try to rationalise our purchases, falling victim to the bias through confirmation mechanisms like emotional investment.
When this happens, as well as looking for evidence to confirm what we already believe – that we deserve the purchase and that other people spend their money on far less sensible things – we continue to validate these decisions through what is known as the Endowment Effect.
This effect was described by economist and Nobel Prize winner Richard Thaler, who in his paper “Toward a Positive Theory of Consumer Choice” argued that people have a general tendency to value items more once they own them.
As an example, he wrote:
“Mr R. bought a case of good wine in the late ’50s for about $5 a bottle. A few years later his wine merchant offered to buy the wine back for $100 a bottle. He refused, although he has never paid more than $35 for a bottle of wine.”
He also became famous alongside Daniel Kahneman for work on anchoring (the tendency to rely too heavily on initial information) and loss aversion (feeling the pain of losing ten dollars more intensely than the joy of winning ten dollars).
These concepts have influenced advertising since the days of David Ogilvy, allowing brands to tap into consumer behaviour and plant the benefits of their product or service in our unconscious minds. Through confirmation bias, we automatically give more weight to brands we’ve had a positive experience with and less to the actual merits of the products they’re advertising.
“You can find these anywhere”, we say. “But at least I know they’ll arrive on time and the quality will be good. I buy from here all the time.”
This emotional impact is what leads us to focus less on conflicting evidence and more on brand loyalty, as it’s much easier for our brains to affirm existing beliefs than to go through the decision-making process again: opening up a world of opportunity for brands to slot into our selective attention and drown out their competitors.
With everything we now know, the phrase “I’ll believe it when I see it” rings false: regardless of what’s in front of us, we only believe what we already think to be true, i.e. “I’ll see it when I believe it”.