How confirmation bias stops us solving problems

May 25, 2018 Michael Hallsworth and Mark Egan

This is the third blog in our Behavioural Government series, which explores how behavioural insights can be used to improve how government itself works.

Confirmation bias is the tendency to seek out, interpret, judge and remember information so that it supports one’s pre-existing views and ideas.

Confirmation bias can make people less likely to engage with information which challenges their views. An example of this is a recent study of 376 million Facebook users, which found that many preferred to get their news from a small number of sources they already agreed with.

Even when people do get exposed to challenging information, confirmation bias can cause them to reject it and, perversely, become even more certain that their own beliefs are correct.

One famous experiment gave students evidence from two scientific studies – one that supported capital punishment, and one that opposed it. The students denigrated whichever study went against their pre-existing opinion, and left the lab embracing their original position even more passionately.

The mental process which helps explain this behaviour is called motivated reasoning. What is worrying is that motivated reasoning may actually reduce our ability to understand and interpret evidence, and so make us less likely to be swayed by reasoned argument.

This is illustrated by a recent Danish study which showed elected politicians hypothetical satisfaction statistics for two different schools, then asked them to identify the better-performing one. Around 75% answered correctly when the options were labelled innocuously (e.g. “School A” and “School B”). However, these results changed dramatically when the options were framed in terms of public vs private services (e.g. “Private School” and “Public School”), a contentious issue in Danish politics.

Figure 1 shows that when the correct answer was in line with their pre-existing beliefs about public services (i.e. the politician strongly believed in the value of public services and the correct answer was that the public school was better), 92% of politicians chose correctly. But only 56% got it right when the answer was at odds with their beliefs (i.e. the politician strongly believed in the value of public services and the correct answer was that the private school was better).

Figure 1. Relationship between prior attitudes and correct interpretations of statistical data among 127 Danish politicians.

Worryingly, when the politicians were given more pieces of information on performance, they actually performed worse, relying more heavily on their prior attitudes. That means the issue cannot simply be addressed by relying on civil servants to provide more or better evidence for policy making – especially since civil servants are not immune from motivated reasoning themselves.

In our view, confirmation bias is one of the most pervasive and problematic cognitive biases that affects policy making. For that reason, it is also one of the hardest to tackle. However, we think that there are realistic improvements to be made.

Sign up to our mailing list to be among the first to hear about these ideas when we release our full Behavioural Government report.