Confirmation bias: examples and how to counteract it

The human brain tends to unconsciously seek confirmation of existing beliefs and to discount facts that contradict them. This phenomenon, known as confirmation bias, affects science, politics, medicine, and even everyday decisions. From debates about evolution to misinformation during a pandemic, here’s how this cognitive mechanism distorts our perception of reality.

In the previous section, we looked at the phenomenon of confirmation bias, its causes, and the risks it poses. Now, let’s look at real-life examples of how this type of cognitive bias manifests itself and how to overcome it.

Examples of confirmation bias

  • Creationists vs. evolutionary biologists

A vivid example of confirmation bias can be seen in the debate between creationists and evolutionary biologists. The latter use scientific data and experiments to reconstruct how biological evolution unfolded over millions of years. The former hold that the Bible is literally true and that the world is only a few thousand years old. Creationists tend to reduce the cognitive dissonance caused by contradicting evidence by dismissing or reinterpreting it, and many consider non-empirical “evidence” for their beliefs more valuable than the empirical evidence for evolution.

– Evolutionary biologists use the fossil record to show that evolution took millions of years. Some creationists, meanwhile, claim that the same fossils are proof of the worldwide flood described in the Bible: rather than weighing the evidence that contradicts their ideas, they reinterpret it as confirmation of what they already believe.

  • Homeopathy

The multi-billion-dollar homeopathic industry is an example of massive confirmation bias. The pseudo-scientific idea of “water memory” arose from the experiments of the French immunologist Jacques Benveniste. He claimed that a solution of bee venom diluted so far that, by Avogadro’s number, not a single molecule of the venom should remain could still produce the same effects in living organisms as the venom itself. In 1988, Benveniste published a paper in Nature reporting these results and agreed to replicate them in his own lab before a scientific committee. However, he was unable to reproduce his findings. Later, an independent group of scientists repeated the experiment, and their attempts also ended in failure. Benveniste’s experiment and the “water memory” hypothesis have thus never been confirmed and are considered scientifically unfounded, yet the number of supporters of homeopathy remains large.
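The arithmetic behind the Avogadro argument is easy to check. As an illustrative sketch (the starting quantity and the 30C potency are assumptions for the example, not figures from Benveniste’s paper), one mole of active substance put through thirty successive 1:100 dilutions — a common homeopathic potency — leaves far less than one expected molecule:

```python
# Rough arithmetic behind "diluted past Avogadro's number".
# Assumptions (illustrative): start with one mole of active
# substance and apply a 30C potency, i.e. thirty 1:100 dilutions.

AVOGADRO = 6.022e23      # molecules in one mole
DILUTION_FACTOR = 100    # each "C" step dilutes 1:100
STEPS = 30               # 30C potency

molecules_left = AVOGADRO / DILUTION_FACTOR**STEPS
print(f"Expected molecules remaining: {molecules_left:.3e}")
# prints: Expected molecules remaining: 6.022e-37
```

With roughly 6 × 10⁻³⁷ expected molecules, such a preparation almost certainly contains not a single molecule of the original substance — which is why any perceived effect is better explained by placebo and confirmation bias.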

  • Brexit (2016, UK)

An analysis of more than 1 million users and 5,000 posts about Brexit on social media showed that the debate divided Britons into two isolated communities: supporters and opponents of Britain’s exit from the European Union:

– The Leave camp ignored the economists’ warnings, focusing only on the “independence” argument. The Remain camp, for its part, refused to consider any potential benefits of leaving.

– At the same time, users interacted almost exclusively with content that confirmed their views, ignoring opposing opinions. This led to the formation of “echo chambers” — “information bubbles” where opinions were amplified by a homogeneous audience. Even mainstream media (e.g. The Guardian vs. Daily Mail) became part of different echo chambers, which increased the division.

– It’s noteworthy that the same topics (for example, the economic consequences of Brexit) were presented and perceived in diametrically opposed ways in different camps.

– Emotional reactions in the comments also diverged: the same negative posts provoked anger in one group and approval in the other.

– Polarization was exacerbated by the social network’s algorithms, which showed users content that matched their preferences.

– The study confirmed that social media, contrary to expectations, did not contribute to pluralism of opinions, but rather increased the polarization of society.

– The Brexit debate showed that attempts to refute false information (fakes) in such conditions are often futile — users rejected facts that did not correspond to their beliefs.

Brexit thus became an example of how social media turns complex issues into binary conflicts, where facts give way to emotions and group identity. Without mechanisms to overcome echo chambers (such as algorithmic diversity), such scenarios will be repeated.

  • COVID-19

Research on decision-making during the COVID-19 pandemic found that confirmation bias played a critical role at all levels, from politicians to ordinary people.

– Opponents of vaccination ignored scientific data, relying on isolated cases of complications.

– Conspiracy theorists took advantage of the general confusion to be the first to fill the information space with their versions. Conspiracy theories were quickly picked up and spread because they provided simple answers to complex questions.

– Governments in some countries underestimated the threat of the virus and the importance of vaccinating the population until they were faced with a sharp increase in mortality.

– Confirmation bias was amplified by the Dunning-Kruger effect, a cognitive distortion in which people with low competence overestimate their abilities while competent specialists tend to underestimate theirs. This affected, among other things, assessments of how fast the virus was spreading, the pace of immunization, and the fatality rate.

– Informal sources (social networks) often outpaced official data, but their unreliability provoked panic and increased polarization.

Thus, the COVID-19 pandemic has shown that even in the era of Big Data, human thinking remains vulnerable to errors. The fight against future pandemics requires not only medical vaccines, but also “cognitive vaccines” — methods to counteract biases in decision-making.

Experiments have also shown that even a simple reminder to fact-check reduces trust in fake news.

Guide to Counteracting Confirmation Bias

  • Recognize your vulnerability.

– Recognize that your brain automatically filters information.

– Ask yourself, “What evidence could change my mind?”

  • Look for disconfirming data.

– Deliberately read sources with opposing views.

– Example: if you are confident in the effectiveness of homeopathy, read meta-analyses showing its ineffectiveness.

  • Ask critical questions.

– “Why do I believe this?”

– “Could my opinion be wrong?”

– “What alternative explanations exist?”

  • Use the “premortem method”.

– Imagine that your decision turned out to be a failure. Think about what could have led to this. What arguments did you miss?

  • Apply the “three sources” rule.

Before drawing a conclusion, check the information against:

– scientific research;

– a news source with an opposing position;

– an international media outlet.

  • Develop critical thinking.

– Read scientific articles, not just news headlines.

– Avoid echo chambers on social media: follow people with different views.

– Be willing to change your mind in light of new evidence, even if that means revising long-held beliefs.

Confirmation bias is not just an abstract psychological concept. It affects our health, politics, justice, and even international relations. Combating it is difficult, but not impossible: critical thinking, fact-checking, and a willingness to consider alternative points of view help us make better decisions. In a world where information is being weaponized, recognizing your cognitive biases is the first step toward more rational thinking.