When Facts Are Powerless: The Psychology of Denial

Encountering an unpleasant fact causes the brain to react much as it does to a physical threat, often shutting down rational thinking. This is not a sign of stupidity or stubbornness but a universal psychological shield that protects our worldview from collapse, even at the cost of distorting reality.

The phenomenon known as fact denial is rooted in deep-seated cognitive biases that once helped our ancestors survive but that now often prevent us from perceiving information accurately. From investment decisions and medical diagnoses to political views and vaccination, the inability to accept evidence has the same psychological causes. Understanding these mechanisms, from the fear of admitting a mistake to the fear of being cast out of a social group, is the first step not only toward avoiding these traps yourself, but also toward learning how to communicate the truth to others.

Why people deny evidence:

  • Fear of admitting you are wrong. People enjoy being right, and accepting that they were mistaken pushes them out of their comfort zone.

Research by Barry Staw of the University of Illinois vividly demonstrates how stubbornly people deny their mistakes even in the face of irrefutable evidence of failure. For example, an entrepreneur pours ever more money into a clearly failing project, ignoring every objective signal to stop. The researchers concluded that the fear of admitting a mistake often overrides rational thinking, sometimes completely suppressing the ability to assess obvious facts.

  • Confirmation bias. People tend to accept information that confirms their beliefs and to dismiss facts that contradict them.

A review by O’Sullivan and Schofield (2018) examines in detail how confirmation bias is one of the most common causes of diagnostic error. Doctors, like everyone else, tend to look for confirmation of their initial idea, which can entrench an erroneous diagnosis. Having formed a hypothesis about the patient’s disease, the doctor unwittingly begins to pay more attention to the symptoms and test results consistent with that diagnosis, and may underestimate or dismiss as incidental the signs pointing to a different disease.

  • A defensive reaction to cognitive dissonance. Unlike confirmation bias, which acts as a passive filter on incoming information, this mechanism works as an active defense when a person is confronted directly with incontrovertible facts. A threat to the integrity of the worldview triggers a physiological stress response: the capacity for rational thought is suppressed, and defense mechanisms push the person not merely to ignore the evidence but to actively reject it, sometimes paradoxically strengthening the original belief (the backfire effect).

In Nyhan and Reifler’s 2010 experiment, conservative participants were shown a correction of a claim about the presence of weapons of mass destruction in Iraq. Instead of the expected result, the “backfire effect” kicked in: this group not only failed to accept the facts but came to believe the original claim even more strongly. Liberal-leaning participants who received the same correction reacted rationally and adjusted their opinions. This demonstrates that when confronted with facts that challenge their ideological beliefs, people may not be persuaded and may, on the contrary, dig in further.

  • Emotional and psychological factors. New information can cause cognitive and emotional distress if it threatens a person’s worldview, which leads to the evidence being ignored. Fear, anxiety, and disgust can also play an important role, for example in the avoidance of vaccination.

Researchers have found that belief in vaccine conspiracy theories serves as a defense against emotional instability, replacing the disturbing uncertainty of real life with a simple scheme that has a clear culprit: the pharmaceutical companies. Refusing vaccination thus creates an illusion of control, while scientific evidence, which acknowledges a range of risks, cannot offer an equally reassuring explanation.

  • Social and identity factors. Fear of social exclusion leads people to reject information that contradicts their group’s beliefs. This protects their social identity, but it also creates and maintains an echo chamber in which any dissent is excluded and trust extends only to “their own” sources.

In 1974, Second Lieutenant Hiroo Onoda of the Japanese army surrendered to the Philippine authorities. He had hidden in the jungle on Lubang Island for almost 30 years, refusing to believe that World War II was over and that Japan had been defeated. He believed he was waging a guerrilla war behind enemy lines, although in reality he was fighting only the Philippine police and local peasants. Onoda had heard radio reports about the Japanese government’s surrender, the Tokyo Olympics, and the economic miracle, but dismissed it all as enemy propaganda. He admitted his mistake only when a delegation arrived on the island led by his former commander, the man who had given him the order “do not surrender” 30 years earlier.

Onoda surrenders his sword to Philippine President Ferdinand Marcos, who pardoned him for his actions over the previous decades. Photo: Kyodo/Reuters

  • Distrust of the source. Bias, doubts about competence, or general distrust of scientists can lead people to reject their findings.

Research shows that distrust of scientists as a source of information is often rooted not in the scientific data itself but in social stereotypes. Scientists are commonly portrayed as emotionally cold and detached, and their conclusions and recommendations as remote from the concerns of ordinary people. In addition, certain groups are convinced that the scientific community as a whole is hostile to conservative and Christian worldviews.

  • Uncertainty aversion. Some people find uncertainty hard to tolerate, which is why they reject scientific findings that strike them as ambiguous or insufficiently established.

In a 2017 study, Matthew Fernandez and his colleagues found that the individual trait of uncertainty aversion was a significant predictor of skepticism about climate change data. People who find uncertainty psychologically hard to bear were more likely to reject the scientific consensus because they were put off by the ambiguity inherent in climate science, not by the evidence itself.

  • Presentation and framing. Information may be framed in a way that clashes with a person’s way of thinking or understanding, and be rejected for that reason alone.

Cognitive scientists Steven Sloman and Philip Fernbach show in their review that negative attitudes toward GMOs are often rooted in the “naturalness heuristic”: a deeply rooted intuition that natural means good and artificial means bad. When information about GMOs is presented in technical, utilitarian terms (higher yields, pest resistance), it fails to resonate with people whose thinking rests on this heuristic. Their rejection stems not from an analysis of risks and benefits but from the fact that the very essence of the technology contradicts their intuitive sense of the “right” order of things.

  • Motivated reasoning. Because encountering facts that refute our beliefs causes discomfort (cognitive dissonance), people tend to unconsciously ignore new information in order to preserve a familiar picture of the world.

In the 1950s, psychologist Leon Festinger observed a cult awaiting the end of the world on December 21, 1954. When the apocalypse did not happen, the cultists did not lose faith; on the contrary, they received a “message from God” saying that their faith had saved the world. This led to a drastic change in behavior: the once-closed group became actively missionary. Festinger explained the phenomenon through cognitive dissonance: by recruiting new followers, the believers were unconsciously seeking confirmation that they were right, since a faith shared by many feels more plausible.

Cult members observed by Festinger return home after their Christmas carols failed to entice the aliens to save them, December 24, 1954. Photo: Charles E. Knoblock / AP

Evidence denial is not a dead end but a complex psychological challenge that calls for strategy rather than force. As the examples above show, trying to “overwhelm with facts” often produces the opposite effect, paradoxically entrenching a person in their belief. Fortunately, effective alternatives exist. A practical guide awaits you in our next article: how to establish rapport before presenting arguments, how to inoculate against disinformation, and how to turn an argument into a joint search for truth using the power of the right questions.