"The backfire effect": a scientific view of the phenomenon

It would seem simple: show a person the truth, and they will give up their false beliefs. In practice it turns out to be harder. Sometimes debunking a fake produces the opposite result: the person clings to the misconception even more tightly. This phenomenon is called the "backfire effect", and it has long raised doubts about the effectiveness of fact-checking.

However, recent scientific research shows that the reality is far less alarming than previously thought. Read on for the details.

What is the "backfire effect"?

The "backfire effect" is a cognitive bias in which an attempt to debunk false information strengthens belief in it rather than weakening it. When people are confronted with facts that contradict their beliefs, they may not only refuse to change their opinion but end up holding it even more firmly.

The term was coined in 2010 by American researchers Brendan Nyhan and Jason Reifler in their article "When Corrections Fail: The Persistence of Political Misperceptions", published in the journal Political Behavior.

In their experiment, they tested reactions to claims about weapons of mass destruction (WMD) in Iraq. One group of participants was shown a speech by George W. Bush in which he claimed that Iraq had WMD. The second group saw the same speech together with an official correction stating that the weapons were never found.

The result was unexpected: after the correction, conservatives (supporters of the invasion) became even more convinced that the WMD had been there, while liberals, on the contrary, grew more doubtful. However, on other topics (the effect of tax cuts on government revenue and Bush's stem cell research policy) the effect was either absent or extremely weak, and in subsequent tests it failed to replicate reliably even on the Iraq question.

American journalist Joe Keohane later observed that people rarely form their views on the basis of objective facts. More often, they select only the facts that confirm their existing beliefs, ignoring or distorting everything else. This is what makes us vulnerable to fakes, especially when a fake "proves" us right.

A vivid historical example is the Seekers, a sect whose members expected the end of the world on December 21, 1954. When the prophecy failed, they did not abandon their faith; instead, they began to preach it even more actively.

Leon Festinger, Henry Riecken, and Stanley Schachter introduced the concept of cognitive dissonance while observing the Seekers for their study "When Prophecy Fails". They concluded that, to relieve the tension between belief and reality, people rationalize their beliefs rather than abandon them. And the more a person has invested in a belief (time, social connections), the stronger this effect is.

Types of "backfire effect"

The scientific literature distinguishes three types of backfire effect:

1. Worldview (ideological) effect: a refutation that threatens a person's values strengthens their initial position.

2. Familiarity effect: a false claim repeated inside the refutation becomes more "familiar" and is therefore perceived as more plausible.

3. Overkill ("brute force") effect: an excess of arguments in the refutation overloads cognitive resources, and the person falls back on the simple, familiar misconception.

Until the mid-2010s, the "backfire effect" was perceived as a serious threat: it seemed that refutations might be useless or even harmful. However, subsequent research has called this into question.

Studies that cast doubt on the "backfire effect"

Today, the scientific community is converging on a consensus: the "backfire effect" is not a systematic phenomenon. Most refutations either reduce misconceptions or leave them unchanged; they almost never reinforce them.

The fear of the effect turned out to be largely exaggerated, driven by methodological limitations of the early studies. Modern data suggest there is no need to be afraid of debunking fakes.

For fact-checking, this means that clear, neutral, well-sourced refutations remain an effective tool against disinformation. The key success factors are trust in the source, clarity of wording, not overloading the reader with arguments, and taking the audience's characteristics into account. When these conditions are met, the risk of a backfire effect is minimal and should not be a reason to stay silent in the face of misinformation.