“Holding and defending false beliefs is part of human nature.” — Psychiatrist Joe Pierre, M.D.
“When it comes to reasoning, identity trumps truth.” — Psychologist Steve Rathje, Ph.D.
It seems logical to assume that when presented with strong evidence, people will change their minds. If someone holds a false belief, all we need to do is show them the facts, and they will correct their thinking—right?
Unfortunately, that’s not how the human mind works. In reality, facts often fail to persuade, and in some cases, they even make people more entrenched in their beliefs.
This resistance to facts is not due to ignorance or lack of intelligence. Instead, it stems from the way our brains process information. Understanding why facts don’t change minds—and what actually does—can help us communicate more effectively and navigate an increasingly polarized world.
The Backfire Effect: Why Facts Can Make Beliefs Stronger
One of the best-documented reasons facts fail to persuade is the backfire effect: instead of updating their beliefs in response to contradictory evidence, people sometimes double down on their original position.
Political scientists Brendan Nyhan of Dartmouth College and Jason Reifler of the University of Southampton conducted a study in which they presented people with factual corrections to misinformation. Rather than changing their views, many participants became even more convinced of their original, incorrect beliefs. This happens because people experience challenges to their worldview as threats, which triggers a defensive response.
Motivated Reasoning: Protecting Our Identity
Humans are not purely rational thinkers; we are emotionally driven beings. Motivated reasoning is the tendency to interpret new information in a way that aligns with our preexisting beliefs rather than objectively evaluating the evidence.
For example, someone who strongly identifies with a political ideology is more likely to reject facts that contradict their party's stance. Accepting the evidence would mean admitting they were wrong, which can feel like an attack on their identity. Instead of changing their views, they look for ways to dismiss the new information, such as questioning the source or reinterpreting the data to fit their beliefs.
Tribalism and Social Influence
Our beliefs are not just personal; they are tied to our social groups. Changing our minds about a deeply held belief can mean risking rejection by our community, whether that community is family, friends, a church, or political allies.
Psychologist Jonathan Haidt argues that people are more likely to align their beliefs with their group than to seek objective truth. This is why debates over science, politics, and religion often become emotionally charged: they are not just about facts; they are about loyalty to a tribe.
What Changes Minds?
Build Trust Before Presenting Facts
People are more receptive to information when it comes from a source they trust. Instead of immediately bombarding someone with evidence, it helps to establish a connection and show empathy.
Use Stories, Not Just Data
Since the brain is wired for narratives, personal stories can be more persuasive than statistics. Instead of citing studies, sharing real-life examples that evoke emotion can make information more relatable and impactful.
Encourage Curiosity Instead of Confrontation
Directly telling someone they are wrong often triggers defensiveness. A more effective approach is to ask open-ended questions that encourage reflection. Questions like “What led you to that belief?” or “Have you ever considered an alternative perspective?” can open the door to self-discovery.
Allow People to Save Face
Changing a core belief can feel humiliating, so it is important to give people a way to shift their views without feeling attacked. Framing new information so that they can adapt it without feeling they have lost an argument makes them more open to reconsidering.
Take Small Steps Over Time
Radical shifts in thinking rarely happen overnight. Instead of expecting immediate change, presenting ideas gradually and allowing time for reflection can be more effective. People often need multiple exposures to new perspectives before they start to reconsider their stance.
References
Why Do People Believe Things That Aren't True? by Joe Pierre, M.D.
Why People Ignore Facts by Steve Rathje, Ph.D.
The Psychology of Self-Righteousness by Jonathan Haidt, Ph.D.