“Real knowledge is to know the extent of one's ignorance.” — Confucius
One of the most pervasive cognitive biases we encounter and exhibit in our daily lives is the illusion of understanding—the belief that we comprehend complex topics far more deeply than we do.
From politics to science to everyday decision-making, we may feel confident in our knowledge, yet closer examination often reveals gaps, misconceptions, and an overreliance on others' expertise.
A well-known experiment by cognitive scientists Steven Sloman, of Brown University, and Philip Fernbach, of the University of Colorado, illustrates this phenomenon.
When asked how a toilet flush works or how a zipper functions, many people initially express confidence. However, when prompted to provide a detailed, step-by-step breakdown, they quickly realize they know far less than they assumed. This disconnect between perceived and actual understanding is known as the illusion of explanatory depth. We believe we understand things better than we truly do because we rarely have to explain them in detail.
Why the Illusion Persists
Several factors contribute to this cognitive bias:
Reliance on Collective Knowledge
Much of what we think we know comes from social sources—books, articles, experts, and even casual conversations. We absorb conclusions without fully grasping the reasoning behind them, creating a false sense of personal expertise. This is particularly evident in discussions on complex subjects like economics, medicine, science, psychology, and climate change, where many people confidently express opinions despite lacking fundamental knowledge.
Simplified Narratives
The human brain prefers simplicity over complexity. To make sense of the world, we often reduce intricate issues to digestible, oversimplified explanations. This tendency is especially apparent in political and social discussions, where nuanced debates are condensed into slogans or sound bites. These simplified narratives provide a sense of clarity but distort reality.
The Dunning-Kruger Effect
The Dunning-Kruger effect is a cognitive bias in which people with limited knowledge of a subject tend to overestimate their expertise, while true experts are more aware of their limitations. This paradox explains why those who know the least often feel the most confident, while those who possess deep knowledge recognize the vastness of what they still do not understand.
We often think the Dunning-Kruger effect applies only to others, but it can happen to any of us. No matter our levels of expertise, education, and experience, there are many areas we know little about.
Emotional Investment
When we hold strong beliefs about a subject—whether it's climate change, economic policy, or vaccines—we tend to scrutinize our knowledge less critically. Instead of seeking out new or opposing information, we gravitate toward evidence that confirms our views and dismiss anything that challenges our assumptions. This emotional investment in our beliefs strengthens the illusion that we fully understand the topic.
A striking example of this is seen in online political debates, where people confidently argue about intricate subjects like medicine or economics despite lacking formal expertise. Their conclusions are often shaped by tribal loyalty, social influences, hearsay, and the desire to win an argument rather than by genuine understanding.
Because we often gain our understanding of topics from friends, curated social groups, and media, what we believe to be objective understanding is often unknowingly partisan.
How to Overcome the Illusion
Ask “Why?” and “How?”
When faced with a belief or assumption, challenge yourself to explain it in detail. If you struggle to articulate it clearly, you may not understand it as well as you thought.
Seek Opposing Views
Engaging with different perspectives forces us to question our knowledge and consider complexities we might have overlooked. Actively reading opposing viewpoints or debating with knowledgeable people can help refine our understanding and prevent us from falling into the trap of oversimplification.
Adopt Intellectual Humility
Recognizing the limits of your knowledge isn’t a weakness—it’s a strength. True understanding begins with acknowledging what we don’t know.
“I have approximate answers and possible beliefs and different degrees of uncertainty about different things, but I am not absolutely sure of anything and there are many things I don't know anything about.”— Physics Nobel Prize winner Richard Feynman
The Beginner's Guide Paradox: The Power of Elementary Questions
As a university textbook writer, I can attest to an important truth: writing a beginner's guide requires an advanced level of understanding. The writer must address elementary questions that are often ignored, dismissed, or not even considered by so-called experts. When a child asks, "Why is the sky blue?" an adult might respond, "What a silly question" or "Because it is," without knowing the answer. A true educator, however, must be prepared to explain it thoroughly.
A personal anecdote illustrates this point well. My late mother never attended college, but she was infinitely curious and a lifelong learner. I always had her review the early drafts of my academic books for her input and questions. When she read a draft of my textbook on ultraviolet radiation, she said, "I don’t understand what light is." I responded, "That's a big question, and even Isaac Newton said he didn't fully know." Her question was insightful, leading me to add a section explaining the nature of light.
This experience reinforced the idea that seemingly elementary questions expose deeper layers of complexity and, when taken seriously, lead to deeper understanding.