Why You Believe The Things You Do

I remember reading an article years ago about a father in Yemen who lost a son to starvation, only to have another child fall dangerously ill. Desperate, he turned to tribal elders who recommended a folk remedy: Shove a burning stick through the sick child’s chest to drain the illness. The father agreed.

Asked about the horrific procedure, the father said: “People said burn him in the body and it will be O.K. When you have no money, and your son is sick, you’ll believe anything.”

When you have no money, and your son is sick, you’ll believe anything.

That is such a powerful statement, and some version of it applies to everyone.

Here’s a universal reality: What you believe to be true is influenced by how much you want it to be true. The more something helps you deal with uncertainty, the lower the bar is for you to believe it’s true.

It’s been like that forever. Describing the Great Plague of London, Daniel Defoe wrote in 1722: “The people were more addicted to prophecies and astrological conjurations, dreams, and old wives’ tales than ever they were before or since.” You’ll believe just about anything that offers hope when a plague is killing a quarter of your neighbors.

Visa founder Dee Hock once said, “We are built with an almost infinite capacity to believe things because the beliefs are advantageous for us to hold, rather than because they are even remotely related to the truth.”

Think about the investor whose strategy is so tied to their identity they can’t let it go when it stops working.

Or the business that so desperately wants to be good it overlooks all the ways it’s bad.

Charlie Munger once said:

If you turn on the television, you find the mothers of the most obvious criminals that man could ever diagnose, and they all think their sons are innocent. The reality is too painful to bear, so you just distort it until it’s bearable.

Even things that appear to be hard science fall into this category. There’s a thing in the legal world called Gibson’s Law, which states that “For every PhD there is an equal and opposite PhD.” No matter what argument you’re trying to make, you can find a qualified expert witness who’s willing to make it, under oath, for $500 an hour.

People can be led to believe and defend almost anything, because the goal of a belief is often not to discover what’s true – it’s to justify past actions, or protect your reputation, or provide hope when it’s lacking, or to maximize your income, or to signal to others that you belong to the tribe.

So there’s one reason you believe the things you do: A belief’s allure can eclipse its truth.

A couple other reasons beliefs tend to form:

Memories of past events are filtered. You keep what makes sense and throw out the confusing details.

In the late 1940s, two psychologists – Gordon Allport and Leo Postman – came up with what they called leveling and sharpening effects in memory.

It goes something like this: In any big event – a car accident, or a terrorist attack, or a recession, pandemic, etc. – there can be literally millions of variables that you experience and observe. You see people’s emotions, you hear different opinions, you read about different perspectives. There’s no way you can remember all the details; there are just too many. So in your recollection you emphasize some memories (sharpening) and discard others (leveling).

People tend to sharpen details that make a good story and fit their existing views, and level the ones that are confusing, painful, or contradict pre-existing beliefs.

So here we have a belief-generating system: What you think is true is heavily based on what you’ve experienced, and you remember the parts of your experiences that make good stories, confirm stereotypes, and connect dots between other experiences.

Think of how complex the 2008 financial crisis was. There were billions of people making complex, emotional decisions at the same time. No one could keep track of everything. So in hindsight people tend to believe what makes sense to them, in simple narratives – the Fed ruined the economy, bailouts were unfair, wealth was destroyed. All of those could be true and believed by one person, but argued against by another person who experienced the same economy but remembers different details. Maybe the Fed saved the economy, bailouts were the best option, and it was the buying opportunity of a lifetime. Depends what you remember.

The dangerous thing is that beliefs based on experience seem evidence-based. But when we’re overwhelmed with observations in a complex world, we cherry-pick the most attractive evidence to appease our simple, story-loving minds.

It is far easier to fool yourself into believing a falsehood than to admit a mistake. Changing your mind is rarer than it should be, which leads us to cling to false beliefs.

Physicist Max Tegmark writes in the book This Will Make You Smarter:

The core of a scientific lifestyle is to change your mind when faced with information that disagrees with your views, avoiding intellectual inertia, yet many of us praise leaders who stubbornly stick to their views as “strong.” The great physicist Richard Feynman hailed “distrust of experts” as a cornerstone of science, yet herd mentality and blind faith in authority figures is widespread. Logic forms the basis of scientific reasoning, yet wishful thinking, irrational fears, and other cognitive biases often dominate decisions.

You can agree with every word written there and still struggle to change your mind – or take seriously those who do – for two reasons.

One, when you change your mind you can feel like all the hard-fought effort you put into establishing your previous beliefs was wasted. Even a little pain from that reality can be enough to persuade you to stick to the original belief.

Two, when you change your mind, it can be hard to take the new belief seriously – the very act of moving from one belief to another is proof that the new belief may itself be short-lived, especially in the eyes of others.

Both recognize an important truth: Beliefs are not just about what we know; they are social signals that offer clues about how we established our beliefs, how confident we are in our own judgment, and how reliably we can pass truth on to other people. The quirk is that those signals can be counterintuitive: We should want an expert who is willing to change their mind, but what we actually want is someone who’s confident enough to never have to.

A lot of times we’re not interested in truth – we’re interested in the elimination of uncertainty, and that fact alone causes us to believe things that have little relation to reality.