Psychological Paths of Least Resistance

When faced with a problem, rarely do people ask, “What is the best, perfect answer to this question?”

The more efficient question is often, “What answer to this question can I obtain with the least amount of effort, sacrifice, and mental discomfort?”

The psychological path of least resistance.

Most of the time that’s fine. You use a little intuition and common sense and find a practical answer that doesn’t rack your brain or bog you down with details.

Other times the easy answers lead you down a nasty path of misunderstanding, ignorance, and blindness toward risk.

A few paths of least resistance that everyone is susceptible to at some point:

1. The quick elimination of doubt and uncertainty.

Most people could not get out of bed in the morning if they were honest about how much of their future is unknown, hangs by a thread, or can be pushed in another direction by the slightest breeze. The solution is to eliminate doubt and uncertainty the moment they enter your head.

Uncertainty feels awful. So it’s comforting to have strong opinions even if you have no idea what you’re talking about, because shrugging your shoulders feels reckless when the stakes are high.

Life is complex, complex things are always uncertain, uncertainty feels dangerous, and having an answer makes danger feel reduced. It’s an easy path of least resistance.

If you were an adult in 2000 you probably had at least some vision of what the future would look like. Maybe even a vague vision of the next 20 years. But everyone was blind to 9/11, the 2008 financial crisis, and Covid-19 – the three risks that were both massive and unpredictable.

Then when those events happened people quickly moved to eliminate the uncertainty they brought.

Terrorist attack just happened? It’s definitely going to happen again, soon.

Recession coming? It won’t affect my industry and will be over by Q4 and interest rates will bottom at 3.42%.

Pandemic arrived? Two weeks to slow the spread.

No matter how wrong these answers might be, they feel better than saying, “I have no idea what’s going to happen next.”

2. Single-cause explanations for complex events.

The easiest way to explain a complex event is to assign a single cause.

This person built that company.

That law caused this problem.

That recession was caused by this one political party.

But that’s rarely how it actually works, especially for the big, complicated things that we pay attention to.

Bill Gates and Steve Jobs were once interviewed on stage. The host pointed out that each man had built a defining company of his generation.

Gates responded: “Steve and I will always get more credit than we deserve, because otherwise the story gets too complicated.”

Otherwise the story gets too complicated. That’s the truth, and it’s the same for so many big things in the world.

One of the reasons people discount the possibility of big, crazy events is that we assume big events need big causes. A huge recession, in our minds, must be caused by a huge, rare risk hitting us at once.

But most big things – companies, events, careers, pandemics, etc – are just a bunch of small, random, boring things that compound at just the right time and explode into something bigger and more powerful than anyone imagined.

Every recession has multiple causes.

Every economic boom has multiple causes.

Every successful company has hundreds or thousands of people who made it all happen.

Even Covid was the result of a dozen or so innocuous events that were individually easy to dismiss, but when taken together shut the world down for a year.

In 1932 the head of the NYSE was asked who caused the Great Depression and said, “Ask the one hundred and twenty-three million people of the United States.” Everyone played a role.

Big events have lots of causes.

3. The justification of your own actions and the judgment of others’.

It is far easier to spot other people’s mistakes than your own: we judge others based solely on their actions, but when judging ourselves we have an internal dialogue that justifies our mistakes and bad decisions.

If I see you make a bad decision, I just look at the results and say, “That was dumb, that was a mistake.”

But if I make a bad decision, I can tell myself, “Look, it was a calculated bet, and actually it didn’t turn out that bad, and actually this is the outcome I wanted, and I’m a good person who means well,” and so on.

Everyone is blind to 99.9% of what happens in other people’s heads, but you are keenly aware of 100% of what’s going on in your own.

And since most people are either good at filtering out their crazy thoughts or struggle to articulate their nuanced ones, we are blind to the vast majority of what other people are thinking and how they justify their actions.

Take an extreme example: I read a book recently called Evil. It looks at why people do terrible things. One of its points is that most evil people do not think they are doing something wrong, because they can justify their actions in a way most victims aren’t privy to:

Evil usually enters the world unrecognized by the people who open the door and let it in. Most people who perpetrate evil do not see what they are doing as evil. Evil exists primarily in the eye of the beholder, especially in the eye of the victim.

Evil is but rarely found in the perpetrator’s own self-image. It is far more commonly found in the judgments of others.

The crazy part of this is that it makes everyone underestimate the odds that they themselves could do something evil.

Primo Levi, the Holocaust survivor, said, “Monsters exist, but they are too few in number to be truly dangerous. More dangerous are the common men, ready to believe and to act without asking questions.” Ordinary people are brainwashed into believing outrageous things that, in their minds, justify outrageous actions.

I remember reading a book about World War II through the eyes of German soldiers. One said he couldn’t understand why the American soldiers were so angry at Germany – couldn’t they see that the Nazis were there to save Europe, that they were the good guys? It’s astounding to read, both because it’s crazy and because it offers a glimpse into how evil actions are justified in the eyes of those committing them.

4. The belief that your own field of vision is the same as everyone else’s.

Nothing is more persuasive than what you’ve seen and experienced firsthand. And what you’ve experienced can be so persuasive – so influential in shaping your beliefs about the world – that you can’t understand why other people who have had different experiences than you might have different beliefs.

It can be so hard to contextualize someone else’s life experiences that you mistake different opinions for a lack of intelligence. You also underestimate how your own beliefs and actions would change if your life had gone down a different path – especially if you were born into a different generation, country, or socioeconomic class.

A common criticism of my book The Psychology of Money is that it’s written through the eyes of a college-educated American male born in the latter part of the 20th century. People will write to me and say, “You are naive. If you lived in Argentina you wouldn’t think long-term investing in the stock market is a good idea,” or “You would have a different perspective if you were buried in student debt.”

They’re 100% correct. I’m a product of my own bubble, like everyone else. What’s interesting to me is that the critics never seem to flip the logic around – it’s obvious to them that I am blind to their world, but rarely do they realize that they are just as blind to my world (both the good and bad parts).

It’s easy to say, “You don’t understand my world.” But what you’ve really uncovered is that both of us are blind to each other’s worlds, usually in equal parts. We are both guilty of implicitly assuming the version of the world we have experienced firsthand is the one that’s more accurate, more instructive, and more predictive of the future than someone else’s version.

I love this related quote from Daniel Kahneman: “You are more likely to learn something by finding surprises in your own behavior than by hearing surprising facts about people in general.”

5. The desire to supplant statistics with stories.

“People would rather believe than know,” said biologist E.O. Wilson.

I think another way to phrase it is that people desire stories more than statistics. They need a story they can tell themselves, not just a fact they can memorize.

Part of that is good. The gap between what works in a spreadsheet and what’s practical in real life can be a mile wide. This usually isn’t because we don’t know the statistics. It’s because real-life stories are so effective at showing us what certain parts of a statistic mean.

Part of it can be dangerous, when broad statistics are dismissed in favor of powerful anecdotes.

Government agencies publish literally thousands of different economic data points, everything from unemployment to GDP growth to the historical cost of chicken legs, bone-in. It’s all free and easy to read.

How often do those data sites change ordinary people’s opinions about the economy?

It rounds to never.

What changes people’s minds and reaffirms their beliefs are conversations they’ve had with people close to them, social media, and cable news. Each is very good at telling stories, especially ones that provoke emotion or are easy to contextualize to people’s own lives.

When confronted with a pile of dull facts and a pile of compelling anecdotes, the anecdotes are always the path of least resistance for your brain to cling to.

6. Outsourcing your hard decisions to the opinions of pundits, consultants, and experts of various qualifications.

I heard a consultant once say that the main value they add to clients is letting middle managers justify decisions to senior managers. “We’re doing this project because the consultants told us to do it.”

It’s not the expertise. It’s not even the cover-your-ass aspect that lets you pawn your decision off on someone else. It’s just the idea that someone else besides you thought about this problem and came to an answer, and it’s easy to assume other people are smarter than you are, especially when you don’t know them well enough to understand the limits of their intelligence.

There are so many similar examples. People justify their political beliefs based on the recommendations of pundits and their financial decisions based on the opinions of analysts.

If you’re alone with an opinion, it’s easy to push back on it with doubt, questions, and uncertainty. But if even one other person steps in and says, “I think this is the answer,” boom, now you have the power of groupthink and intellectual inertia to reinforce your thoughts.

7. Overconfidence as a way of shielding against the uncomfortable fact that the world is driven by probability, not black-and-white certainties.

Most people get that certainties are rare, and the best you can do is make decisions where the odds are in your favor. They understand you can be smart and end up wrong, or dumb and end up right, because that’s how luck and risk work.

But almost no one actually uses probability in the real world.

You can say, “there’s a 20% chance of that,” or “there’s a 90% chance of this.”

But most of what people care about is, “Were you right or wrong?” Did it happen, or did it not?

The world is driven by probability. But it’s so much easier to think in black and white, yes or no, right or wrong.

It’s the same with our own decisions.

So many investors say, “higher risk means higher reward.” They view it as black and white. If I take more risk, I’ll get more reward.

But that’s not how it works at all. Most of the time, taking more risk means you are most likely to earn lower returns, with a smaller chance that you will earn fantastic returns to compensate.
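To make that asymmetry concrete, here is a toy simulation sketch (my illustration, not from the essay), assuming two hypothetical assets with the same 7% expected annual return but very different volatility, and normally distributed log returns. Volatility drag pushes the risky asset’s typical (median) outcome down, while a thin tail of lucky paths supplies the fantastic returns that hold the average up.

```python
# Toy sketch: same expected annual growth, different volatility.
# Assumptions (not from the essay): 7% expected return, lognormal annual
# returns, 30-year horizon, 100,000 simulated paths per asset.
import numpy as np

rng = np.random.default_rng(0)
years, paths = 30, 100_000
expected_growth = 1.07  # assumed 7% expected annual growth for both assets

for label, sigma in [("low risk", 0.10), ("high risk", 0.40)]:
    # choose the log-return mean so E[annual growth] is identical for both
    mu = np.log(expected_growth) - sigma**2 / 2
    log_wealth = rng.normal(mu, sigma, size=(paths, years)).sum(axis=1)
    wealth = np.exp(log_wealth)  # growth multiple after 30 years
    print(f"{label:>9}: median {np.median(wealth):6.2f}x  "
          f"mean {wealth.mean():6.2f}x  "
          f"top 1% {np.quantile(wealth, 0.99):7.1f}x")
```

Under these assumptions the low-risk asset’s median outcome lands far above the high-risk asset’s, even though their averages match: the risky asset’s average is carried by a small fraction of spectacular paths, which is exactly the “most likely lower, small chance of fantastic” shape described above.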

It’s a painful thing to contemplate. Far more appealing, and the path of least resistance, is to lean into overconfidence – If I take more risk, I’ll get more reward.

Being honest about the odds that your opinions and forecasts will actually come true can be so discouraging and uncomfortable that the warm blanket of denial and overoptimism becomes home to most people’s beliefs.

For more:

Little Ways The World Works

Five Lessons From History