Making Sense vs. Being Right

People tend to care about something making sense more than they care about something being right.

It’s why we’re bad at forecasting, why we’re easily persuaded, and why the economy is driven by stories.

It’s not your fault. The world is complicated and everyone’s trying to fit what’s going on into the context of their own life experiences.

Two big things cause the gap between perception and reality.

1. The most important events are the hardest to sample, so extrapolating their lessons can be misleading.

There have been two world wars in the last century. But no one would say, “the average world war lasts five years,” or “the average world war kills 60 million people.” No forecaster would say we’re overdue for a world war based on historic trends. A sample size of two can’t tell us what to expect next.
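To see how little two data points constrain a forecast, here's a minimal sketch: a 95% confidence interval computed from a sample of two. The durations are rough illustrative figures, not precise history.

```python
import math

# Rough durations of the two world wars, in years (illustrative figures).
durations = [4, 6]

n = len(durations)
mean = sum(durations) / n
# Sample standard deviation (n - 1 degrees of freedom).
sd = math.sqrt(sum((x - mean) ** 2 for x in durations) / (n - 1))
se = sd / math.sqrt(n)

# Critical t-value for 95% confidence with 1 degree of freedom.
t_crit = 12.706
low, high = mean - t_crit * se, mean + t_crit * se

print(f"mean = {mean:.1f} years, 95% CI = ({low:.1f}, {high:.1f})")
# The interval runs from roughly -7.7 to 17.7 years -- it even includes
# negative durations, which is the statistics conceding it knows nothing.
```

With one degree of freedom, the t critical value is enormous, so any honest interval from two observations is too wide to support a forecast.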

The point may be obvious with world wars. But use a slightly more common case and people start drawing all kinds of false conclusions.

Studying things with inadequate samples causes more delusion than intentional deceit. That’s because the ease of explaining whatever data happens to be in front of you combines with the intellectual cover that statistics provide.

The most important events – things that are counterintuitive, disruptive, and swayed by emotion more than calculation – tend to share a few traits that make gathering a good sample of data hard.

Daniel Kahneman is one of the best social scientists of our time. Yet he once admitted:

Like most research psychologists, I had routinely chosen samples that were too small and had often obtained results that made no sense. I taught statistics and knew how to compute the sample size that would reduce the risk of failure to an acceptable level. But I had never chosen a sample size by computation. Like my colleagues, I had trusted tradition and my intuition in planning my experiments and had never thought seriously about the issue.

That is terrifying.
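The computation Kahneman describes skipping is a standard power analysis. As a hedged sketch using the normal-approximation formula for comparing two proportions (the 50% vs. 55% effect size is a hypothetical example, not from the source):

```python
import math

def required_n(p1, p2):
    """Per-group sample size to detect a difference between two proportions,
    via the standard normal-approximation formula, fixed at a two-sided
    5% significance level and 80% power."""
    z_alpha = 1.96    # two-sided alpha = 0.05
    z_beta = 0.8416   # power = 0.80
    effect = abs(p1 - p2)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Detecting a modest real-world effect (50% vs. 55%) demands far more
# data than intuition suggests:
print(required_n(0.50, 0.55))  # -> 1562 per group
```

The punchline is the size of the number: intuition says a few dozen subjects should reveal a five-point difference; the arithmetic says over fifteen hundred per group. Trusting intuition here is exactly how underpowered studies get run.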

Two things happen when we don’t have good samples for important events. One is that we fool ourselves and are surprised when the world works differently than our data suggested it would. The second is that we can easily fool others, because if you look hard enough you can find a flawed set of data that shows whatever you want it to show.

This is a big problem, and there is no solution to it other than giving yourself room for error, not only in your actions but in your beliefs.

2. We underestimate how much more powerful stories are than statistics.

A friend once complained that everything Nassim Taleb has written is a rewording of a Philosophy 101 textbook. I don’t know if that’s true, but I’m not bothered if it is, because Taleb has done a better job explaining those concepts than any Philosophy 101 author before him. And history makes clear that the winner isn’t the person who discovers an idea, but the one who sells it most persuasively.

Stories are more powerful than statistics because the most believable thing in the world is whatever takes the least effort to reconcile with your own life experiences.

People go astray when they forget that storytelling is like exponential fuel: a great idea told poorly can end up a fraction as powerful as an OK idea told persuasively.

The CEO who understands everything about their field except the most important thing – rallying employees behind a mission and vision.

The financial advisor who aced the CFA exam but can’t maintain clients’ faith during a downturn.

The doctor who knows the cure but can’t get patients to stick to a plan.

Even contrarians eventually need to be persuasive. James Grant said: “Successful investing is about having people agree with you … later.”

A flaw that frustrates many people is not realizing that virtually every job is a sales job, especially as you move up the ladder.
