The Most Dangerous Kind of Learning
Thomas McCrae was a young 19th-century doctor still unsure of his skills. One day he diagnosed a patient with a common, insignificant stomach ailment. McCrae’s medical school professor watched the diagnosis and interrupted with every student’s nightmare: in fact, the patient had a rare and serious disease. McCrae had never heard of it.
The professor’s diagnosis required immediate surgery. But after opening the patient up, the professor realized that McCrae’s initial diagnosis was correct. The patient was fine.
McCrae later wrote that he actually felt fortunate never to have heard of the rare disease. It let his mind settle on the most likely diagnosis rather than be pulled into a hunt for rare diseases, as his professor’s was:
The moral of this is not that ignorance is an advantage. But some of us are too much attracted by the thought of rare things and forget the law of averages in diagnosis. There is a man who is very proud of having diagnosed a rare abdominal disease on several occasions. But as for some years he made this diagnosis in every obscure abdominal condition, of course being nearly always wrong, one cannot feel that he deserves much credit.
This applies to so many things in business and investing.
Case studies dig into the causes of outlier events. They can be helpful. But being armed with knowledge of hyper-specific causes leads you to start making rare diagnoses everywhere you look. Which can backfire.
“Uber, but for X.”
“The next Google.”
“This reminds me of 1999.”
“The next financial crisis.”
“1929 all over again.”
“Our Sheryl Sandberg.”
Having a seed of a rare-but-powerful trait in your head makes you want to put it to practical use. The irony is that the only reason you became interested in it is that it was rare, if not unprecedented. But once you learn about it, you fool yourself into seeing it four times a day. It’s not that you shouldn’t look for the next Google, or fear the next Great Depression. It’s that these are staggeringly precise things, so pursuing them pushes your mind away from more probable outcomes.
More important is a rule of thumb: the more extreme the outcome, the more complex the factors that led to it. Case studies can be dangerous because we study extreme examples, and extreme examples, given their complexity, are often the least applicable to other situations.
Xerox has survived by pursuing new business lines. That’s a case study. But AIG practically killed itself pursuing new business lines (derivatives). Same for GE (financing) and Toshiba (nuclear power). So what do we learn?
Blockbuster failed because it didn’t see the distribution convenience of Netflix as serious competition. That’s a case study. But Wells Fargo survived because it didn’t see the distribution convenience of subprime underwriting as serious competition. So what’s the takeaway?
Jason Zweig of the Wall Street Journal once said that when people keep making mistakes, it’s not always because they didn’t learn their lesson. It’s because they learned too precise a lesson.
And that, I think, is the most dangerous kind of learning.
We should (and do) study successes and failures. And like McCrae’s professor, I want to think about black swans, because they shape the world. But you can make more progress in a complex world by focusing on broad lessons rather than hyper-specific ones. Things like:
- The timelessness of building trust and solving people’s problems.
- The power of brands.
- The necessity of moats.
- How people respond to surprises.
- How people get in their own way.
- The relentlessness of competition.
- How much dedication and grit founders need.
- How common yet unpredictable change can be.
McCrae summed it up: “It is the simple things which require to be kept constantly before us and which must form the foundation of our diagnostic ability.”