Fool Me Three Times And I Give Up
The most significant part of the financial crisis that peaked ten years ago this month was that it took place within eight years of both the dot-com crash and 9/11. Fool me once, shame on you. Fool me twice, shame on me. Fool me three times – oh, hell, why even try anymore?
Nassim Taleb’s The Black Swan was published in April 2007, 90 days before the financial crisis began. It sold well for two reasons: It’s a good book, and – as authors have known for centuries – the most effective writing puts into words something people intuitively feel but haven’t yet articulated. The Black Swan’s message – improbable outlier events run the world – wasn’t just interesting; it resonated. “Yes, that’s my life! That’s what I’ve experienced over and over again for the last eight years!”
My biggest lesson of the last 10 years is that we’ve clung to that idea. Probably a little too much.
What I remember most about the financial crisis are the predictions from smart, credible people forecasting what would happen next.
Getting hit by something painful hurts, but fight-or-flight instantly kicks in and you focus on what’s going to happen next.
Hours after Pearl Harbor was attacked in 1941, the Navy planted mines in San Francisco Bay. Trenches were dug along the West Coast. That Japan would follow up with an even greater attack on the mainland was treated as a matter not of if, but when.
Two things happen to predictions after you get hit with something big and unexpected:
You extrapolate what just happened, but imagine it happening with even greater force and consequence.
You forecast with great conviction, despite the original event being improbable and something few, if any, predicted.
Which is exactly what happened in 2008.
The forecasts published soon after the financial crisis were as certain as they were dire.
There was the front-page WSJ story about a Russian professor predicting that America would disintegrate after a civil war, with pieces of the country being adopted by China, Russia, and Mexico. He put the probability at 55%.
There were predictions that the Dow would become equal in price to an ounce of gold, implying something like stocks falling 90% or gold increasing 10-fold.
Or that oil would reach $300 a barrel.
There are always, and will always be, quacks. But after a painful surprise we start taking them seriously. Yes, it’s possible any of these predictions could have come true. Hindsight is 20/20. But we weighted their potential with more conviction than we should have.
Charles Mackay writes in Extraordinary Popular Delusions and the Madness of Crowds:
During the great plague of London, in 1665, the people listened with avidity to the predictions of quacks and fanatics. Defoe says, that at that time the people were more addicted to prophecies and astronomical conjurations, dreams, and old wives’ tales than ever they were before or since. Almanacs, and their predictions, frightened them terribly.
What actually occurred in the decade after the financial crisis – average market returns, tame volatility, pretty average economic growth, falling unemployment – is an outcome you would have been laughed out of the room for suggesting in 2008, even though all of it was just a prediction of normalcy. We fell for the irony of thinking extreme tail events were probable, predictable, and visible – the literal opposite of a Black Swan.
Confidence in predicting outliers has gone both ways. Part of the reason there are so many great products and so few great businesses within startups is because there is free-flowing confidence in something becoming the next Facebook, where early user growth is the only thing that matters, revenue and unit economics be damned.
Like Pavlov’s dogs, we lost a learned response after the triple-hit trauma of the 2000s: accepting that average, normal, typical results are what you should expect to happen most of the time – practically all of the time – even if outliers are more impactful.
Maybe that’ll do a 180. After a decade of calm and average growth I’m starting to see more forecasts of low-and-slow growth continuing indefinitely.
But Daniel Kahneman put it best when he described how we should think about surprises:
What you should learn when you make a mistake because you did not anticipate something is that the world is difficult to anticipate. That’s the correct lesson to learn from surprises: that the world is surprising.