Trying Too Hard
Thomas McCrae was a young 19th-century doctor still unsure of his skills. One day he diagnosed a patient with a common, insignificant stomach ailment. McCrae’s medical school professor watched the diagnosis and interrupted with every student’s nightmare: In fact, the patient had a rare and serious disease. McCrae had never heard of it.
The professor’s diagnosis required immediate surgery. But after opening the patient up, the professor realized that McCrae’s initial diagnosis was correct. The patient was fine.
McCrae later wrote that he actually felt fortunate for having never heard of the rare disease.
It allowed his mind to settle on the most likely diagnosis, rather than be burdened, like his more-educated professor, by the search for rare diseases. He wrote: “The moral of this is not that ignorance is an advantage. But some of us are too much attracted by the thought of rare things and forget the law of averages in diagnosis.”
A truth that applies to almost every field is that it’s possible to try too hard, and when you do, you can get worse results than people who know less, care less, and put in less effort than you do.
It’s not intuitive, so it can drive you crazy. And it’s hard to pinpoint when it occurs – maybe McCrae’s professor was being appropriately cautious?
But there are mistakes that only an expert can make. Errors – often catastrophic – that novices aren’t smart enough to make because they lack the information and experience needed to try to exploit an opportunity that doesn’t exist.
Two big ones:
1. Being an expert from an era that no longer exists
Investor Dean Williams once said, “Expertise is great, but it has a bad side effect. It tends to create an inability to accept new ideas.”
Henry Ford banned his factory workers from documenting new ideas that didn’t work, because he feared it would create a list of things people refused to try again even when new technologies improved their chances of success. What was impossible in one era might later not only be doable, but the key to success. Ford wrote in his autobiography:
I am not particularly anxious for the men to remember what someone else has tried to do in the past, for then we might quickly accumulate far too many things that could not be done … Hardly a week passes without some improvement being made somewhere in machine or process, and sometimes this is made in defiance of what is called “the best shop practice.”
Marc Andreessen explained how this has worked in tech: “All of the ideas that people had in the 1990s were basically all correct. They were just early.” The infrastructure necessary to make most tech businesses work didn’t exist in the 1990s. But it does exist today. So almost every business plan that was mocked for being a ridiculous idea that failed is now, 20 years later, a viable industry. Pets.com was ridiculed – how could that ever work? – but Chewy is now worth more than $10 billion.
Experiencing what didn’t work in 1995 may have left you incapable of realizing what could work in 2015. The experts of one era were at a disadvantage to a new crop of thinkers who weren’t burdened with old wisdom.
The same thing happens in investing. Michael Batnick has made the point that having experienced a big event doesn’t necessarily make you better prepared for the next big event. Interest rates have mostly fallen for 40 years, so few bond investors – even grizzled veterans – have lived through a sustained rise in interest rates. But, he writes:
So what? Will the current rate hike look like the last one, or the one before that? Will different asset classes behave similarly, the same, or the exact opposite?
On the one hand, people that have been investing through the events of 1987, 2000 and 2008 have experienced a lot of different markets. On the other hand, isn’t it possible that this experience can lead to overconfidence? Failing to admit you’re wrong? Anchoring to previous outcomes?
Of course. It happens all the time. The feeling of power that comes from hard-fought experience is stronger than the urge to change your mind, even when changing it is necessary.
2. Career incentives can push complexity in a field where simplicity leads to the best outcomes
Jason Zweig of the Wall Street Journal says there are three ways to earn money as a writer:
- Lie to people who want to be lied to, and you’ll get rich.
- Tell the truth to those who want the truth, and you’ll make a living.
- Tell the truth to those who want to be lied to, and you’ll go broke.
Some variation of this applies to many fields, especially service industries where someone pays for an expert’s opinion. There can be a difference between knowing what’s right and making a living delivering what you know to be right.
This may be most common in investing, law, and medicine, where “do nothing” is often the best answer but “do something” is the career incentive.
Sometimes it’s amoral; sometimes it’s an innocent form of “cover your ass.” Mostly, though, I think an advisor just feels useless telling a client, “We don’t need to do anything here.” In the quest to be helpful, they add complexity even when none is needed, or when it might backfire.
Years ago Jon Stewart interviewed Jim Cramer. When pressed on CNBC content that ranged from contradictory to inane, Cramer said, “Look, we’ve got 17 hours of live TV a day to do.” Stewart responded, “Maybe you can cut down on that.” He’s right. But if you’re in the TV business, you can’t.
Most of this, I think, is truly innocent. Experts believe their complexity adds value because the reality – that a simple answer is often enough – is too painful to bear, especially in a competitive career full of stress and long hours.
A doctor once told me the biggest thing they don’t teach in medical school is the difference between medicine and being a doctor – medicine is a biological science, while being a doctor is often a social skill of managing expectations, understanding the insurance system, communicating effectively, and so on.
The gap between the two, which exists in many fields beyond medicine, can lead to mistakes that only experts can make, or that only an expert would advise.
“Half of the 15,000 mutual funds in the US are run by portfolio managers who do not invest a single dollar of their own money in their products,” the FT writes. Doctors have their own version, as one article highlights:
Almost all medical professionals have seen what we call “futile care” being performed on people. That’s when doctors bring the cutting edge of technology to bear on a grievously ill person near the end of life. The patient will get cut open, perforated with tubes, hooked up to machines, and assaulted with drugs. All of this occurs in the Intensive Care Unit at a cost of tens of thousands of dollars a day.
What it buys is misery we would not inflict on a terrorist. I cannot count the number of times fellow physicians have told me, in words that vary only slightly, “Promise me if you find me like this that you’ll kill me.” They mean it. Some medical personnel wear medallions stamped “NO CODE” to tell physicians not to perform CPR on them. I have even seen it as a tattoo.
The trouble is that even doctors who hate to administer futile care must find a way to address the wishes of patients and families. Imagine, once again, the emergency room with those grieving, possibly hysterical, family members. They do not know the doctor. Establishing trust and confidence under such circumstances is a very delicate thing. People are prepared to think the doctor is acting out of base motives, trying to save time, or money, or effort, especially if the doctor is advising against further treatment.
It’s a huge problem that affects many fields, and I don’t know a good solution.
But it’s good to acknowledge: There is one set of skills that comes from being an expert, and another that comes from being a novice, unburdened by the weight of experience or incentives. The former is obvious, the latter too easy to ignore.