Italian psychologist Massimo Piattelli-Palmarini was once asked why people keep making the same mistakes. His answer:
Inattention, distraction, lack of interest, poor preparation, genuine stupidity, timidity, braggadocio, emotional imbalance, ideological, racial, social or chauvinistic prejudices, and aggressive or prevaricatory instincts.
Let me add some more:
Incentives can tempt good people to push the boundaries farther than they’d ever imagine. Financial boundaries, moral boundaries, all of them. It’s hard to know what you’ll consider doing until someone dangles a huge reward in your face, and underestimating how adjustable the boundaries can become when rewards rise is a leading cause of terrible decisions.
Tribal instincts reduce the ability to challenge bad ideas because no one wants to get kicked out of the tribe. Tribes are everywhere – countries, states, parties, companies, industries, departments, investment styles, economic philosophies, religions, families, schools, majors, credentials. Everyone loves their tribe because there’s comfort in knowing other people who understand your background and share your goals. But tribes have their own rules, beliefs, and ideas. Some of them are terrible. But they remain supported because no one wants to argue with a tribe that’s become part of their identity. So people either willingly nod along with bad ideas, or are so blinded by tribal loyalty that they can’t see how bad the ideas were to begin with.
Ignoring or underestimating the full range of potential consequences, especially tail events that seem rare but have catastrophic effects. The most comfortable way to think about risk is to imagine a range of potential consequences that don’t seem like a big deal. Then you feel responsible, like you’re paying attention to risk, but in a way that lets you remain 100% confident and optimistic. The problem with low-probability tail risks is that they’re so rare you can get away with ignoring them 99% of the time. The other 1% of the time they change your life.
Lots of little errors compound into something huge. And the power of compounding is never intuitive. So it’s hard to see how being a little bit of an occasional jerk grows into a completely poisoned work culture. Or how a handful of small lapses, none of which seem bad on their own, ruins your reputation. The Great Depression happened because a bunch of things that weren’t surprising on their own (a stock crash, a banking panic, a bad farm year) occurred at the same time and fed on each other until they grew into a catastrophe. It’s easy to ignore small mistakes, and even easier to miss how they morph into huge ones. So huge ones are what we get.
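A minimal sketch of why the compounding of small errors is so unintuitive (this example is mine, not from the essay): a process that works 99% of the time per step, repeated many times, fails far more often than intuition suggests.

```python
# Each small step rarely fails -- but failures compound across steps.
per_step_success = 0.99

for steps in (1, 10, 100, 365):
    overall = per_step_success ** steps
    print(f"{steps:>3} steps at 99% each -> {overall:.1%} overall success")
```

At 100 steps the overall success rate is already down around 37%, and after a year of daily 99%-reliable steps it is under 3% – the same arithmetic behind a string of individually forgivable lapses ruining a reputation.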
An innocent denial of your own flaws, caused by the ability to justify your mistakes in your own head in a way you can’t do for others. When other people’s flaws are easier to spot than your own, it’s easy to assume you have few, if any, flaws – which makes the ones you do have more likely to cause problems.
Probability is hard. Black-and-white outcomes are more intuitive. It’s not easy to put effort into something you’re only 60% sure about. But people want to try their best, so it’s more comfortable to assume that what you’re working on has a 100% chance of success. That overconfidence breeds stubbornness, excessive risk-taking, and ignoring signs that you’re wrong until it’s too late.
Underestimating the need for room for error, not just financially but mentally. Ben Graham once said, “The purpose of the margin of safety is to render the forecast unnecessary.” If you know how hard forecasting is, you know how important that quote is. And room for error has two sides: whether you can survive an imperfect outcome financially without getting forced out, and whether you can survive it mentally without getting scared out. Bad decisions happen when there’s only one acceptable version of the future.
Underestimating adaptation, both present and future, leaving you convinced that history will repeat itself and bitter when it doesn’t. A lot of regrettable pessimism happens when you find a bad thing and assume it’ll stay bad forever. But things change. People adapt and figure out better ways. Same thing in the other direction: nothing too good stays that way for long because it breeds complacency and catches competitors’ attention.
Being influenced by the actions of people who are playing a different game than you are. The idea that advice can be good for one person and terrible for another is rarely obvious. Taking your cues and advice from people with different goals, abilities, and desires than you is an easy road to misery. But it’s common, because smart people you look up to tell you it’s good advice. The number of things that are true for everyone in finance is small; everything else is just figuring out how much risk you want to take and what you want out of life, which is different for everyone.
An inability to know how you’ll respond to stress causes you to take risks you think you can handle but will actually regret when they turn against you. Everyone thinks they have a high risk tolerance when things are going great. Then things turn down and they say, “Ah, you know, actually, this hurts more than I thought.” When thinking about future risks you tend to think in isolation. If I think about a 40% market decline, I imagine everything in the world being the same except stocks being 40% cheaper. That doesn’t feel so bad. But the reason stocks fall 40% is probably that people think the world is falling apart – a brutal recession, a pandemic, a political meltdown, whatever. The stress of that is much harder to imagine until it happens.
Too much extrapolation of past successes leads to overconfidence, stubbornness, and a narrow view of future risks. Buddhism has a concept called beginner’s mind, which is an openness to trying new things and studying new ideas, unburdened by past assumptions, like a beginner would. Having a little past success is the enemy of beginner’s mind, because doing well reduces the incentive to explore other ideas, especially when those ideas conflict with your proven strategy. As Jason Zweig puts it: “Being right is the enemy of staying right because it leads you to forget the way the world works.”
Wrongly assuming that the information you have at your disposal paints a complete picture of what you’re dealing with. Secretary of Defense Robert McNamara demanded that everything about the Vietnam War be measured. He was obsessed with statistics. One day McNamara was going through his data when head of special operations Edward Lansdale interrupted and said, “There’s something missing here.” McNamara asked what it was. “The feelings of the Vietnamese people. You can’t reduce that to a statistic,” Lansdale said. But it mattered more than anything. That realization applies to many fields.
Misreading the cause of others’ successes or failures in a way that tempts you to overemphasize parts of their strategy when attempting to copy what they did. Part of this is because most people don’t (and can’t) know exactly what caused their own successes and failures. So even if you ask them for a roadmap, they might lead you down a different path than the one they took themselves. This is especially true because people like good, clean, easy stories about cause and effect, meaning the story you get might be totally different from the complex set of circumstances that caused an outcome you’re trying to copy or avoid.