Humans aren’t very good at forecasting. Most of us, at least. But we have deeply rooted beliefs about what we should fear, what we should hope for, and what we think will happen next.
It’s a dangerous combination. Few people have tackled it as wisely and eloquently as Dan Gardner.
Dan’s books, The Science of Fear, Future Babble, and Superforecasting: The Art and Science of Prediction (co-written with Philip Tetlock), taught us a lot about how we fool ourselves into thinking the world works the way we want it to, and what steps we can take to make more rational decisions about the future.
We recently asked him six questions.
CF: Has broader access to information made us better or worse at identifying risk?
Both. For those who question themselves, think carefully, and research deeply, this is a golden age. Thanks to technology, our ability to find quality information and fresh perspectives is spectacular and growing rapidly. But at the same time, and thanks to the same technologies, we also have the power to quickly and easily find junk that confirms whatever ridiculous thing we think, share that misinformation, and spend all our time connected exclusively to people as deluded as we are. There is no technological fix. What is needed is a rich understanding of psychology, how it renders our thinking vulnerable, and what we must do to improve our perceptions of reality. With that, we can accomplish great things. Without it, we are merely apes with smartphones, nukes, and hair loss.
CF: Optimism as the default long-term forecast: Smart or oblivious?
Neither theory nor experience gives us any reason to think we can accurately predict the long term. I think we should face that fact squarely. So on what basis should we proceed? The same way we approach any unknown: with hope inspiring us to keep going forward in pursuit of dreams and fear urging us to proceed cautiously and be prepared for disaster. This isn’t new. We’ve been doing it ever since the first Stone Age human said, “Ever wonder what’s on the other side of those hills?”
CF: What’s the most widespread, irrational fear we have?
It’s probably terrorism. There is a canyon between the probability of any one person being a victim of terrorism — it’s down there with lightning strikes and death by bathtub misadventure — and the number of people who fear they or a family member will be a victim of terrorism. Now, Nassim Taleb would object that measuring the risk exclusively with body counts in the past ignores the potential for attacks on a vastly different scale in the future. And he’s right, of course. So let’s assume that next year we experience an attack on the scale of 9/11 (which was an order of magnitude bigger than any attack before). Does that significantly change the risk to you and your family? No. It’s still tiny. That remains true even if we imagine considerably more nightmarish scenarios. (Please note I’m only talking about terrorism as a risk to any one individual. Terrorism as a collective risk, which is the risk governments must deal with, is another matter.)
So why is that gap so big? Psychology is the big one. Terrorism’s purpose is to terrify and it could not be better designed to do that. But the media and politicians share responsibility. Whatever their intentions — and I think they are usually honorable — they tend only to amplify already exaggerated perceptions of the risk. And in doing so they are the unwitting allies of the terrorists.
CF: The desire to forecast often overrides evidence of our ability to forecast. Is this part of being human, or will we ever learn?
Yes. And no. I can’t tell you how many times I’ve heard people who laughed at past failed forecasts, scoffed at the very idea of long-term prediction, and then told me, with great solemnity and confidence, exactly how the Chinese economy will perform over the next three decades and what the price of oil will be in 2045.
These tendencies flow from elementary psychology. The psychology isn’t likely to change, so the tendencies aren’t either. But we’re not slaves to our cognitive wiring. Again, the key is self-awareness: understand how you think and keep watch for the pitfalls, and you may, possibly, at least reduce the number of tiger traps you stumble into.
CF: What aren’t we talking enough about right now?
Of course the answer is whatever I happen to be interested in at the moment. And that is history. So much decision-making is based explicitly or implicitly — usually implicitly — on the past. And yet we seldom stop to consider how we use and misuse history. We are particularly bad at realizing that while the history that happens to come to mind when we make a decision may be helpful, it may also be flawed or incomplete. And there may be reams more history we are unaware of and don’t think to look for, à la Daniel Kahneman’s WYSIATI (“what you see is all there is”). There’s so much more we could do with the past.
Hey, someone should write a book about that….
CF: What have you changed your mind about in the last decade?
Inequality. I used to think of it exclusively in justice terms — meaning that if someone made money via free and fair exchange, how that money was distributed should be no one’s concern but its happy owner’s. Studying psychology and society forced me to realize that whether or not that view is correct even in terms of justice, it ignores how our species thinks and how we live together. The evidence is overwhelming that inequality has to be a collective concern. This doesn’t mean I’ve gone Bolshevik. Clearly, societies with considerable inequality can nonetheless be happy, trusting, cohesive, and cooperative. (Americans weren’t exactly living in communes during the Eisenhower era.) But there are limits. And they must be respected, or we will pay the price.