Fear the Known

The key to investment is the management of risk, both upside and downside, and one of the things made clear by the last market meltdown is that the tools we have developed to characterize, trade, and manage risk didn’t work as well as we thought — or at least as we said — they would. This doesn’t mean we won’t eventually develop and use new ones, as they are a cornerstone of any complex economy. Hopefully, though perhaps not likely, they will rest on better organizational and conceptual foundations.

But even at its best and most rigorous, risk analysis is but a form of structured thought, and although it is potentially much more powerful than blind estimation and common wisdom, it is still framed by the cognitive characteristics of both the creator and the consumer of the analysis. We simply cannot analyze what we cannot conceive of, and will not analyze what we cannot take seriously.

Among the large literature dealing with this phenomenon, it is interesting to revisit a 2005 paper from the University of Chicago studying the psychological factors behind the evaluation of different risks, including that of climate change. The paper is not specifically about climate change but about risk in general (although we believe climate change will indeed be one of the most influential economic risks of the coming decades), and it serves to illustrate the cognitive patterns that influence, and sometimes interfere with, our evaluation of risks.

Psychologically, we tend to overestimate the risk of events that we have witnessed or been informed of often (frequency), and of those that have left a strong impression on us (salience) — both factors that can be strongly influenced by our social and informational environment. This is one version of the so-called availability heuristic, a subconscious mental shortcut that gives priority to items that are easily recalled from memory, something enhanced by both frequency and vividness.

One of the best examples of this phenomenon during the last ten years has been the post-9/11 focus on airplane security, which went beyond the rational security measures the attacks could have suggested. Low-probability, or even infeasible, but spectacular modes of attack justified security measures like the prohibition of liquids in cabins, and in more than one case the grounding of passengers on account of “speaking suspiciously in Arabic,” while less psychologically gripping possibilities received less concern.

Investment fared no better during the last economic expansion, especially before the dot-com crash. For all the cold analytical skill expected and demanded of experts, imaginations had been gripped, and that deeply affected the framing of all analysis. It is easy to overestimate the diversity of frames in the analytical community: interpersonal connections, the universal availability of the same highly regarded sources, and the fast dispersion of opinions make possible a “global imagination” far more homogeneous than would otherwise have been possible.

Today the same thing is happening. Despite the fact that the global economy recovered from the depression of the 1930s, and even from the extensive physical destruction of WWII, many voices support the idea that there has been an economic paradigm shift, some going so far as to call it “the end of capitalism.” It is undeniable that the economic events of the last few years have not been lacking in salience, so it is natural that they affect any analysis to a very large degree. And, indeed, it would be a failure of imagination to consider impossible any dramatic shift in the economic structure of the world.

But it would also be a failure of imagination to base all scenarios for the future on the direct progression of recent events. As much weight as those events, understandably, have in our thoughts, risks — in the sense of both positive and negative surprises — cover a wider spectrum of possibilities.

In a world of universal access to information and opinion, and with our risk assessments structurally biased by what we can imagine and react to, perhaps a strong imagination capable of envisioning scenarios outside the consensus of the week, and taking them seriously enough to pay them due attention, is a necessary prerequisite for the cold analysis of risks.

Editor’s Note: You might find our regular Nearby Realities section of very, very short science fictional stories useful in that regard.
