Some Thoughts on Extreme Events and Uncertainty, Part 4

Aphorisms and Axioms

After a long, very long hiatus, I am back!

Let’s go through what we have looked at so far: the asset growth effect; the foundational principle of capital cycle analysis, a concept that reaches beyond value versus growth investing; and parts 1, 2 and 3 of my series on extreme events and uncertainty, a series which ends with this post.

Spread the word and share this post, and if you have not yet subscribed, please do so!

And now to part 4 of “Some Thoughts on Extreme Events and Uncertainty”.

***

27. Roubini refers to the Great Recession as a white swan, saying, “Crises, we argue, are neither the freak events that modern economics has made them seem nor the rare “black swans” that other commentators have made them out to be. Rather, they are commonplace and relatively easy to foresee and to comprehend. Call them white swans.” Noting how spoilt the developed world has been in the post-World War II expansion, he says, “In most advanced economies, the second half of the twentieth century was a period of relative, if uncharacteristic, calm, culminating in a halcyon period of low inflation and high growth that economists dubbed the “Great Moderation.” As a result, mainstream economics has either ignored crises or seen them as symptoms of troubles in less developed economies. To gain a more expansive way of viewing and understanding crises of the past, present, and future, one must go back to an earlier generation of economists.” 

Didier Sornette, whose Why Stock Markets Crash is a work of remarkable clarity, poses another paradigm: the Dragon King Theory, which refers to high-impact events that are outliers to their peers, originating as they do from reflexive processes within complex systems, processes which amplify them to extreme levels; positive feedback loops, for example, can send valuations to extreme highs or into a death spiral. Studying these processes makes such events more predictable. Like Wucker, Sornette argues that most of the threats we face are indeed predictable to some degree. A critical mass of people knew a pandemic was coming, for example, though not when, or how broadly and deeply it would affect the world. At the most basic level, we know that there will be a pandemic after Covid-19, that there will be euphoric valuations and depressive crashes in the future, that a war between great powers is inevitable, and so on, because 6,000 years of history tells us to expect these things. What is perhaps impossible to predict is “when”. What we do know is that certain events are regular, frequent, predictable and high impact. A perspective that treats human history as if it began in 1945 yields distorted expectations of the future and an instinct to deny crises, at a point in human history when we are more capable than ever of dealing with them.
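To make the amplification mechanism concrete, here is a toy simulation of my own devising, not Sornette’s model: each period’s return is ordinary noise plus a feedback term proportional to the previous return. With the feedback switched off, moves stay modest; with strong feedback, the same noise is amplified into moves far outside what the quiet regime would suggest, which is the anatomy of a dragon king.

```python
import random

def peak_move(feedback: float, steps: int = 1_000, seed: int = 0) -> float:
    """Largest cumulative move of a toy return process in which each
    period's return is Gaussian noise plus `feedback` times the
    previous period's return (a crude positive feedback loop)."""
    rng = random.Random(seed)
    prev_ret, level, peak = 0.0, 0.0, 0.0
    for _ in range(steps):
        ret = rng.gauss(0.0, 0.01) + feedback * prev_ret
        level += ret
        peak = max(peak, abs(level))
        prev_ret = ret
    return peak

print(f"no feedback:     largest move {peak_move(0.0):.2f}")
print(f"strong feedback: largest move {peak_move(0.9):.2f}")
```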

Roubini, Sornette and Wucker’s theory of the foreseeability of many high-impact events stands against the common feeling that such events are unforeseeable. Indeed, even Taleb’s work on black swans suggests an element of foreseeability: we may not know which black swans will come, but we do know that they will come, and as the world becomes increasingly connected, the frequency of black swans increases. Historical processes are more open than we expect, episodes of extreme events are more frequent than we fathom, and their impact greater than we imagine. What distinguishes Wucker’s gray rhino theory is the added layer of “ignored” in spite of flashing signals.

28. Known and knowable and yet ignored: the knowledge of these things is not constitutive of the actions taken by government officials, investors, and others who are in a position to prepare for them and who yet respond sluggishly. This brings to mind a key distinction made by Aristotle:

“But since we speak of knowing in a twofold sense (for both the person who possesses knowledge but does not use it and the person who uses it are said to know), one will differentiate the person who possesses knowledge but does not attend to it – and even attends instead to the things he ought not to do – from the person who possesses knowledge and attends to it. For the latter [if he still acts wrongly] seems bizarre, but if he does not attend to his knowledge, he does not seem bizarre. … For we see in possessing-and-not-using a diversity of disposition, so that in a way it is possessing-and-not-possessing …. Uttering the statements based on knowledge signifies nothing. … Incontinent people must be supposed to speak in just the way that actors do.”

Possessing information is not the same as using it as part of a decision-making process. The market as a collective may possess all the information in the world and still fail to use vital information efficiently, as Berkshire Hathaway’s PetroChina trade shows. By way of example, the market ignores rising populist anger over inequality and racism and, consequently, the political risks that may derail even the best-run company; a historical analogy is the breakup of Standard Oil. The International Monetary Fund (IMF) recently said it believes markets have generally ignored the increasing frequency of natural disasters over the last 50 years, a view shared by James Anderson of Baillie Gifford. What counts is not the information itself but whether the information is used in making a decision. A man may be warned off cigarettes by his doctor, but when he picks up a cigarette as he leaves the doctor’s, his decision to smoke is taken without the doctor’s warning being constitutive of it, and thus of his actions. Thus, it is possible to profit from what is well known if what is well known is ignored. Everyone had access to PetroChina’s annual report, and Berkshire Hathaway operated at a “disadvantage” to trading desks and hedge funds with armies of analysts, but by acting on a well-known piece of information and exercising a longer time horizon than the market at that moment, Berkshire Hathaway was able to profit handsomely.

29. Action is often waylaid by institutional imperatives such as a culture that sees change as aberrant and to be warded off; executive compensation packages tied to short-term performance; a focus on shorter-term measures, such as earnings per share; and other incentives that reward short-term thinking even when the actors are aware that long-term thinking is the best mental framework for their decisions. Cognitive biases that developed to bring succour and reduce stress work against us, pushing negative thoughts to the back of the mind. So pervasive is this knowing-but-not-acting-on-the-knowledge that threats are not only ignored or shilly-shallied over; they rouse managers to destroy long-term shareholder value or pass up value-creating opportunities. In chasing short-term performance, managers make the enterprise riskier in the long term and more likely to fail. In 2006, a comprehensive survey of 400 chief financial officers found: “The results show that the destruction of shareholder value through legal means is pervasive, perhaps even a routine way of doing business. Indeed, the amount of value destroyed by companies striving to hit earnings targets exceeds the value lost in recent high-profile fraud cases”. 80 percent of the CFOs admitted that they would reduce discretionary spending on potentially value-creating activities in their pursuit of short-term earnings targets, and 39 percent said they would give discounts to customers to prod them to spend in the current quarter rather than the next.

30. To free our minds from the illusion of a linear march of time, we must see uncertainty from a Mencian perspective. Amy Webb, a quantitative futurist and author of The Signals Are Talking: Why Today’s Fringe Is Tomorrow’s Mainstream, offers a fertile approach to thinking about uncertainty: the “Axes of Uncertainty”, in which external and internal uncertainties are explored and placed on opposing ends of axes, each quadrant is characterised by describing how that state of the world would look if it came about, and the exercise is used to discover actions to take. This framework throws open existential risks that were hidden and opportunities that were ignored. In thinking of the future as a series of possible paths rather than a straight line, the mind becomes looser, more flexible, more prepared. The typical superforecaster learns to think in terms of options, of possible scenarios; ours is to bring this optionality to our approach to uncertainty. The benefit of this wide-ranging thinking, this willingness to probe the future for the worst and the best outcomes, is that we can prepare for gray rhinos before they become serious threats.
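As an illustration only, here is a minimal sketch of the quadrant-building step. The two uncertainties and their poles are my hypothetical placeholders, not Webb’s own examples; the real work lies in describing each resulting state of the world and the actions it would call for.

```python
from itertools import product

# Two hypothetical uncertainties, each with two opposing poles.
axes = {
    "interest rates": ("stay near zero", "rise sharply"),
    "populist pressure on big business": ("fades", "intensifies"),
}

(axis_x, poles_x), (axis_y, poles_y) = axes.items()

# Each combination of poles is one quadrant of the 2x2 grid.
for n, (pole_x, pole_y) in enumerate(product(poles_x, poles_y), start=1):
    print(f"Quadrant {n}: {axis_x} {pole_x} AND {axis_y} {pole_y}")
```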

31. The lesson of history is that even under radical uncertainty there are threats and opportunities we can prepare for, even though we may not know when or to what extent they may occur. Ours is to explore uncertainty for threats and opportunities and to take bold and decisive action. The most elementary thing we must remember is the permanence of cycles. The ancients understood this: from the Mexican Day of the Dead to the Buddhist practice of maraṇasati, they knew the benefits of reminding themselves of the cycle of life and death. Tertullian claimed that during his triumph, the victorious general would have someone, while holding a crown above his head, whisper in his ear the words, “Respice post te. Hominem te memento” (“Look after you [to the time after your death] and remember you’re [only] a man.”).

***

Interesting Stuff:

Earlier in the series, I mentioned that our notions of what is normal in terms of interest rates are overly influenced by the recent past. Sam McBride of New Constructs, in a piece titled “The Fed Is Irrelevant: Low Interest Rates Are the New Normal”, suggests, and I believe him, that low interest rates are here to stay.

Beth Kindig is a wonderfully lucid and smart thinker on tech investing, and she suggests a framework for “Playing Defense with Cloud Software Stocks”.

We have all heard of loss aversion, and the classic formulation usually goes something like this: “Suppose you have $100 and are offered a gamble involving a series of coin flips. For each flip, heads will increase your wealth by 50%. Tails will decrease it by 40%. Flip 100 times. Should you take the bet?”. Because the expected value of each flip is positive, a 5% gain on average, the textbook answer is “yes”, and refusing the bet is usually chalked up to loss aversion. Ergodicity economics, championed by Ole Peters of the London Mathematical Laboratory and by Nassim Taleb, suggests that this loss aversion may be perfectly rational: what matters to an individual is the growth of wealth over time, not the average across many parallel worlds, and the time-average growth rate of this gamble is negative. Read Jason Collins’ “Ergodicity Economics: A Primer” before heading off to Ole Peters’ site. I plan on discussing what ergodicity means for your portfolio and how you should think about portfolio allocation.
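A small simulation, a sketch of my own rather than anything from Peters or Collins, makes the gap concrete, assuming the usual multiplicative reading of the gamble (heads multiplies wealth by 1.5, tails by 0.6): the ensemble average grows about 5% per flip, yet the typical individual path decays at roughly 5% per flip.

```python
import random

HEADS, TAILS = 1.5, 0.6  # heads: +50% of current wealth, tails: -40%

# Ensemble (expected-value) view: the average multiplier per flip is
# 0.5 * 1.5 + 0.5 * 0.6 = 1.05, i.e. ~+5% per flip across many gamblers.
ensemble_growth = 0.5 * HEADS + 0.5 * TAILS

# Time-average view: a single gambler's long-run growth per flip is the
# geometric mean, (1.5 * 0.6) ** 0.5 ≈ 0.949, i.e. roughly -5% per flip.
time_average_growth = (HEADS * TAILS) ** 0.5

def final_wealth(rng: random.Random, flips: int = 100, start: float = 100.0) -> float:
    """Wealth of one gambler after `flips` fair coin flips."""
    wealth = start
    for _ in range(flips):
        wealth *= HEADS if rng.random() < 0.5 else TAILS
    return wealth

rng = random.Random(42)
paths = sorted(final_wealth(rng) for _ in range(10_000))
below_start = sum(w < 100.0 for w in paths) / len(paths)

print(f"ensemble growth per flip:  {ensemble_growth:.3f}")
print(f"time-average growth/flip:  {time_average_growth:.3f}")
print(f"median final wealth:       ${paths[len(paths) // 2]:.2f}")
print(f"share of paths below $100: {below_start:.1%}")
```

The expected value is driven by a handful of extraordinarily lucky paths; the median gambler ends up with pennies, which is the sense in which refusing the bet may be no bias at all.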
