Deciding in a risky world
In a VUCA world (volatile, uncertain, complex, ambiguous), you need to understand volatility and uncertainty. Yet even the CEOs, CFOs, and senior business people I work with are ignorant of some of the basics of risk and probability, which can make them the “sucker” in the game. (In poker, we say that if you can’t spot the sucker at the table, it’s probably you!)
This short post is excerpted from The Science of Organizational Change – the first book on leadership and change to introduce probabilities, risk, and cognitive biases to the business community.
Also, listen to my podcast with Annie Duke (poker pro, and business strategy consultant) on this topic.
Here is the excerpt, from a chapter called Governance and the Psychology of Risk.
This century’s research in psychology and economics has shown how deeply flawed human intuitions are, especially about probability. This lack of intuitive savvy, and of risk education, compromises our ability to make important personal decisions (for example, about health). In the business context, where the stakes are much higher, leaders need to build institutional and decision-making safeguards to protect their businesses (and the communities and environments they affect) from risk-related mistakes.
Seeing patterns that aren’t there
First, humans believe in trends and patterns in events that happen randomly. We believe in “winning streaks,” “hot hands,” being “due for some rain,” or that a run of seven black spins increases the chance that the next spin will be red (the gambler’s fallacy).
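If you’re skeptical, a few lines of Python make the point. (This is a simplified wheel with only red and black, no green zero, purely to keep the arithmetic clean.) After any run of seven blacks, the next spin is still a coin flip:

```python
import random

random.seed(42)
SPINS = 1_000_000

# A simplified wheel: red or black with equal probability (no green zero).
spins = [random.choice("RB") for _ in range(SPINS)]

# Find every run of seven blacks and tally the colour of the very next spin.
red_after, total_after = 0, 0
for i in range(7, SPINS):
    if all(s == "B" for s in spins[i - 7:i]):
        total_after += 1
        red_after += spins[i] == "R"

print(f"P(red | seven blacks in a row) = {red_after / total_after:.3f}")  # ~0.500
```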
Not enough data
Second, we place our faith in small samples. Our exquisite pattern-finding apparatus causes us to leap to conclusions about the world based on flimsy data. An institutional version of this sampling error in business is the “case study,” which, although it provides a textured analysis of one situation, cannot be extrapolated to other businesses without caution.
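A quick sketch shows just how noisy small samples are. The population below is invented purely for illustration; the lesson is in the spread of the estimates, which only shrinks as samples get large:

```python
import random
import statistics

random.seed(1)

# A hypothetical population: 10,000 firms whose annual growth averages 5%,
# with a 20% standard deviation (numbers invented for illustration).
population = [random.gauss(0.05, 0.20) for _ in range(10_000)]

for n in (3, 30, 3_000):  # a "case study", a modest survey, a large sample
    estimates = [statistics.mean(random.sample(population, n))
                 for _ in range(1_000)]
    print(f"n={n:>5}: mean-growth estimates swing with stdev "
          f"{statistics.stdev(estimates):.3f}")
```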
Trying to “get even”
Third, in troubled situations, our attitudes toward risk change. Rather than hunkering down until the storm passes, we tend to get more aggressive and risk seeking. Suppose I offer you the choice of two boxes. One contains $1,000, and the other contains a surprise: 50 percent of the time it contains $2,000, and the other 50 percent of the time it contains nothing. You take the sure thing; in fact, you might take $950 rather than a fifty-fifty chance at $2,000, giving up some economic value for some certainty. This is called risk aversion.[1] When Kahneman and Tversky produced their famous paper, Judgment Under Uncertainty,[2] they showed that in neutral and positive situations (things are going well, or as planned), risk aversion dominates. When things become difficult, however, risk aversion gives way to risk-seeking behavior. This explains why people who are “stuck” stay much longer in casinos than people who are “up.” When their stocks go down, people risk-seek to “get it back.” When change projects go awry, even sensible leaders can try desperate measures.
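The arithmetic behind the boxes is plain expected value; this sketch just makes the $50 “price of certainty” explicit:

```python
# Expected value of each option in the two-box example.
sure_thing = 1_000
gamble_ev = 0.5 * 2_000 + 0.5 * 0        # = 1,000: same EV as the sure box

# Accepting $950 instead of the gamble trades $50 of expected value
# for certainty; that gap is the price of risk aversion.
certainty_price = gamble_ev - 950
print(sure_thing, gamble_ev, certainty_price)   # 1000 1000.0 50.0
```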
Gross errors in relative probabilities
Fourth, humans are particularly bad at relative probabilities. After 9/11, people left the skies and took to the roads, sending fatalities soaring, because road travel is substantially more dangerous than air travel. People will pay more for antiterrorism insurance than for regular travel insurance, despite the fact that terrorism is vanishingly improbable compared with less emotionally charged events such as losing luggage or missing a flight. This problem is especially acute where qualitative judgments of risk are used, such as in the project risk register described previously.
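A back-of-the-envelope check makes the asymmetry vivid. The rates below are rough, order-of-magnitude figures, not exact statistics, and the shifted mileage is hypothetical:

```python
# Rough, order-of-magnitude fatality rates (illustrative, not exact figures):
road_rate = 7.0    # deaths per billion passenger-miles, US roads (approx.)
air_rate = 0.07    # deaths per billion passenger-miles, scheduled airlines

print(f"Driving is ~{road_rate / air_rate:.0f}x deadlier per mile")   # ~100x

# So shifting, say, 50 billion passenger-miles from air to road
# (a hypothetical figure) would add on the order of:
shifted = 50
print(f"~{shifted * (road_rate - air_rate):.0f} extra deaths")
```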
“It could never happen to me”
Fifth, when dealing with tiny probabilities that carry catastrophic consequences, people sometimes make irrationally conservative decisions. This has had a dramatic effect on parenting norms in recent decades. Modern parenting is dominated by “abduction monsters,”[3] despite the fact that the chance of abduction by a stranger, especially a nonfamily member, is minuscule[4] (about 2,000 times less likely than dying in a car accident). Paradoxically, in business, the opposite sometimes happens in these low-probability, big-catastrophe scenarios: leaders pretend there is no risk at all. This is called the zero-risk bias, the bias that turns small risks into zero risk or, as I like to call it, the “it could never happen to us” bias. Big events with significant but low probabilities are completely discounted, like two of Wall Street’s oldest and finest declaring bankruptcy within a 13-month period, or housing prices collapsing 75 percent in some areas. Yet events such as these make or break countries and corporations (taking families and communities along with them).
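In expected-loss terms, the “it could never happen to us” shortcut is just rounding a small probability down to zero, and that rounding can hide an enormous exposure. A minimal sketch, with invented numbers:

```python
# A hypothetical tail risk: a 2% annual chance of a $10 billion collapse.
p_catastrophe = 0.02
loss = 10_000_000_000

print(f"Expected annual loss: ${p_catastrophe * loss:,.0f}")   # $200,000,000

# Zero-risk bias rounds the 2% down to 0%, so the $200M-a-year exposure
# simply vanishes from the decision.
print(f"Under zero-risk bias:  ${0 * loss:,.0f}")               # $0
```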
Going home broke
Sixth, we misunderstand “risk of ruin.” Suppose you have $1 million in capital. I offer you a repeated coin toss that pays $1.20 for every $1.00 you stake when you win, and loses the stake when you lose. This is obviously massively profitable, what is called plus EV (expected value). How much should you wager on the first coin toss? If you “bust,” you lose the ability to make future massively profitable investments. Even in this very large plus-EV situation, only about 8 percent of the capital should be risked at any one time to avoid risk of ruin. This calculation (called the Kelly Criterion) applies to change projects and to how much capital should be invested in risky ventures. Lack of attention to risk of ruin brought down Long Term Capital Management, a firm run by some of the biggest brains in investment banking, and Enron (allegedly some of the smartest brains in oil and gas).
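Here is the Kelly arithmetic for that coin toss, plus a small simulation of what over-betting does to the bankroll. This is a sketch under idealized assumptions (independent, identically structured bets), and the final_bankroll helper is mine, written for illustration:

```python
import random

# Kelly fraction f* = (b*p - q) / b, with net odds b = 1.2 (win $1.20 per
# $1.00 staked), p = q = 0.5. Despite the huge edge, f* is only ~8.3%.
b, p, q = 1.2, 0.5, 0.5
f_star = (b * p - q) / b
print(f"Kelly fraction: {f_star:.3f}")          # 0.083

def final_bankroll(fraction, tosses=200, start=1.0):
    """Bet a fixed fraction of current capital on each +EV coin toss."""
    capital = start
    for _ in range(tosses):
        stake = capital * fraction
        capital += stake * b if random.random() < p else -stake
    return capital

random.seed(7)
for f in (f_star, 0.5, 0.9):                    # Kelly vs. over-betting
    runs = [final_bankroll(f) for _ in range(2_000)]
    ruined = sum(r < 0.01 for r in runs) / len(runs)
    print(f"f={f:.2f}: {ruined:.0%} of runs end with <1% of capital left")
```

Run it and the pattern is stark: betting the Kelly fraction compounds steadily, while staking half or most of the bankroll on each toss ruins almost every run, even though every single bet is plus EV.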
References on risk
[1] When I offer this puzzle in programs, there are always executives who object, saying they would always take the highest expected value. Try the thought experiment with a sure $9,000,000 versus a fifty-fifty chance at $20,000,000 to test your own risk aversion.
[2] Kahneman, D., & Tversky, A. (1974, September 27). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
[3] See Hoffman, J. (2009, September 13). Why can’t she walk to school? The New York Times, p. ST1.
[4] Despite the prevalence of this fear, and the extraordinary measures taken to prevent its occurrence, there are about 100 abductions per year in the United States from a population of 16 million children. About 75 percent of those are by estranged spouses, leaving roughly 25 stranger abductions per 16 million children. The risk of stranger abduction is thus just greater than one in a million, approximately the same as being killed by lightning.