Blunders in decision-making

October 6, 2009 - Managing For Society, The Manila Times

On September 25, the Philippine Atmospheric, Geophysical and Astronomical Services Administration (Pagasa) said, “Residents living in low-lying areas and near mountain slopes in areas affected by the southwest monsoon and those under signals No. 1 and No. 2 are alerted against possible flash floods and landslides.” Although I had read the alert and I live in the low-lying eastern side of Pasig, I made no preparations against flash floods at all. The day after, the first floor of my house, along with two cars, was buried waist-deep in muddy water. It took the family all of three days just to clear the mud from the house.

Why didn’t I prepare? I didn’t really look carefully at what the weather alert said. I simply looked at the strength of the rainfall, recalled the closest experience I had with rainfall of similar strength, adjusted a little downwards because my street is among the highest in our subdivision, and concluded: “My house is not at risk.” The tendency of people to make intuitive decisions this way has been called “anchoring and adjustment” by Daniel Kahneman, a psychologist who won the Nobel Prize in Economics in 2002 for his work on the psychological patterns people use when making decisions.

Kahneman, who collaborated with the late Amos Tversky, discovered that humans often do not make decisions rationally, at least not in the way they are typically depicted in economic models. Because of people’s limited calculating and reasoning abilities, they rely on so-called heuristics, or simple rules of thumb, for making decisions. In my case above, the anchor was my recollection of similar rainfall in the past.

The problem with such decision-making heuristics is that they have been shown to be quite unreliable, often producing decision blunders because they lead to under- or overestimation of risks. Humans tend to judge the likelihood of future events by what they readily remember about past events, irrespective of new information. This has been called the “availability bias.” Since I had never experienced my house flooding from non-typhoon rain, my anchor was flawed to begin with, and the downward adjustment only made it worse.

What about flood insurance? I don’t have it, naturally. Richard Thaler and Cass Sunstein, in their book Nudge: Improving Decisions About Health, Wealth, and Happiness, explained that people who live in flood plains where floods have not occurred in recent memory tend not to buy flood insurance.

Interestingly, people who know someone who has experienced flooding tend to buy flood insurance, irrespective of the actual risk of flooding where they live. The availability bias is at work in both instances.

Decision-making can be made less prone to bias by taking a more comprehensive and balanced view of the situation. In considering flood insurance, for example, one should study the surrounding topography thoroughly, especially nearby waterways and hills. In deciding what to do about a flood alert, one should think through the actual impact of the risk should it materialize, and act accordingly. I was too lazy to do any of these, and I have paid dearly. Will I do better in the future? I hope so, but I honestly doubt it. I’m human.