Why You’re Not Prepared For Disasters (And What To Do About It)

The authors of Wharton Digital Press’s newest release, The Ostrich Paradox, explain why we underprepare for disasters, how to strategize better for crises, and the key flaw in most organizational risk-management plans.

 

Wharton Magazine: Why is the book titled The Ostrich Paradox?

Illustration by Vahram Muradyan

Robert Meyer (professor of marketing; Frederick H. Ecker/MetLife insurance professor; co-director, Risk Management and Decision Processes Center): Ostriches are often characterized as hapless birds who bury their heads in the sand whenever danger approaches. In fact, they’re highly astute escape artists who use their great speed to overcome their inability to fly. The core thesis of the book is that in much the same way ostriches are limited in their defensive actions because they can’t fly, we need to recognize that when we make decisions, our biases are part of our cognitive DNA. In the same way the ostrich has adapted to risk by taking into consideration its physical limitations, we humans, when thinking about risk, need to develop policies that take into consideration our inherent cognitive limitations. We suggest that we need to learn to be more, not less, like ostriches, if we are to be better prepared for disasters. Hence the paradox.

 

WM: We often see post-disaster coverage about what people should have done in a particular situation: put up their storm shutters, evacuated, purchased earthquake insurance and so on. Why do people tend to disregard these warnings?

Howard Kunreuther (professor of decision sciences, business economics and public policy; professor of operations, information and decisions; James G. Dinan professor; co-director, Risk Management and Decision Processes Center): The reasons vary from person to person, but they’re explained by six major decision-making biases we discuss in the book. People have a hard time foreseeing future consequences (myopia), are too quick to forget losses from the past (amnesia), are inclined to think losses will occur to others rather than themselves (optimism), are too inclined to prefer inaction over action when faced with risks and maintain the status quo (inertia), fail to base decisions on all the information that is made available about a risk (simplification), and are overly prone to imitate the behaviors of others who exhibit the same biases (herding).

 

WM: Of these six cognitive biases, which are most common in terms of impacting our decisions?

Meyer: Their relative importance varies from situation to situation, but if there’s one that is most fundamental, it’s excessive optimism. We have a hard time fully anticipating the physical and emotional toll that disasters can inflict, and we’re too prone to believe that disasters happen to other people in other places at other times. A second bias that can create serious problems is myopia. Individuals tend to focus on short time horizons, so they forgo protective measures with long-term benefits, such as investments in loss reduction, because of their high up-front costs.

 

WM: Could you explain the herding bias and how it impacts our decision-making?

Kunreuther: When disasters threaten—be it a fire in a crowded room or a hurricane striking the coast—we’re often unsure of what action to take to reduce the risks of being injured or possibly dying. In such cases, we often imitate the behavior of others: In other words, we follow the herd. Sometimes herding instincts reflect an unconscious desire to stay with others when faced with fear; other times, they reflect a more conscious belief that others are more informed than we are. Either way, it’s a problem-solving technique that can lead to fatal consequences if misapplied.

 

“Ostriches are depicted as hapless birds who bury their heads when danger approaches,” says Robert Meyer. “In fact, they’re highly astute escape artists.”

WM: Where do most policy makers, firms, and organizations err when engineering preparedness solutions for the population, and how does The Ostrich Paradox differ in its proposed approach?

Meyer: Most modern approaches to risk management start by analyzing the objective likelihood and consequences of risks faced by individuals or communities, then design measures that could mitigate these risks—and hope people choose to implement them. For example, people in areas prone to earthquakes might be provided with checklists for how to prepare for such events and urged to buy earthquake insurance. But since people often don’t adopt these measures, we argue, effective risk management has to proceed in the reverse order, starting with an understanding of why people may not choose to adopt risk-reduction measures and then designing approaches that work with, rather than against, our natural biases.

 

WM: You provide a behavioral risk audit—a series of guided questions that should be considered by a planning team or organization. Can this be applied to the individual, or do you have other tips we can use to counteract our individual biases?

Kunreuther: We envision that the behavioral risk audit can—and should—be used as a source of guidance not just for communities, but also for individuals and households. We focus in the book on how individuals can improve their decision-making processes. The audit may also be particularly effective as a risk-management tool for households, since it should foster a discussion between family members about the biases we’re most prone to have and suggest measures for overcoming them that the household can agree to implement.

 

WM: Can you give us examples of how being given default options could both help and harm our decision making?

Meyer: When unsure how best to prepare for a disaster, we often choose the option that requires the least active mental effort—such as accepting the basic deductible in an insurance policy, or deciding to stay at home rather than evacuate. Unfortunately, in many cases, accepting these “defaults” can have tragic consequences. In our book, we suggest that this propensity to look for easy ways out in decision making can sometimes be flipped on its head by making safety something one needs to actively opt out of rather than opt into. As an example, one might overcome the hesitancy of people in flood-prone areas to buy flood insurance by providing it automatically with the payment of property taxes each year and allowing people who would actively prefer not to have it to apply for a refund of the premium.

 

WM: How can we embrace protective action as a society?

Kunreuther: This, of course, is the greatest challenge we face, and we see it as a long-run goal. The behavioral risk audit offers a tool that can help individuals overcome the psychological biases that often impede preparedness, such as failing to see the future benefits of protective investments and believing that disasters are things that happen to others. Many of the truly long-run risks we face, however—such as those posed by climate change—are even more difficult to deal with, since they require collective rather than just individual action. We argue that achieving effective collective action requires us not only to address individual biases, but also to embrace a series of guiding principles of societal-level safety, such as demanding that safety and long-run preparedness be a top priority in government planning and insisting that social equity be a consideration in the formation of policies.

 

Published as “Heads Up, Not In The Sand” in the Spring/Summer 2017 issue of Wharton Magazine.
