It is always a head-scratcher, but every community has to keep asking the question: How well prepared is a company, community, region, or country to recover from a disaster?
A couple of cases in point are the Hurricane Katrina disaster in New Orleans 11 years ago and, more recently, the devastating floods that swept through Baton Rouge and South Louisiana last month.
Researchers and policy makers worry about such questions because they know that some places are more vulnerable to widespread destruction while some are more prepared to bounce back quickly. But which places are more vulnerable and which are more likely to be resilient?
Scientists and policy officials are seeking good answers to such questions using what are known as disaster vulnerability and resilience indices.
“Natural disaster indices are already increasing in number and being used by many public and private organizations, as current environmental risks continue to plague communities across the globe,” said Dr. Igor Linkov, who leads projects implementing resilience management for cyber systems, critical infrastructure, energy, and environment as the leader of the Risk and Decision Science Team and Focus Area at the U.S. Army Engineer Research and Development Center.
Among the efforts that use disaster indices now in place is a $1 billion initiative launched by the U.S. Department of Housing and Urban Development to increase natural disaster resilience across communities.
In the wake of Typhoon Yolanda in 2013, the Philippine government created a program called Reconstruction Assistance in Yolanda, an $8.2 billion plan to recover from the storm and increase resilience.
“Climate change is expected to increase the severity of many types of disasters,” Linkov said. However, because the damaging impacts will not be equal across space and time, “disaster vulnerability and resilience indices can be very useful tools for decision makers to quickly assess and target resources to places of greatest need.” Until now, however, few indices had received empirical validation.
In a paper published in the online version of Risk Analysis, a publication of the Society for Risk Analysis, Linkov and colleagues validated five of the top U.S. disaster indices.
Observing that many common indices use much of the same readily available data but come to somewhat different conclusions, the researchers explored important questions, such as: What are the strengths of each index? Which index is best suited for each particular application? Are there better tools and methods to quantify resilience and vulnerability?
Their paper, “Validating Resilience and Vulnerability Indices in the Context of Natural Disasters,” systematically tests how well the indices perform in explaining common disaster outcomes.
Using observed losses, fatalities, and disaster declarations from the southeastern United States to empirically validate each index, the authors found that disaster indices “are not all created equal,” even though each is thoughtfully substantiated by the literature and theoretically persuasive. On the positive side, four of the five indices performed as predicted in explaining damages. But in explaining fatalities, only three were consistent with theory, and in explaining disaster declarations, only two comported with theory.
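The validation logic can be illustrated in miniature: an index "performs as predicted" when its scores track observed outcomes in the expected direction. The sketch below, with entirely invented county data and a hand-rolled Pearson correlation, shows the kind of check involved; it is an assumption-laden illustration, not the authors' actual statistical method.

```python
# Illustrative sketch only: correlating hypothetical index scores with
# hypothetical observed losses, in the spirit of empirical validation.
# All numbers are invented for demonstration.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical vulnerability scores (0-1) and losses ($M) for five counties.
index_scores = [0.2, 0.4, 0.5, 0.7, 0.9]
observed_losses = [5, 12, 18, 30, 55]

r = pearson(index_scores, observed_losses)
print(f"correlation with losses: r = {r:.2f}")
# A strong positive correlation is consistent with theory: places the
# index rates as more vulnerable should suffer greater losses.
```

The paper runs analogous tests against three outcomes (losses, fatalities, and disaster declarations), which is why an index can pass on one outcome and fail on another.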
As a start toward better indices, the researchers said experts who develop resilience models should specify their intended purposes and use constraints and conduct empirical validation to better guide end users in effectively using the models. To support further improvements in quantifying vulnerability and resilience, the authors propose a Resilience Matrix, a decision analytical framework to help organize metrics of vulnerability and resilience into domains (physical, information, cognitive, social) and stages of disaster management (prepare, absorb, recover, adapt).
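The Resilience Matrix described above is, structurally, a four-by-four grid: domains crossed with stages of disaster management, with metrics slotted into each cell. A minimal sketch of that organization, with invented example metrics (the cell contents here are not from the paper), might look like:

```python
# Hedged sketch of the Resilience Matrix structure: four domains crossed
# with four disaster-management stages. Example metrics are hypothetical.

DOMAINS = ["physical", "information", "cognitive", "social"]
STAGES = ["prepare", "absorb", "recover", "adapt"]

# Each cell holds the metrics relevant to that (domain, stage) pair.
matrix = {(d, s): [] for d in DOMAINS for s in STAGES}

# Invented examples of where individual metrics would be slotted in.
matrix[("physical", "absorb")].append("levee height vs. design flood")
matrix[("information", "prepare")].append("early-warning system coverage")
matrix[("social", "recover")].append("households with flood insurance")

# One use of the framework: empty cells flag aspects of resilience that a
# given index fails to measure at all.
gaps = [cell for cell, metrics in matrix.items() if not metrics]
print(f"{len(gaps)} of {len(matrix)} cells have no metric yet")
```

Framing metrics this way makes coverage auditable: rather than debating a single composite score, a decision maker can see which domain-stage combinations an index leaves unmeasured.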