Issue 71: Disaster Creep – Why Disaster and Catastrophe are the Norm - not the Exception
By Michael D. Larrañaga, P.E., Ph.D.
The frequency, intensity, and severity of disasters worldwide are increasing, and this "disaster creep" has profound effects on modern society.1,2,3,4 In 2011 the U.S. experienced 14 separate billion-dollar weather disasters, and 2012 brought 11 more. The economic consequences of these disasters are growing exponentially as a result of increased population densities and the migration to coastal megacities that foster global trade. As global commerce continues to sustain the world's economies, the world becomes increasingly networked. As networks grow in both size and flow capacity, they tend to self-organize toward efficiency, optimization, and decreasing redundancy, and this has profound effects on the magnitude of disasters.
Recent weather-related disasters that support this premise are the 2005 Hurricanes Katrina and Rita ($85 billion), Japan's 2011 earthquake-tsunami-nuclear meltdown (>$235 billion), the 2008 Sichuan, China earthquake ($29 billion), and Superstorm Sandy, which is expected to become the costliest disaster in U.S. history. The Chernobyl meltdown ($200 billion), Space Shuttle Columbia ($13 billion), Prestige oil spill in Spain ($12 billion), and the BP/Macondo incident in the Gulf of Mexico ($20 billion-$40 billion) are the costliest non-weather-related accidents in world history.5,6 Other examples include terrorist attacks, earthquakes, electrical blackouts, transportation system collapses, industrial accidents, food shortages, epidemics, and political and social upheaval. The challenges presented by these disasters threaten global supply chain resilience (banking, agriculture, natural resources, energy, etc.). These catastrophes are intensified by the modern evolution of highly connected (networked), optimized, and cost-efficient systems that have eliminated the surge capacity and error tolerance once found in complex systems.
Many modern energy, industrial, transportation, health care, telecommunications, and political systems are highly vulnerable to small changes that propagate and develop into major disasters. The systemic tendency to unravel, decay uncontrollably, or move from order to disorder (e.g., disaster) is a characteristic of all natural and human-made systems and is captured by the second law of thermodynamics as "entropy."
Before one can overcome this entropic tendency of all systems to unravel, one must understand the nature of system failure across multiple disciplines. In addition, the decay of networked systems is non-Newtonian: networked disasters unfold faster, or differently, than simple extrapolation predicts. Humans have a tendency to view disasters as outliers, or uncommon events, yet experience shows that catastrophes are the norm and stable systems are the outliers. Human nature forces people to view the world through naïve "disaster" lenses. Perhaps this is the human condition: perpetual optimism. Ted Lewis, author of Bak's Sand Pile, maintains that life is a series of passages from catastrophe to catastrophe with inconsequential periods of calm in between, thus staking the claim that catastrophe is the new normal.3
Predictable unpredictability is a characteristic of most complex systems. This seems contradictory, but it can be seen in natural systems. Natural catastrophes such as epidemics, forest fires, earthquakes, insect foraging, avalanches, population shifts, and extinctions all present a similar curve that can be obtained by plotting the probability of an event occurring (y-axis) against the consequences of the event occurring (x-axis).3 One example is an avalanche in the Swiss Alps. There is a high probability that an avalanche will occur, but the overwhelming majority of avalanches are small and inconsequential. At the other extreme, there is a very small chance that a large and catastrophic avalanche will occur; yet catastrophic avalanches occur every year around the world. The curve obtained by plotting avalanche frequency vs. magnitude exhibits power law behavior and can be used to explore variations in avalanche size across space and time, as well as contributing factors, providing valuable information to avalanche managers who strive to prevent catastrophic avalanches.7
Forest fires offer another example: it is not possible to predict where the next fire will ignite, since lightning strikes are random, but the frequency and magnitude of forest fires follow a power law.
Power laws are defined by the curve 1/x^q (x > 0, q > 0) and have the shape shown in Figure 1. These patterns are found throughout natural, man-made, and random systems. The exponent, q, is of great importance. The horizontal axis can be anything of interest, including storm damages, casualties of war, or elapsed time between accidents. Hazards with power law exponents below 1.0 are high-risk, while hazards with exponents above 1.0 are low-risk. As such, the exponent has important implications for risk-informed decision-making. Table 1 gives some common examples of power law exponents, the x-axis values, and their associated risk classifications. From this, one can conclude that forest fires present a high risk to society, as do hurricanes, earthquakes, wars, and epidemics.3,8
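The exponent-based risk classification above can be sketched in a few lines of code. The snippet below fits q in P ~ 1/x^q by least squares on log-log axes; the event sizes are invented placeholder values for illustration only, not data from this article or its references.

```python
import math

# Hypothetical event sizes (e.g., acres burned per fire); placeholder
# values for illustration, not data from the article.
sizes = [1, 1, 1, 2, 2, 3, 5, 8, 13, 40, 120, 500]

# Empirical exceedance curve: P(X >= x) for each distinct size.
n = len(sizes)
distinct = sorted(set(sizes))
exceedance = [(x, sum(1 for s in sizes if s >= x) / n) for x in distinct]

# Fit P ~ 1/x^q by least squares on log-log axes:
# log P = -q * log x + b, so the fitted slope is -q.
xs = [math.log(x) for x, _ in exceedance]
ys = [math.log(p) for _, p in exceedance]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
q = -slope

# Per the article's rule of thumb: exponents below 1.0 are high-risk.
risk_class = "high-risk" if q < 1.0 else "low-risk"
print(f"fitted exponent q = {q:.2f} ({risk_class})")
```

A heavy right tail (many small events, a few very large ones) pulls the fitted exponent down, which is exactly what flags a hazard as high-risk under this rule.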
Table 1: Power law exponents of various incidents and disasters.3
PML (probable maximum loss) risk is determined by multiplying the probability values (y-axis) by the percent consequence values (x-axis), where percent consequence is defined as the percent of the network lost due to a single random fault (i.e., risk = probability x consequence).9 By this measure, the west coast analysis presents tremendous PML risk (Figure 3): almost any random fault in the west coast crude delivery system can result in 100% loss of the network, dwarfing the PML risk of a random fault in the entire national network.
So if one wanted to increase the resiliency of the entire national crude oil network, the west coast would be a good place to start: adding redundancy there would decrease optimization but increase resiliency against catastrophic failure.
Michael D. Larrañaga is with Oklahoma State University
3rd Quarter 2010 -- Wildfire: Past, Present and Future – Ronny J. Coleman and Kate Dargan
Summer 2008 -- Planning for Disasters – Donald L. Schmidt
For questions concerning delivery of this e-Newsletter, please contact our Customer Service Department at (216) 931-9934 or magazine.sfpe.org.
Copyright 2013, Penton Media, Inc. All rights reserved. This eNewsletter is protected by United States copyright and other intellectual property laws and may not be reproduced, rewritten, distributed, re-disseminated, transmitted, displayed, published or broadcast, directly or indirectly, in any medium without the prior written permission of SFPE and Penton Media, Inc.