Excerpt from Shit Doesn’t Just Happen: The Gift of Failure

Why be concerned about catastrophes?

  1. False Assumptions

What is a catastrophe?

  1. The final event of the dramatic action, especially of a tragedy
  2. An event causing great and often sudden damage or suffering; a disaster
  3. Utter failure

We are usually surprised when a catastrophe strikes. There is a tendency to believe that a catastrophe is something that is unexpected, always happens suddenly, and is caused by a single thing going wrong.

These are false assumptions. With some attention and focus, the vast majority of catastrophes can be predicted. If predicted, they can often be planned for and averted. If unavoidable, they can still be planned for, and their results blunted and minimized. A catastrophe occurs suddenly only in terms of the final event, the catastrophe itself; the buildup, via a series of what we will term cascade events, can unfold over a very long time. And because at least one of these cascade events involves human error, an error someone could have caught and corrected, most catastrophes can be avoided.

In this book I walk through seven well-known catastrophes, showing the six cascade events leading up to the seventh and final event. I list the events, pointing out how each could have been noted (knowledge often can prevent the cascade of events that leads to #7, the final event) or corrected. The key for us to focus on is what was learned and changed because of each catastrophe, lessons that saved the lives of countless others afterward.

  2. A catastrophe is closer than you think.

While you might not personally have been in a catastrophe or a tragedy, I can assure you that we have all come close more often than we realize. Many times we have reached a #4, #5, or #6 cascade event and not gone on to the final event; therein lies one of the key deceptions that lulls us into complacency.

As we will see in the seven examples, there are many places along the cascade of events where a single person saying or doing something could have stopped the cascade and prevented the catastrophe or, at the very least, minimized the effect of the final event. Thus it’s very important for us to understand how seemingly innocuous events can play a tragic role if left unchecked. This book is also about the gift of failure: how we can learn from past catastrophes in order to avoid future ones. The aviation industry works off the gift of failure in that practically every safety innovation it has introduced came in response to a plane crash.

Ultimately, it’s about gaining the proper catastrophe mindset, which goes against our natural instincts because . . .

  3. Delusion events fool us.

We often look at narrow escapes or near misses as ‘fortunate’ events where disaster was averted; indeed, we get to the point where we normalize near misses. Instead, we need to look at these ‘fortunate’ events as cascade events where we came close to catastrophe and were simply lucky that we didn’t hit the final event. Relying on luck is a very dangerous mindset, yet we immerse ourselves in it on a daily basis. We often call it ‘dodging the bullet,’ forgetting that when a bullet hits, the results are catastrophic to the target.

We need to focus on cascade events, see their negative potential, and reduce their occurrence. A cascade event that doesn’t lead to a final event we will label a delusion event. A cascade event and a delusion event are otherwise identical; the only difference is that a delusion event doesn’t result in a final event.

This time.

Delusion events lead us into delusional thinking: the belief that we will continue to dodge the bullet while doing nothing. In fact, a delusion event, where something goes wrong but doesn’t lead to the final event, reinforces our complacency about correcting it and increases our risk of a final event, a catastrophe. We take it as the status quo, not an aberration. Delusion events lead to the normalization of unacceptable risk. For a very simple example, the farther you drive with the check engine light on in your car, the more you think it’s normal for that light to be on. Diane Vaughan calls this normalization of deviance in her book The Challenger Launch Decision.(1) We’ll discuss this catastrophe as one of our seven in the second book in this series, focusing on organizational thinking about delusion events.

How many times have you been in a hotel, restaurant, or store when the fire alarm went off? How many times did you hurry to the exit? More likely you, and everyone around you, seeing no smoke or fire, stood around and waited for someone to actually announce what was going on. We’ve been desensitized by false alarms to the point where the alarm serves little purpose anymore.

A 2011 Harvard Business Review study (2) found that delusion events (multiple near misses) preceded every disaster and business crisis the researchers examined over a seven-year period. Besides delusional thinking leading to normalization, the other problem is outcome bias. If you flip a coin six times and it comes up heads all six times, even though that result is statistically rare (a 1-in-64 chance), you will tend to start focusing on the result, believing all coin tosses end up heads. While we know this isn’t true, we tend to base our probabilities of future occurrences not on the statistics of reality but on our experiences.
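To see where that 1-in-64 figure comes from: each fair flip lands heads with probability 1/2, and the flips are independent, so the chance of six heads in a row is 1/2 × 1/2 × 1/2 × 1/2 × 1/2 × 1/2 = (1/2)^6 = 1/64, a bit under 2 percent. Rare, but not so rare that witnessing it once should rewrite our expectations.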

This is called a heuristic, and heuristics are at the root of many disasters. Heuristics are experience-based techniques for learning and problem solving that give a workable solution which isn’t necessarily the optimal one. We generalize based on the things we value most: our own experience and information related to us by sources we trust. Think how many ‘truths’ you have heard that turn out to be nothing more than an urban legend or a superstition. Yet we base many of our daily and emergency actions on them.

A small example from The Green Beret Survival Guide: every so often there is a news article about someone in a desperate survival situation who claims drinking their urine helped them make it through. That’s absolutely the wrong thing to do. But it’s one of those stories that gets repeated often enough that we believe it to be true, because we only hear from the survivors, the people who made it despite doing the wrong thing.

It is human nature to focus on successful outcomes much more than on negative ones. It’s irrational, but that’s part of being human. In the same way, managers and leaders are taught to plan for success, not failure, since planning for failure is believed to be negative thinking. In fact, I would submit that many people belong to a cult of positive thinking that often excludes reality.

The good news is that we tend to be predictably irrational, and understanding our tendency to shrug off a cascade event as merely a delusion event is the first step in correcting this problem.