If you’ve watched the news over the last few months, you’ve probably heard that the US Navy is struggling to keep its ships from running into cargo vessels. And if you’ve read the press reports and the official political explanations, you’ll find “human error” to be the root cause.
We take issue with this explanation – the same issue we would have with an arson investigation that concluded “flame” was the root cause of a house fire. At best, it is an unhelpful explanation; at worst, it is an attempt to cover up the real causes.
Jens Rasmussen, a pioneer in the field of safety science, observed that we humans investigate until we find a cause that is abnormal, but familiar. It’s like our embedded investigation protocol. You have a flat tire (the bad outcome). You find a nail in your tire (the cause). The nail is both abnormal and familiar; hence, the investigation is complete. Yet when you see green ooze coming from the valve stem of the tire, it’s admittedly abnormal, but it’s definitely not familiar. So you continue your search. Now, the problem with human error is that it, too, is both abnormal and familiar. We humans are fallible. Mistakes happen. So when we hear that a Navy ship collided with a cargo vessel, and that human error was the root cause, we generally feel the investigation is complete. “Oh, there it is again, that dreaded human error. Great investigation. Time to move on.”
With little doubt, we would all quickly reject flame as the root cause of a house fire. We would want to know what caused the flame. The same should go for human error. It should be a basic axiom of modern event investigation that any human error requires a search for its preceding causes. That is, if you as an investigator are willing to point the finger at a human being for making a mistake, you should be willing to go the next step and understand why the error occurred. Yet that simple next step is often not even contemplated. When we hear that a patient was harmed in a hospital, we seem to settle for explanations like medication error or diagnostic error. In fact, in our review of event investigations within US hospitals, we find that roughly 75% of them stop at the human error itself. Such an investigation may give a cursory explanation of why a patient was harmed, but it gives us nothing to fix, other than telling a caregiver not to make the mistake again. And we know how well that works.
It’s OK to use the terms mistake, error, slip, or lapse. They are not words of blame – they only describe one immutable aspect of the human condition: to err is human. Yet when we use them, we need to commit to further investigation to explain the errors we see. Human error cannot be the end of the search for causes, and it’s up to us to hold investigators accountable when they choose to stop an investigation short. Let’s face it: it’s just too tempting for some leaders to use the label of human error as a tool to distract us all from the often more inconvenient truths – the inadequate systems we designed around our employees, or the risky behavioral choices we all made as precursors to the error and its unfortunate outcome. What we’ll likely find in the Navy mishaps is a disturbing sequence of at-risk choices that led ship commanders to lose situational awareness. We might also find a system that was woefully thin in its design, allowing the ships to operate perilously close to harm.
There’s always more to the story than mere human error. Don’t let someone’s proclamation of human error as the root cause fool you into missing the underlying causes that need to be addressed. Don’t let human error be the root cause.
CEO, Outcome Engenuity