By: John Westphal, Senior Advisor
The Achilles heel of the high-consequence industries I have worked in over the years is the overreaction to a single event. In other words, we allow one event to drive systemic changes throughout the organization when that one event may not tell us enough about the risk that existed within that particular socio-technical system, or when the learning system lacked the sophistication to offer a robust enough view of the risk within that system.
The question then becomes, which one is it: the sophistication of the learning system or the insignificance of the event? Generally, the failure resides in the learning system's capability to fully explore the richness of the single event, combine that learning with other event investigations, and develop an accurate view of the inherent and systemic risk existing within the operation.
This failure within the learning system generally occurs for two reasons. First, our single-event investigation struggles to define the appropriate cause-and-effect relationships that existed within the event. Often we insert non-causal data or non-duties into the event, creating so much noise that we are unable to articulate the actual risk. Second, because of this failure in our single-event investigations, we become limited in our ability to do precise common-cause failure analysis when looking across multiple events. Simply put, it becomes a "garbage in/garbage out" learning system.
In addition to the failures stated above, our learning systems often fail to correctly identify causes related to a class of events. In other words, the causal factor is not relevant to what occurred yesterday but is relevant to what may happen tomorrow or six months from now. A great example of this plays out in the movie "Flight," starring Denzel Washington. In the movie, Washington plays an airline captain with a significant drug and alcohol problem. In an aircraft emergency, Washington takes extraordinary actions that save the lives of hundreds on board, even though he was intoxicated while flying the aircraft. The question becomes: was his intoxication causal with regard to the loss of the aircraft? No. A mechanical failure caused the loss of the aircraft. However, does a pilot who is willing to fly under the influence of alcohol and drugs represent a risk to the system? Yes!
At the end of the day, our learning systems must exhibit three layers of examination. First, we must be able to identify the relevant cause-and-effect relationships within a single event. Second, once those cause-and-effect relationships are identified across single events, we must conduct the proper systemic analysis to identify risk mitigation strategies; a general rule of thumb is that 70% to 80% of our interventions should come from systemic analysis. Third, our learning system must also assess causal factors related to the class of event, working to move from the reactive realm to a proactive and eventually a predictive learning system.